“‘tabular ML’ Tag”, 2019-09-02:
Bibliography for tag ai/tabular, most recent first: 1 related tag, 103 annotations, & 11 links (parent).
- See Also
- Gwern
- Links
- “Deep Learning Through A Telescoping Lens: A Simple Model Provides Empirical Insights On Grokking, Gradient Boosting & Beyond”, et al 2024
- “Probing the Decision Boundaries of In-Context Learning in Large Language Models”, et al 2024
- “Attention As an RNN”, et al 2024
- “The Harms of Class Imbalance Corrections for Machine Learning Based Prediction Models: a Simulation Study”, et al 2024
- “Many-Shot In-Context Learning”, et al 2024
- “From Words to Numbers: Your Large Language Model Is Secretly A Capable Regressor When Given In-Context Examples”, et al 2024
- “Chronos: Learning the Language of Time Series”, et al 2024
- “StructLM: Towards Building Generalist Models for Structured Knowledge Grounding”, et al 2024
- “Why Do Random Forests Work? Understanding Tree Ensembles As Self-Regularizing Adaptive Smoothers”, et al 2024
- “Illusory Generalizability of Clinical Prediction Models”
- “Attention versus Contrastive Learning of Tabular Data—A Data-Centric Benchmarking”, et al 2024
- “TabLib: A Dataset of 627M Tables With Context”, et al 2023
- “Unambiguous Discrimination of All 20 Proteinogenic Amino Acids and Their Modifications by Nanopore”, et al 2023d
- “Generating and Imputing Tabular Data via Diffusion and Flow-Based Gradient-Boosted Trees”, Jolicoeur-Martineau et al 2023
- “Generating Tabular Datasets under Differential Privacy”, 2023
- “TableGPT: Towards Unifying Tables, Nature Language and Commands into One GPT”, et al 2023
- “Language Models Are Weak Learners”, et al 2023
- “RGD: Stochastic Re-Weighted Gradient Descent via Distributionally Robust Optimization”, et al 2023
- “Large Language Models Are Few-Shot Health Learners”, et al 2023
- “Deep Learning Based Forecasting: a Case Study from the Online Fashion Industry”, et al 2023
- “Learning and Memorization”, 2023
- “Language Models Enable Simple Systems for Generating Structured Views of Heterogeneous Data Lakes”, et al 2023
- “TSMixer: An All-MLP Architecture for Time Series Forecasting”, et al 2023
- “Large Language Models Are Versatile Decomposers: Decompose Evidence and Questions for Table-Based Reasoning”, et al 2023
- “Fast Semi-Supervised Self-Training Algorithm Based on Data Editing”, et al 2023
- “Table-To-Text Generation and Pre-Training With TabT5”, et al 2022
- “Language Models Are Realistic Tabular Data Generators”, et al 2022
- “Forecasting With Trees”, et al 2022
- “Why Do Tree-Based Models Still Outperform Deep Learning on Tabular Data?”, et al 2022
- “Revisiting Pretraining Objectives for Tabular Deep Learning”, et al 2022
- “TabPFN: Meta-Learning a Real-Time Tabular AutoML Method For Small Data”, et al 2022
- “Transfer Learning With Deep Tabular Models”, et al 2022
- “Hopular: Modern Hopfield Networks for Tabular Data”, et al 2022
- “Predicting Romantic Interest during Early Relationship Development: A Preregistered Investigation Using Machine Learning”, et al 2022
- “On Embeddings for Numerical Features in Tabular Deep Learning”, et al 2022
- “To SMOTE, or Not to SMOTE?”, Elor & Averbuch-Elor 2022
- “M5 Accuracy Competition: Results, Findings, and Conclusions”, et al 2022
- “The GatedTabTransformer: An Enhanced Deep Learning Architecture for Tabular Modeling”, 2022
- “PFNs: Transformers Can Do Bayesian Inference”, et al 2021
- “DANets: Deep Abstract Networks for Tabular Data Classification and Regression”, et al 2021
- “Deep Neural Networks and Tabular Data: A Survey”, et al 2021
- “An Unsupervised Model for Identifying and Characterizing Dark Web Forums”, et al 2021
- “TAPEX: Table Pre-Training via Learning a Neural SQL Executor”, et al 2021
- “ARM-Net: Adaptive Relation Modeling Network for Structured Data”, et al 2021
- “Decision Tree Heuristics Can Fail, Even in the Smoothed Setting”, et al 2021
- “SCARF: Self-Supervised Contrastive Learning Using Random Feature Corruption”, et al 2021
- “Revisiting Deep Learning Models for Tabular Data”, et al 2021
- “The Epic Sepsis Model Falls Short—The Importance of External Validation”, An et al 2021
- “Well-Tuned Simple Nets Excel on Tabular Datasets”, et al 2021
- “Tabular Data: Deep Learning Is Not All You Need”, Shwartz-Ziv & Armon 2021
- “Self-Attention Between Datapoints: Going Beyond Individual Input-Output Pairs in Deep Learning”, et al 2021
- “SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training”, et al 2021
- “Intelligence and General Psychopathology in the Vietnam Experience Study: A Closer Look”, 2021
- “Converting Tabular Data into Images for Deep Learning With Convolutional Neural Networks”, et al 2021
- “External Validation of a Widely Implemented Proprietary Sepsis Prediction Model in Hospitalized Patients”, et al 2021
- “Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting”, et al 2020
- “TabTransformer: Tabular Data Modeling Using Contextual Embeddings”, et al 2020
- “Engineering In-Place (Shared-Memory) Sorting Algorithms”, et al 2020
- “Kaggle Forecasting Competitions: An Overlooked Learning Opportunity”, 2020
- “TaBERT: Pretraining for Joint Understanding of Textual and Tabular Data”, et al 2020
- “Neural Additive Models: Interpretable Machine Learning With Neural Nets”, et al 2020
- “TAPAS: Weakly Supervised Table Parsing via Pre-Training”, et al 2020
- “A Market in Dream: the Rapid Development of Anonymous Cybercrime”, et al 2020b
- “VIME: Extending the Success of Self-Supervised and Semi-Supervised Learning to Tabular Domain”, et al 2020
- “Fooling LIME and SHAP: Adversarial Attacks on Post Hoc Explanation Methods”, et al 2019
- “The Bouncer Problem: Challenges to Remote Explainability”, 2019
- “OHAC: Online Hierarchical Clustering Approximations”, et al 2019
- “LightGBM: A Highly Efficient Gradient Boosting Decision Tree”, et al 2019
- “TabNet: Attentive Interpretable Tabular Learning”, 2019
- “3D Human Pose Estimation via Human Structure-Aware Fully Connected Network”, et al 2019d
- “ID3 Learns Juntas for Smoothed Product Distributions”, et al 2019
- “Behavioral Patterns in Smartphone Usage Predict Big Five Personality Traits”, et al 2019
- “Asymptotic Learning Curves of Kernel Methods: Empirical Data versus Teacher-Student Paradigm”, et al 2019
- “N-BEATS: Neural Basis Expansion Analysis for Interpretable Time Series Forecasting”, et al 2019
- “SuperTML: Two-Dimensional Word Embedding for the Precognition on Structured Tabular Data”, et al 2019
- “Fairwashing: the Risk of Rationalization”, et al 2019
- “Tweedie Gradient Boosting for Extremely Unbalanced Zero-Inflated Data”, et al 2018
- “Neural Arithmetic Logic Units”, et al 2018
- “Large-Scale Comparison of Machine Learning Methods for Drug Target Prediction on ChEMBL”, et al 2018
- “Repurposing High-Throughput Image Assays Enables Biological Activity Prediction for Drug Discovery”, et al 2018
- “Improving Palliative Care With Deep Learning”, An et al 2018
- “Using Posters to Recommend Anime and Mangas in a Cold-Start Scenario”, et al 2017
- “Neural Collaborative Filtering”, et al 2017
- “OpenML Benchmarking Suites”, et al 2017
- “CatBoost: Unbiased Boosting With Categorical Features”, et al 2017
- “Resource-Efficient Machine Learning in 2 KB RAM for the Internet of Things”, et al 2017
- “XGBoost: A Scalable Tree Boosting System”, 2016
- “‘Why Should I Trust You?’: Explaining the Predictions of Any Classifier”, et al 2016
- “The MovieLens Datasets: History and Context”, 2015
- “Planning As Satisfiability: Heuristics”, 2012
- “Leakage in Data Mining: Formulation, Detection, and Avoidance”, 2011
- “Random Survival Forests”, et al 2008
- “Tree Induction vs. Logistic Regression: A Learning-Curve Analysis”, et al 2003
- “A Survey of Methods for Scaling Up Inductive Algorithms”, 1999
- “On the Boosting Ability of Top-Down Decision Tree Learning Algorithms”, 1999
- “On The Effect of Data Set Size on Bias And Variance in Classification Learning”, 1999
- “The Effects of Training Set Size on Decision Tree Complexity”, 1997
- “Scaling up the Accuracy of Naive-Bayes Classifiers: a Decision-Tree Hybrid”, 1996
- “Stupid Data Miner Tricks: Overfitting the S&P 500”, 1995
- “The MONK’s Problems: A Performance Comparison of Different Learning Algorithms”, et al 1991
- “Symbolic and Neural Learning Algorithms: An Experimental Comparison”, et al 1991
- “Statistical Modeling: The Two Cultures”, 2001
- “How Good Are LLMs at Doing ML on an Unknown Dataset?”
- Wikipedia
- Miscellaneous
- Bibliography