The official PyTorch implementation of the paper "SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training"
☆460 · Updated Dec 6, 2021
Alternatives and similar repositories for saint
Users interested in saint are comparing it to the repositories listed below.
- Research on Tabular Deep Learning: Papers & Packages ☆1,110 · Updated Nov 13, 2024
- Example of using the SAINT architecture in fastai ☆12 · Updated Aug 25, 2021
- ☆24 · Updated Jan 27, 2022
- A unified framework for Deep Learning Models on tabular data ☆1,636 · Updated Feb 16, 2026
- Codebase for "VIME: Extending the Success of Self- and Semi-supervised Learning to Tabular Domain" (NeurIPS 2020) ☆155 · Updated Oct 26, 2020
- PyTorch implementation of the TabNet paper: https://arxiv.org/pdf/1908.07442.pdf ☆2,907 · Updated Oct 23, 2024
- The official implementation of the paper "SubTab: Subsetting Features of Tabular Data for Self-Supervised Representation Learning" ☆151 · Updated Jul 1, 2022
- TabNet for fastai ☆124 · Updated Feb 20, 2026
- A flexible package for multimodal deep learning to combine tabular data with text and images using Wide and Deep models in PyTorch ☆1,401 · Updated Sep 27, 2025
- A repo for transfer learning with deep tabular models ☆104 · Updated Feb 15, 2023
- ☆26 · Updated Dec 14, 2021
- Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP 2021) ☆357 · Updated Sep 17, 2025
- Neural Oblivious Decision Ensembles for Deep Learning on Tabular Data ☆520 · Updated Jan 21, 2021
- ☆16 · Updated Apr 30, 2022
- (NeurIPS 2022) On Embeddings for Numerical Features in Tabular Deep Learning ☆406 · Updated Apr 16, 2025
- Code for Active Learning at the ImageNet Scale. This repository implements many popular active learning algorithms and allows training wi… ☆54 · Updated Nov 29, 2021
- Unofficial PyTorch implementation of "SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pretraining" https… ☆30 · Updated Nov 20, 2023
- Boosted neural network for tabular data ☆217 · Updated Jul 25, 2024
- ☆19 · Updated May 15, 2023
- Experiments on Tabular Data Models ☆280 · Updated May 25, 2023
- Implementation of SCARF: Self-Supervised Contrastive Learning using Random Feature Corruption in PyTorch, a model learning a representati… ☆93 · Updated Mar 17, 2024
- ☆54 · Updated Sep 11, 2021
- (NeurIPS 2021) Revisiting Deep Learning Models for Tabular Data ☆321 · Updated Nov 12, 2024
- ☆505 · Updated Aug 18, 2024
- [NeurIPS 2021] Well-tuned Simple Nets Excel on Tabular Datasets ☆88 · Updated Feb 28, 2023
- The official codebase for the paper "Unsupervised Anomaly Detection with Adversarial Mirrored AutoEncoders" (UAI '21) ☆16 · Updated Jun 8, 2021
- PyTorch ImageNet-1k loader with bounding boxes ☆13 · Updated Jan 23, 2022
- Hopular: Modern Hopfield Networks for Tabular Data ☆315 · Updated Jun 2, 2022
- DeltaPy - Tabular Data Augmentation (by @firmai) ☆556 · Updated Sep 19, 2023
- The implementation of "TabR: Unlocking the Power of Retrieval-Augmented Tabular Deep Learning" ☆322 · Updated Nov 17, 2025
- Standalone Neural Additive Models, forked from google-research for easy import to Colab ☆29 · Updated Sep 29, 2020
- A centralized place for deep thinking code and experiments ☆90 · Updated Aug 9, 2023
- A few baselines with a standard tabular model ☆38 · Updated Jun 3, 2020
- ☆69 · Updated Feb 17, 2024
- Minimal fastai code needed for working with PyTorch ☆15 · Updated Aug 25, 2021
- Build fast Gradio demos of fastai learners ☆35 · Updated Sep 23, 2021
- Code for "TabZilla: When Do Neural Nets Outperform Boosted Trees on Tabular Data?" ☆178 · Updated Mar 22, 2024
- NeurIPS '22 | TransTab: Learning Transferable Tabular Transformers Across Tables ☆213 · Updated Mar 13, 2025
- ☆352 · Updated Mar 1, 2021