This repository collects frontier research on self-supervised learning for tabular data, a topic that has recently gained popularity.
This list is maintained by Wei-Wei Du and is actively updated.
- VIME: Extending the Success of Self- and Semi-supervised Learning to Tabular Domain (NeurIPS'20)
- CORE: Self- and Semi-supervised Tabular Learning with COnditional REgularizations (NeurIPS'21)
- SubTab: Subsetting Features of Tabular Data for Self-Supervised Representation Learning (NeurIPS'21)
- SCARF: Self-Supervised Contrastive Learning using Random Feature Corruption (ICLR'22 Spotlight)
- STab: Self-supervised Learning for Tabular Data (NeurIPS'22 Workshop on Table Representation Learning)
- SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training (NeurIPS'22 Workshop on Table Representation Learning)
- TabNet: Attentive Interpretable Tabular Learning (AAAI'21)
- TransTab: Learning Transferable Tabular Transformers Across Tables (NeurIPS'22)
- Self-Supervision Enhanced Feature Selection with Correlated Gates (ICLR'22)
- Local Contrastive Feature Learning for Tabular Data (CIKM'22)