DL_Papers

A reading list of deep learning topics I am interested in.

Reading List

Vision Transformers

[Dosovitskiy et al. 21] An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale, ICLR 2021.
[Touvron et al. 21] Training Data-Efficient Image Transformers & Distillation through Attention, ICML 2021.
[Liu et al. 21] Swin Transformer: Hierarchical Vision Transformer using Shifted Windows, ICCV 2021.
[Wu et al. 21] CvT: Introducing Convolutions to Vision Transformers, ICCV 2021.
[Dai et al. 21] CoAtNet: Marrying Convolution and Attention for All Data Sizes, NeurIPS 2021.
[Yang et al. 21] Focal Attention for Long-Range Interactions in Vision Transformers, NeurIPS 2021.
[El-Nouby et al. 21] XCiT: Cross-Covariance Image Transformers, NeurIPS 2021.
[Li et al. 22] MViTv2: Improved Multiscale Vision Transformers for Classification and Detection, CVPR 2022.
[Lee et al. 22] MPViT: Multi-Path Vision Transformer for Dense Prediction, CVPR 2022.
[Liu et al. 22] A ConvNet for the 2020s, CVPR 2022.
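
The common starting point for this section is ViT (Dosovitskiy et al. 21): split an image into 16x16 patches, linearly embed each patch as a token, and run a standard transformer encoder over the sequence. Below is a minimal PyTorch sketch of that idea; the class name and hyperparameters are illustrative and not taken from any paper above.

```python
import torch
import torch.nn as nn

class TinyViT(nn.Module):
    """Minimal ViT-style classifier: patchify -> embed -> transformer -> head."""
    def __init__(self, image_size=224, patch_size=16, dim=192, depth=4, heads=3, num_classes=1000):
        super().__init__()
        num_patches = (image_size // patch_size) ** 2
        # A conv with kernel = stride = patch_size patchifies and embeds in one step.
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=patch_size, stride=patch_size)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.pos_embed = nn.Parameter(torch.zeros(1, num_patches + 1, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):
        x = self.patch_embed(x).flatten(2).transpose(1, 2)   # (B, N, dim) patch tokens
        cls = self.cls_token.expand(x.size(0), -1, -1)
        x = torch.cat([cls, x], dim=1) + self.pos_embed      # prepend [CLS], add positions
        x = self.encoder(x)
        return self.head(x[:, 0])                            # classify from the [CLS] token

logits = TinyViT()(torch.randn(2, 3, 224, 224))
print(logits.shape)  # torch.Size([2, 1000])
```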

Neural Radiance Fields

[Mildenhall et al. 20] NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis, ECCV 2020.
[Lin et al. 21] BARF: Bundle-Adjusting Neural Radiance Fields, ICCV 2021.
[Yu et al. 21] pixelNeRF: Neural Radiance Fields from One or Few Images, CVPR 2021.
[Hedman et al. 21] Baking Neural Radiance Fields for Real-Time View Synthesis, ICCV 2021.
[Jain et al. 21] Putting NeRF on a Diet: Semantically Consistent Few-Shot View Synthesis, ICCV 2021.
[Barron et al. 21] Mip-NeRF: A Multiscale Representation for Anti-Aliasing Neural Radiance Fields, ICCV 2021.
[Martin-Brualla et al. 21] NeRF in the Wild: Neural Radiance Fields for Unconstrained Photo Collections, CVPR 2021.
[Huang et al. 21] MetaSets: Meta-Learning on Point Sets for Generalizable Representations, CVPR 2021.
[Garbin et al. 21] FastNeRF: High-Fidelity Neural Rendering at 200FPS, ICCV 2021.
[Tancik et al. 21] Learned Initializations for Optimizing Coordinate-Based Neural Representations, CVPR 2021.
[Yu et al. 21] PlenOctrees for Real-Time Rendering of Neural Radiance Fields, ICCV 2021.
[Yen-Chen et al. 21] iNeRF: Inverting Neural Radiance Fields for Pose Estimation, IROS 2021.
[Wang et al. 21] NeRF--: Neural Radiance Fields Without Known Camera Parameters, CVPR 2021.
[Meng et al. 21] GNeRF: GAN-based Neural Radiance Field without Posed Camera, ICCV 2021.
[Li et al. 21] MINE: Towards Continuous Depth MPI with NeRF for Novel View Synthesis, ICCV 2021.
[Wei et al. 21] NerfingMVS: Guided Optimization of Neural Radiance Fields for Indoor Multi-view Stereo, ICCV 2021.
[Müller et al. 22] Instant Neural Graphics Primitives with a Multiresolution Hash Encoding, SIGGRAPH 2022.
[Xu et al. 22] Point-NeRF: Point-based Neural Radiance Fields, CVPR 2022.
[Fridovich-Keil et al. 22] Plenoxels: Radiance Fields without Neural Networks, CVPR 2022.
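
Nearly every paper above builds on NeRF's volume-rendering quadrature (Mildenhall et al. 20): a pixel color is a transmittance-weighted sum of sampled colors along the ray. A minimal PyTorch sketch, assuming densities and colors have already been sampled per ray; the function name is mine.

```python
import torch

def render_rays(sigmas, colors, deltas):
    """NeRF quadrature: C = sum_i T_i * (1 - exp(-sigma_i * delta_i)) * c_i,
    with transmittance T_i = exp(-sum_{j<i} sigma_j * delta_j).
    sigmas: (R, S) densities, colors: (R, S, 3), deltas: (R, S) sample spacings."""
    alphas = 1.0 - torch.exp(-sigmas * deltas)           # per-sample opacity
    # T_i: probability the ray reaches sample i unoccluded (cumulative product).
    trans = torch.cumprod(torch.cat([torch.ones_like(alphas[:, :1]),
                                     1.0 - alphas + 1e-10], dim=1), dim=1)[:, :-1]
    weights = alphas * trans                             # (R, S) blending weights
    return (weights[..., None] * colors).sum(dim=1)      # (R, 3) pixel colors

rgb = render_rays(torch.rand(4, 64), torch.rand(4, 64, 3), torch.full((4, 64), 0.05))
print(rgb.shape)  # torch.Size([4, 3])
```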

Self-Supervised Learning

[Dosovitskiy et al. 14] Discriminative Unsupervised Feature Learning with Convolutional Neural Networks, NIPS 2014.
[Pathak et al. 16] Context Encoders: Feature Learning by Inpainting, CVPR 2016.
[Noroozi and Favaro 16] Unsupervised Learning of Visual Representations by Solving Jigsaw Puzzles, ECCV 2016.
[Gidaris et al. 18] Unsupervised Representation Learning by Predicting Image Rotations, ICLR 2018.
[He et al. 20] Momentum Contrast for Unsupervised Visual Representation Learning, CVPR 2020.
[Chen et al. 20] A Simple Framework for Contrastive Learning of Visual Representations, ICML 2020.
[Mikolov et al. 13] Efficient Estimation of Word Representations in Vector Space, ICLR 2013.
[Devlin et al. 19] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, NAACL 2019.
[Clark et al. 20] ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators, ICLR 2020.
[Hu et al. 20] Strategies for Pre-training Graph Neural Networks, ICLR 2020.
[Chen et al. 20] Generative Pretraining from Pixels, ICML 2020.
[Laskin et al. 20] CURL: Contrastive Unsupervised Representations for Reinforcement Learning, ICML 2020.
[Grill et al. 20] Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning, NeurIPS 2020.
[Chen et al. 20] Big Self-Supervised Models are Strong Semi-Supervised Learners, NeurIPS 2020.
[Chen and He 21] Exploring Simple Siamese Representation Learning, CVPR 2021.
[Tian et al. 21] Understanding Self-Supervised Learning Dynamics without Contrastive Pairs, ICML 2021.
[Caron et al. 21] Emerging Properties in Self-Supervised Vision Transformers, ICCV 2021.
[Liu et al. 22] Self-supervised Learning is More Robust to Dataset Imbalance, ICLR 2022.
[Bao et al. 22] BEiT: BERT Pre-Training of Image Transformers, ICLR 2022.
[He et al. 22] Masked Autoencoders are Scalable Vision Learners, CVPR 2022.
[Liu et al. 22] Improving Contrastive Learning with Model Augmentation, arXiv preprint, 2022.
[Touvron et al. 22] DeiT III: Revenge of the ViT, arXiv preprint, 2022.
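
For the contrastive entries (e.g. SimCLR, Chen et al. 20), the central object is the NT-Xent loss: two augmented views of each image are positives for each other, and everything else in the batch is a negative. A minimal PyTorch sketch; the batch size, embedding dimension, and temperature are illustrative.

```python
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent loss over 2N embeddings: each view's positive is its counterpart,
    the remaining 2N - 2 embeddings in the batch act as negatives."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, D) unit vectors
    sim = z @ z.t() / temperature                        # cosine-similarity logits
    sim.fill_diagonal_(float('-inf'))                    # mask self-similarity
    n = z1.size(0)
    # The positive for sample i is i + n (first view) or i - n (second view).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

z1, z2 = torch.randn(8, 128), torch.randn(8, 128)       # projections of two views
print(nt_xent(z1, z2).item())
```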

Graph Neural Networks

[Li et al. 16] Gated Graph Sequence Neural Networks, ICLR 2016.
[Hamilton et al. 17] Inductive Representation Learning on Large Graphs, NIPS 2017.
[Kipf and Welling 17] Semi-Supervised Classification with Graph Convolutional Networks, ICLR 2017.
[Velickovic et al. 18] Graph Attention Networks, ICLR 2018.
[Ying et al. 18] Hierarchical Graph Representation Learning with Differentiable Pooling, NeurIPS 2018.
[Xu et al. 19] How Powerful are Graph Neural Networks?, ICLR 2019.
[Maron et al. 19] Provably Powerful Graph Networks, NeurIPS 2019.
[Yun et al. 19] Graph Transformer Networks, NeurIPS 2019.
[Loukas 20] What Graph Neural Networks Cannot Learn: Depth vs Width, ICLR 2020.
[Bianchi et al. 20] Spectral Clustering with Graph Neural Networks for Graph Pooling, ICML 2020.
[Xhonneux et al. 20] Continuous Graph Neural Networks, ICML 2020.
[Garg et al. 20] Generalization and Representational Limits of Graph Neural Networks, ICML 2020.
[Baek et al. 21] Accurate Learning of Graph Representations with Graph Multiset Pooling, ICLR 2021.
[Liu et al. 21] Elastic Graph Neural Networks, ICML 2021.
[Li et al. 21] Training Graph Neural Networks with 1000 Layers, ICML 2021.
[Jo et al. 21] Edge Representation Learning with Hypergraphs, NeurIPS 2021.
[Guo et al. 22] Data-Efficient Graph Grammar Learning for Molecular Generation, ICLR 2022.
[Geerts et al. 22] Expressiveness and Approximation Properties of Graph Neural Networks, ICLR 2022.
[Bevilacqua et al. 22] Equivariant Subgraph Aggregation Networks, ICLR 2022.
[Jo et al. 22] Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations, ICML 2022.
[Hoogeboom et al. 22] Equivariant Diffusion for Molecule Generation in 3D, ICML 2022.
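
Most of the message-passing papers above extend the graph-convolution layer of Kipf and Welling 17: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W). A dense, minimal PyTorch sketch for clarity; real implementations use sparse adjacency ops, and the example graph is random.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One GCN layer: symmetric-normalized neighborhood averaging, then a linear map."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        a_hat = adj + torch.eye(adj.size(0))             # add self-loops
        deg_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)        # D^-1/2 from node degrees
        norm = deg_inv_sqrt[:, None] * a_hat * deg_inv_sqrt[None, :]
        return torch.relu(norm @ self.linear(x))         # aggregate, then transform

x = torch.randn(5, 16)                                   # 5 nodes, 16 features each
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.t()) > 0).float()                      # symmetrize the random graph
print(GCNLayer(16, 8)(x, adj).shape)                     # torch.Size([5, 8])
```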

Diffusion Models

[Song and Ermon 19] Generative Modeling by Estimating Gradients of the Data Distribution, NeurIPS 2019.
[Song and Ermon 20] Improved Techniques for Training Score-Based Generative Models, NeurIPS 2020.
[Ho et al. 20] Denoising Diffusion Probabilistic Models, NeurIPS 2020.
[Song et al. 21] Score-Based Generative Modeling through Stochastic Differential Equations, ICLR 2021.
[Nichol and Dhariwal 21] Improved Denoising Diffusion Probabilistic Models, ICML 2021.
[Vahdat et al. 21] Score-based Generative Modeling in Latent Space, NeurIPS 2021.
[Dhariwal and Nichol 21] Diffusion Models Beat GANs on Image Synthesis, NeurIPS 2021.
[De Bortoli et al. 21] Diffusion Schrödinger Bridge with Application to Score-Based Generative Modeling, NeurIPS 2021.
[Ho and Salimans 22] Classifier-Free Diffusion Guidance, arXiv preprint, 2022.
[Dockhorn et al. 22] Score-Based Generative Modeling with Critically-Damped Langevin Diffusion, ICLR 2022.
[Salimans and Ho 22] Progressive Distillation for Fast Sampling of Diffusion Models, ICLR 2022.
[Chen et al. 22] Likelihood Training of Schrödinger Bridge using Forward-Backward SDEs Theory, ICLR 2022.
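
The entry point to this section is the DDPM forward process (Ho et al. 20), which admits a closed form at any timestep: q(x_t | x_0) = N(sqrt(alpha_bar_t) x_0, (1 - alpha_bar_t) I). A minimal PyTorch sketch; the beta schedule follows the paper, everything else (shapes, function name) is illustrative.

```python
import torch

def ddpm_forward(x0, t, betas):
    """Sample x_t from q(x_t | x_0) in one step, returning the noise the
    denoising network would be trained to predict from (x_t, t)."""
    alpha_bar = torch.cumprod(1.0 - betas, dim=0)[t]          # (B,) cumulative alphas
    noise = torch.randn_like(x0)
    while alpha_bar.dim() < x0.dim():                         # broadcast over image dims
        alpha_bar = alpha_bar.unsqueeze(-1)
    xt = alpha_bar.sqrt() * x0 + (1.0 - alpha_bar).sqrt() * noise
    return xt, noise

betas = torch.linspace(1e-4, 0.02, 1000)                      # linear schedule from the paper
x0 = torch.randn(4, 3, 32, 32)                                # a toy batch of "images"
t = torch.randint(0, 1000, (4,))                              # random timestep per sample
xt, eps = ddpm_forward(x0, t, betas)
print(xt.shape)  # torch.Size([4, 3, 32, 32])
```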