I categorize, annotate and write comments for all research papers I read as a PhD student.
- All Papers
- Uncertainty Estimation
- Out-of-Distribution Detection
- Theoretical Properties of Deep Learning
- VAEs
- Normalizing Flows
- Autonomous Driving
- Medical ML
- Object Detection
- 3D Object Detection
- 3D Multi-Object Tracking
- 3D Human Pose Estimation
- Visual Tracking
- Sequence Modeling
- Reinforcement Learning
- System Identification
- Energy-Based Models
- Neural Processes
- Neural ODEs
- Transformers
- Implicit Neural Representations
- SysCon Deep Learning Reading Group
- SysCon Monte Carlo Reading Group
- Papers by Year
- NeurIPS
- ICML
- ICLR
- CVPR
- ECCV
- ICCV
- BMVC
- AISTATS
- AAAI
- MICCAI
- CDC
- JMLR
- Transformers Can Do Bayesian Inference [pdf] [code] [annotated pdf]
- Samuel Müller, Noah Hollmann, Sebastian Pineda Arango, Josif Grabocka, Frank Hutter
2021-12-20, ICLR 2022
- [Transformers]
Quite interesting and well-written paper. I did however find it difficult to properly understand everything; it feels like a lot of details are omitted (I wouldn't really know how to actually implement this in practice). It's also difficult for me to judge how impressive the results are, how practically useful this approach might actually be, and what its limitations are. Overall though, it does indeed seem quite interesting.
- A Deep Bayesian Neural Network for Cardiac Arrhythmia Classification with Rejection from ECG Recordings [pdf] [code] [annotated pdf]
- Wenrui Zhang, Xinxin Di, Guodong Wei, Shijia Geng, Zhaoji Fu, Shenda Hong
2022-02-26
- [Uncertainty Estimation] [Medical ML]
Somewhat interesting paper. They use a softmax model with MC-dropout to compute uncertainty estimates. The evaluation is not very extensive, they mostly just check that the classification accuracy improves as they reject more and more samples based on an uncertainty threshold.
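As a rough sketch of this general recipe (my own illustration, not code from the paper; `model` and `threshold` are placeholders): keep dropout active at test time, average the softmax outputs over several forward passes, and reject samples whose predictive entropy exceeds a threshold.
```python
import torch
import torch.nn.functional as F

def mc_dropout_predict(model, x, num_samples=32):
    # Keep dropout active at test time by leaving the model in train mode.
    model.train()
    with torch.no_grad():
        probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(num_samples)])
    mean_probs = probs.mean(dim=0)  # (batch, num_classes)
    entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(dim=-1)
    return mean_probs, entropy

# Reject (abstain on) all samples whose predictive entropy exceeds a chosen threshold:
# mean_probs, entropy = mc_dropout_predict(model, x)
# accept = entropy < threshold
```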
- Out of Distribution Data Detection Using Dropout Bayesian Neural Networks [pdf] [annotated pdf]
- Andre T. Nguyen, Fred Lu, Gary Lopez Munoz, Edward Raff, Charles Nicholas, James Holt
2022-02-18, AAAI 2022
- [Out-of-Distribution Detection]
Quite interesting and well-written paper. It seemed quite niche at first, but I think their analysis could potentially be useful.
- Enhancing The Reliability of Out-of-distribution Image Detection in Neural Networks [pdf] [code] [annotated pdf]
- Shiyu Liang, Yixuan Li, R. Srikant
2017-06-08, ICLR 2018
- [Out-of-Distribution Detection]
Quite interesting and well-written paper. Two simple modifications of the "maximum softmax score" baseline, and the performance is consistently improved. The input perturbation method is quite interesting. Intuitively, it's not entirely clear to me why it actually works.
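My rough understanding of the two modifications, as a hedged sketch (my own illustration, not the authors' code; the temperature and epsilon values are just example hyperparameters):
```python
import torch
import torch.nn.functional as F

def odin_score(model, x, temperature=1000.0, epsilon=0.0014):
    # 1) Temperature-scaled softmax. 2) Input perturbation: nudge x in the
    # direction that increases the max (temperature-scaled) softmax score,
    # then re-evaluate and take the max softmax probability as the score.
    x = x.clone().requires_grad_(True)
    log_probs = F.log_softmax(model(x) / temperature, dim=-1)
    max_log_prob = log_probs.max(dim=-1).values.sum()
    grad = torch.autograd.grad(max_log_prob, x)[0]
    x_perturbed = (x + epsilon * grad.sign()).detach()
    with torch.no_grad():
        probs = F.softmax(model(x_perturbed) / temperature, dim=-1)
    return probs.max(dim=-1).values  # higher => more likely in-distribution
```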
- Confidence-based Out-of-Distribution Detection: A Comparative Study and Analysis [pdf] [code] [annotated pdf]
- Christoph Berger, Magdalini Paschali, Ben Glocker, Konstantinos Kamnitsas
2021-07-06, MICCAI Workshops 2021
- [Out-of-Distribution Detection] [Medical ML]
Interesting and well-written paper. Interesting that Mahalanobis works very well on the CIFAR10 vs SVHN task but not on the medical imaging dataset. I don't quite get how/why the ODIN method works; I'll probably have to read that paper.
- Deep Learning Through the Lens of Example Difficulty [pdf] [annotated pdf]
- Robert John Nicholas Baldock, Hartmut Maennel, Behnam Neyshabur
2021-05-21, NeurIPS 2021
- [Theoretical Properties of Deep Learning]
Quite interesting and well-written paper. The definition of "prediction depth" in Section 2.1 makes sense, and it definitely seems reasonable that this could correlate with example difficulty / prediction confidence in some way. Sections 3 and 4, and all the figures, seem to contain a lot of info; I'd probably need to read the paper again to properly understand/appreciate everything.
- UncertaINR: Uncertainty Quantification of End-to-End Implicit Neural Representations for Computed Tomography [pdf] [annotated pdf]
- Francisca Vasconcelos, Bobby He, Nalini Singh, Yee Whye Teh
2022-02-22
- [Implicit Neural Representations] [Uncertainty Estimation] [Medical ML]
Interesting and well-written paper. I wasn't very familiar with CT image reconstruction, but they do a good job explaining everything. Interesting that MC-dropout seems important for getting well-calibrated predictions.
- Can You Trust Predictive Uncertainty Under Real Dataset Shifts in Digital Pathology? [pdf] [annotated pdf]
- Jeppe Thagaard, Søren Hauberg, Bert van der Vegt, Thomas Ebstrup, Johan D. Hansen, Anders B. Dahl
2020-09, MICCAI 2020
- [Uncertainty Estimation] [Out-of-Distribution Detection] [Medical ML]
Quite interesting and well-written paper. They compare MC-dropout, ensembling and mixup (with a standard softmax classifier as the baseline). Nothing groundbreaking, but the studied application (classification of pathology slides for cancer) is very interesting. The FPR95 metrics for OOD detection in Table 4 are terrible for ensembling, but the classification accuracy (89.7) is also pretty much the same as for D_test_int in Table 3 (90.1)? So, it doesn't really matter that the model isn't capable of distinguishing this "OOD" data from in-distribution?
- Robust Uncertainty Estimates with Out-of-Distribution Pseudo-Inputs Training [pdf] [annotated pdf]
- Pierre Segonne, Yevgen Zainchkovskyy, Søren Hauberg
2022-01-15
- [Uncertainty Estimation]
Somewhat interesting paper. I didn't quite understand everything, so it could be more interesting than I think. The fact that their pseudo-input generation process "relies on the availability of a differentiable density estimate of the data" seems like a big limitation? For regression, they only applied their method to very low-dimensional input data (1D toy regression and UCI benchmarks), but would this work for image-based tasks?
- Contrastive Training for Improved Out-of-Distribution Detection [pdf] [annotated pdf]
- Jim Winkens, Rudy Bunel, Abhijit Guha Roy, Robert Stanforth, Vivek Natarajan, Joseph R. Ledsam, Patricia MacWilliams, Pushmeet Kohli, Alan Karthikesalingam, Simon Kohl, Taylan Cemgil, S. M. Ali Eslami, Olaf Ronneberger
2020-07-10
- [Out-of-Distribution Detection]
Quite interesting and very well-written paper. They take the method from the Mahalanobis paper ("A Simple Unified Framework for Detecting Out-of-Distribution Samples and Adversarial Attacks") (however, they fit Gaussians only to the features at the second-to-last network layer, and they don't use the input pre-processing either) and consistently improve OOD detection performance by incorporating contrastive training. Specifically, they first train the network using just the SimCLR loss for a large number of epochs, and then also add the standard classification loss. I didn't quite get why the label smoothing is necessary, but according to Table 2 it's responsible for a large portion of the performance gain.
- A Simple Unified Framework for Detecting Out-of-Distribution Samples and Adversarial Attacks [pdf] [code] [annotated pdf]
- Kimin Lee, Kibok Lee, Honglak Lee, Jinwoo Shin
2018-07-10, NeurIPS 2018
- [Out-of-Distribution Detection]
Well-written and interesting paper. The proposed method is simple and really neat: fit class-conditional Gaussians in the feature space of a pre-trained classifier (basically just LDA on the feature vectors), and then use the Mahalanobis distance to these Gaussians as the confidence score for input x. They then also do this for the features at multiple levels of the network and combine these confidence scores into one. I don't quite get why the "input pre-processing" in Section 2.2 (adding noise to test samples) works; in Table 1 it significantly improves the performance.
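A minimal sketch of how I understand the core scoring idea (my own illustration in numpy, using a single feature layer and ignoring the input pre-processing and the multi-layer combination):
```python
import numpy as np

def fit_class_conditional_gaussians(features, labels, num_classes):
    # LDA-style fit on feature vectors from a pre-trained classifier:
    # one mean per class, a single shared (tied) covariance matrix.
    means = np.stack([features[labels == c].mean(axis=0) for c in range(num_classes)])
    centered = features - means[labels]
    cov = centered.T @ centered / len(features)
    return means, np.linalg.pinv(cov)

def mahalanobis_confidence(f, means, precision):
    # Confidence score for a single feature vector f: negative (squared)
    # Mahalanobis distance to the closest class-conditional Gaussian.
    diffs = means - f[None, :]  # (num_classes, feat_dim)
    dists = np.einsum('cd,de,ce->c', diffs, precision, diffs)
    return -dists.min()
```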
- Noise Contrastive Priors for Functional Uncertainty [pdf] [code] [annotated pdf]
- Danijar Hafner, Dustin Tran, Timothy Lillicrap, Alex Irpan, James Davidson
2018-07-24, UAI 2019
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Quite interesting and well-written paper. Only experiments on a toy 1D regression problem, and flight delay prediction in which the input is 8D. The approach of just adding noise to the input x to get OOD samples would probably not work very well e.g. for image-based problems?
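A small sketch of the general idea as I understand it (my own simplified illustration, not the exact loss from the paper; the Gaussian regression head, the prior width and the KL direction are assumptions):
```python
import math
import torch

def ncp_regularizer(model, x, input_noise_std=0.1, prior_std=1.0):
    # Pseudo-OOD inputs: training inputs plus Gaussian noise.
    x_ood = x + input_noise_std * torch.randn_like(x)
    mean, log_var = model(x_ood)  # assumed Gaussian regression head
    # Push the predictive distribution on pseudo-OOD inputs towards a wide,
    # zero-mean Gaussian prior (i.e., encourage high uncertainty off-distribution).
    kl = (math.log(prior_std) - 0.5 * log_var
          + (log_var.exp() + mean ** 2) / (2.0 * prior_std ** 2) - 0.5)
    return kl.mean()
```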
- Does Your Dermatology Classifier Know What It Doesn't Know? Detecting the Long-Tail of Unseen Conditions [pdf] [annotated pdf]
- Abhijit Guha Roy, Jie Ren, Shekoofeh Azizi, Aaron Loh, Vivek Natarajan, Basil Mustafa, Nick Pawlowski, Jan Freyberg, Yuan Liu, Zach Beaver, Nam Vo, Peggy Bui, Samantha Winter, Patricia MacWilliams, Greg S. Corrado, Umesh Telang, Yun Liu, Taylan Cemgil, Alan Karthikesalingam, Balaji Lakshminarayanan, Jim Winkens
2021-04-08, Medical Image Analysis (January 2022)
- [Out-of-Distribution Detection] [Medical ML]
Well-written and interesting paper. Quite long, so it took a bit longer than usual to read it. Sections 1 and 2 give a great overview of OOD detection in general, and how it can be used specifically in this dermatology setting. I can definitely recommend reading Section 2 (Related work). They assume access to some outlier data during training, so their approach is similar to the "Outlier exposure" method (they say that this is a fair assumption specifically in this dermatology setting). Their method is an improvement of the "reject bucket" (adding an extra class which you assign to all outlier training data points); in their proposed method they also use fine-grained classification of the outlier skin conditions. Then they also use an ensemble of 5 models, and also a more diverse ensemble (in which they combine models trained with different representation learning techniques). This diverse ensemble obtains the best performance.
- Being a Bit Frequentist Improves Bayesian Neural Networks [pdf] [code] [annotated pdf]
- Agustinus Kristiadi, Matthias Hein, Philipp Hennig
2021-06-18, AISTATS 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Interesting and well-written paper. The proposed method makes intuitive sense, trying to incorporate the "OOD training" method (i.e., to use some kind of OOD data during training, similar to e.g. the "Deep Anomaly Detection with Outlier Exposure" paper) into the Bayesian deep learning approach. The experimental results do seem quite promising.
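For reference, the basic outlier-exposure-style objective that this line of work builds on looks roughly like this (my own sketch of the standard OE loss, not the paper's Bayesian variant; `lam` is a placeholder weight):
```python
import torch.nn.functional as F

def outlier_exposure_loss(logits_in, targets_in, logits_out, lam=0.5):
    # Standard cross-entropy on in-distribution data, plus a term that pushes
    # the predictive distribution on outlier data towards uniform
    # (cross-entropy to the uniform distribution = -mean log-softmax, up to a constant).
    ce_in = F.cross_entropy(logits_in, targets_in)
    ce_uniform = -F.log_softmax(logits_out, dim=-1).mean()
    return ce_in + lam * ce_uniform
```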
- Mixtures of Laplace Approximations for Improved Post-Hoc Uncertainty in Deep Learning [pdf] [code] [annotated pdf]
- Runa Eschenhagen, Erik Daxberger, Philipp Hennig, Agustinus Kristiadi
2021-11-05, NeurIPS Workshops 2021
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Well-written and interesting paper. Short paper of just 3 pages, but with an extensive appendix which I definitely recommend going through. The method, training an ensemble and then applying the Laplace approximation to each network, is very simple and intuitively makes a lot of sense. I didn't realize that this would have basically the same test-time speed as ensembling (since they utilize that probit approximation), that's very neat. It also seems to consistently outperform ensembling a bit across almost all tasks and metrics.
- Pessimistic Bootstrapping for Uncertainty-Driven Offline Reinforcement Learning [pdf] [annotated pdf]
- Chenjia Bai, Lingxiao Wang, Zhuoran Yang, Zhi-Hong Deng, Animesh Garg, Peng Liu, Zhaoran Wang
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Reinforcement Learning]
Well-written and somewhat interesting paper. I'm not overly familiar with RL, which makes it a bit difficult for me to properly evaluate the paper's contributions. They use standard ensembles for uncertainty estimation combined with an OOD sampling regularization. I thought that the OOD sampling could be interesting, but it seems very specific to RL. I'm sure this paper is quite interesting for people doing RL, but I don't think it's overly useful for me.
- On the Pitfalls of Heteroscedastic Uncertainty Estimation with Probabilistic Neural Networks [pdf] [code] [annotated pdf]
- Maximilian Seitzer, Arash Tavakoli, Dimitrije Antic, Georg Martius
2021-09-29, ICLR 2022
- [Uncertainty Estimation]
Quite interesting and very well-written paper, I enjoyed reading it. Their analysis of fitting Gaussian regression models via the NLL is quite interesting, I didn't really expect to learn something new about this. I've seen Gaussian models outperform standard regression (L2 loss) w.r.t. accuracy in some applications/datasets, and it being the other way around in others. In the first case, I've then attributed the success of the Gaussian model to the "learned loss attenuation". The analysis in this paper could perhaps explain why you get this performance boost only in certain applications. Their beta-NLL loss could probably be quite useful, seems like a convenient tool to have.
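The beta-NLL loss is simple enough to sketch (my own illustration, up to constants): the per-sample Gaussian NLL is weighted by a stop-gradient copy of the predicted variance raised to beta, interpolating between the standard NLL (beta=0) and an effectively MSE-like loss for the mean (beta=1).
```python
import torch

def beta_nll_loss(mean, var, target, beta=0.5):
    # Per-sample Gaussian NLL (up to a constant), weighted by a stop-gradient
    # copy of the predicted variance raised to beta.
    nll = 0.5 * (torch.log(var) + (target - mean) ** 2 / var)
    return (var.detach() ** beta * nll).mean()
```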
- Sample Efficient Deep Reinforcement Learning via Uncertainty Estimation [pdf] [annotated pdf]
- Vincent Mai, Kaustubh Mani, Liam Paull
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Reinforcement Learning]
Well-written and somewhat interesting paper. I'm not overly familiar with reinforcement learning, which makes it a bit difficult for me to properly evaluate the paper's contributions, but to me it seems like fairly straightforward method modifications? To use ensembles of Gaussian models (instead of ensembles of models trained using the L2 loss) makes sense. The BIV method I didn't quite get, it seems rather ad hoc? I also don't quite get exactly how it's used in equation (10), is the ensemble of Gaussian models trained _jointly_ using this loss? I don't really know if this could be useful outside of RL.
- Laplace Redux -- Effortless Bayesian Deep Learning [pdf] [code] [annotated pdf]
- Erik Daxberger, Agustinus Kristiadi, Alexander Immer, Runa Eschenhagen, Matthias Bauer, Philipp Hennig
2021-06-28, NeurIPS 2021
- [Uncertainty Estimation]
Interesting and very well-written paper, I enjoyed reading it. I still think that ensembling probably is quite difficult to beat purely in terms of uncertainty estimation quality, but this definitely seems like a useful tool in many situations. It's not clear to me if the analytical expression for regression in "4. Approximate Predictive Distribution" is applicable also if the variance is input-dependent?
- Benchmarking Uncertainty Quantification on Biosignal Classification Tasks under Dataset Shift [pdf] [annotated pdf]
- Tong Xia, Jing Han, Cecilia Mascolo
2021-12-16, AAAI Workshops 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection] [Medical ML]
Well-written and interesting paper. They synthetically create dataset shifts (e.g. by adding Gaussian noise to the data) of increasing intensity and study whether or not the uncertainty increases as the accuracy degrades. They compare regular softmax, temperature scaling, MC-dropout, ensembling and a simple variational inference method. Their conclusion is basically that ensembling slightly outperforms the other methods, but that no method performs overly well. I think these types of studies are really useful.
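The evaluation protocol is roughly the following (my own sketch, not the paper's code; `predict_fn` is a placeholder that returns predicted labels and per-sample uncertainties):
```python
import numpy as np

def shift_benchmark(predict_fn, x, y, noise_levels=(0.0, 0.1, 0.2, 0.5, 1.0)):
    # predict_fn(x) is assumed to return (predicted labels, per-sample uncertainty).
    results = []
    for sigma in noise_levels:
        x_shifted = x + sigma * np.random.randn(*x.shape)
        preds, uncertainty = predict_fn(x_shifted)
        results.append((sigma, (preds == y).mean(), uncertainty.mean()))
    # Ideally, the mean uncertainty should grow as the accuracy degrades with sigma.
    return results
```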
- Deep Evidential Regression [pdf] [code] [annotated pdf]
- Alexander Amini, Wilko Schwarting, Ava Soleimany, Daniela Rus
2019-10-07, NeurIPS 2020
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Well-written and interesting paper. This is a good paper to read before "Natural Posterior Network: Deep Bayesian Predictive Uncertainty for Exponential Family Distributions". Their proposed method seems to have similar / slightly worse performance than a small ensemble, so the only real advantage is that it's faster at test-time? This is of course very important in many applications, but not in all. The performance also seems quite sensitive to the choice of lambda in the combined loss function (Equation (10)), according to Figure S2 in the appendix?
- On Out-of-distribution Detection with Energy-based Models [pdf] [code] [annotated pdf]
- Sven Elflein, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann
2021-07-03, ICML Workshops 2021
- [Out-of-Distribution Detection] [Energy-Based Models]
Well-written and quite interesting paper. A short paper, just 4 pages. They don't study the method from the "Energy-based Out-of-distribution Detection" paper as I had expected, but it was still a quite interesting read. The results in Section 4.2 seem interesting, especially for experiment 3, but I'm not sure that I properly understand everything.
- Natural Posterior Network: Deep Bayesian Predictive Uncertainty for Exponential Family Distributions [pdf] [annotated pdf]
- Bertrand Charpentier, Oliver Borchert, Daniel Zügner, Simon Geisler, Stephan Günnemann
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Interesting and well-written paper. I didn't quite understand all the details, I'll have to read a couple of related/background papers to be able to properly appreciate and evaluate the proposed method. I definitely feel like I would like to read up on this family of methods. Extensive experimental evaluation, and the results seem promising overall.
- Energy-based Out-of-distribution Detection [pdf] [code] [annotated pdf]
- Weitang Liu, Xiaoyun Wang, John D. Owens, Yixuan Li
2020-10-08, NeurIPS 2020
- [Out-of-Distribution Detection] [Energy-Based Models]
Interesting and well-written paper. The proposed method is quite clearly explained and makes intuitive sense (at least if you're familiar with EBMs). Compared to using the softmax score, the performance does seem to improve consistently. Seems like fine-tuning on an "auxiliary outlier dataset" is required to get really good performance though, which you can't really assume to have access to in real-world problems, I suppose?
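The energy score itself is a one-liner (my own sketch of the scoring function only, without the fine-tuning on auxiliary outlier data):
```python
import torch

def energy_score(logits, temperature=1.0):
    # E(x) = -T * logsumexp(f(x) / T); lower energy indicates in-distribution,
    # so -E(x) can be thresholded just like the max softmax score.
    return -temperature * torch.logsumexp(logits / temperature, dim=-1)
```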
- VOS: Learning What You Don't Know by Virtual Outlier Synthesis [pdf] [code] [annotated pdf]
- Xuefeng Du, Zhaoning Wang, Mu Cai, Yixuan Li
2022-02-02, ICLR 2022
- [Out-of-Distribution Detection]
Interesting and quite well-written paper. I did find it somewhat difficult to understand certain parts though, they could perhaps be explained more clearly. The results seem quite impressive (they do consistently outperform all baselines), but I find it interesting that the "Gaussian noise" baseline in Table 2 performs that well? I should probably have read "Energy-based Out-of-distribution Detection" before reading this paper.
- Efficiently Modeling Long Sequences with Structured State Spaces [pdf] [code] [annotated pdf]
- Albert Gu, Karan Goel, Christopher Ré
2021-10-31, ICLR 2022
- [Sequence Modeling]
Very interesting and quite well-written paper. Kind of neat/fun to see state-space models being used. The experimental results seem very impressive!? I didn't fully understand everything in Section 3. I had to read Section 3.4 a couple of times to understand how the parameterization actually works in practice (you have H state-space models, one for each feature dimension, so that you can map a sequence of feature vectors to another sequence of feature vectors) (and you can then also have multiple such layers of state-space models, mapping sequence --> sequence --> sequence --> ....).
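To make that parameterization concrete, here is a naive toy illustration of one such layer (my own sketch using an explicit recurrence; actual S4 uses a specific HiPPO-based parameterization and an efficient convolutional view instead):
```python
import torch

def ssm_layer(u, A, B, C):
    # u: (L, H) sequence of H-dim feature vectors. A: (H, N, N), B: (H, N), C: (H, N),
    # i.e., one independent discrete-time linear state-space model per feature dimension.
    L, H = u.shape
    x = torch.zeros(H, A.shape[-1])
    ys = []
    for t in range(L):
        # x_t = A x_{t-1} + B u_t,   y_t = C x_t   (applied per feature dimension)
        x = torch.einsum('hij,hj->hi', A, x) + B * u[t].unsqueeze(-1)
        ys.append((C * x).sum(dim=-1))
    return torch.stack(ys)  # (L, H): another sequence of feature vectors

# Stacking several such layers (with nonlinearities in between) then maps
# sequence --> sequence --> sequence, as described above.
```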
- Periodic Activation Functions Induce Stationarity [pdf] [code] [annotated pdf]
- Lassi Meronen, Martin Trapp, Arno Solin
2021-10-26, NeurIPS 2021
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Quite interesting and well-written paper. Quite a heavy read, probably need to be rather familiar with GPs to properly understand/appreciate everything. Definitely check Appendix D, it gives a better understanding of how the proposed method is applied in practice. I'm not quite sure how strong/impressive the experimental results actually are. Also seems like the method could be a bit inconvenient to implement/use?
- Reliable and Trustworthy Machine Learning for Health Using Dataset Shift Detection [pdf] [annotated pdf]
- Chunjong Park, Anas Awadalla, Tadayoshi Kohno, Shwetak Patel
2021-10-26, NeurIPS 2021
- [Out-of-Distribution Detection] [Medical ML]
Interesting and very well-written paper. Gives a good overview of the field and contains a lot of seemingly useful references. The evaluation is very comprehensive. The user study is quite neat.
- An Information-theoretic Approach to Distribution Shifts [pdf] [code] [annotated pdf]
- Marco Federici, Ryota Tomioka, Patrick Forré
2021-06-07, NeurIPS 2021
- [Theoretical Properties of Deep Learning]
Quite well-written paper overall that seemed interesting, but I found it very difficult to properly understand everything. Thus, I can't really tell how interesting/significant their analysis actually is.
- On the Importance of Gradients for Detecting Distributional Shifts in the Wild [pdf] [code] [annotated pdf]
- Rui Huang, Andrew Geng, Yixuan Li
2021-10-01, NeurIPS 2021
- [Out-of-Distribution Detection]
Quite interesting and well-written paper. The experimental results do seem promising. However, I don't quite get why the proposed method intuitively makes sense, why is it better to only use the parameters of the final network layer?
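As I understand the proposed score, it looks roughly like this (my own sketch, assuming `last_layer` is the network's final fully-connected layer and `x` is a single example):
```python
import torch
import torch.nn.functional as F

def gradnorm_score(model, last_layer, x, temperature=1.0):
    # Score: the L1 norm of the gradient, taken only w.r.t. the final layer's
    # parameters, of the KL divergence between the softmax prediction and a
    # uniform distribution. Larger norm => more likely in-distribution.
    log_probs = F.log_softmax(model(x) / temperature, dim=-1)
    kl_to_uniform = -log_probs.mean()  # KL(uniform || softmax), up to a constant
    grads = torch.autograd.grad(kl_to_uniform, tuple(last_layer.parameters()))
    return sum(g.abs().sum() for g in grads)
```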
- Masked Autoencoders Are Scalable Vision Learners [pdf] [annotated pdf]
- Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick
2021-11-11
Interesting and well-written paper. The proposed method is simple and makes a lot of intuitive sense, which is rather satisfying. After page 4, it's mostly just detailed ablations and results.
- Transferring Inductive Biases through Knowledge Distillation [pdf] [code] [annotated pdf]
- Samira Abnar, Mostafa Dehghani, Willem Zuidema
2020-05-31
- [Theoretical Properties of Deep Learning]
Quite well-written and somewhat interesting paper. I'm not very familiar with this area. I didn't spend too much time trying to properly evaluate the significance of the findings.
- Deep Classifiers with Label Noise Modeling and Distance Awareness [pdf] [annotated pdf]
- Vincent Fortuin, Mark Collier, Florian Wenzel, James Allingham, Jeremiah Liu, Dustin Tran, Balaji Lakshminarayanan, Jesse Berent, Rodolphe Jenatton, Effrosyni Kokiopoulou
2021-10-06
- [Uncertainty Estimation]
Quite interesting and well-written paper. I find the distance-awareness property more interesting than modelling of input/class-dependent label noise, so the proposed method (HetSNGP) is perhaps not overly interesting compared to the SNGP baseline.
- Grokking: Generalization Beyond Overfitting on Small Algorithmic Datasets [pdf] [code] [annotated pdf]
- Alethea Power, Yuri Burda, Harri Edwards, Igor Babuschkin, Vedant Misra
2021-05, ICLR Workshops 2021
- [Theoretical Properties of Deep Learning]
Somewhat interesting paper. The phenomenon observed in Figure 1, that validation accuracy suddenly increases long after almost perfect fitting of the training data has been achieved, is quite interesting. I didn't quite understand the datasets they use (binary operation tables).
- Learning to Simulate Complex Physics with Graph Networks [pdf] [code] [annotated pdf]
- Alvaro Sanchez-Gonzalez, Jonathan Godwin, Tobias Pfaff, Rex Ying, Jure Leskovec, Peter W. Battaglia
2020-02-21, ICML 2020
Quite well-written and somewhat interesting paper. Cool application and a bunch of neat videos. This is not really my area, so I didn't spend too much time/energy trying to fully understand everything.
- Neural Unsigned Distance Fields for Implicit Function Learning [pdf] [code] [annotated pdf]
- Julian Chibane, Aymen Mir, Gerard Pons-Moll
2020-10-26, NeurIPS 2020
- [Implicit Neural Representations]
Interesting and very well-written paper, I really enjoyed reading it! The paper also gives a good understanding of neural implicit representations in general.
- Probabilistic 3D Human Shape and Pose Estimation from Multiple Unconstrained Images in the Wild [pdf] [annotated pdf]
- Akash Sengupta, Ignas Budvytis, Roberto Cipolla
2021-03-19, CVPR 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. I read it mainly as background for "Hierarchical Kinematic Probability Distributions for 3D Human Shape and Pose Estimation from Images in the Wild" which is written by exactly the same authors. In this paper, they predict a single Gaussian distribution for the pose (instead of hierarchical matrix-Fisher distributions). Also, they mainly focus on the body shape. They also use silhouettes + 2D keypoint heatmaps as input (instead of edge-filters + 2D keypoint heatmaps).
- Synthetic Training for Accurate 3D Human Pose and Shape Estimation in the Wild [pdf] [code] [annotated pdf]
- Akash Sengupta, Ignas Budvytis, Roberto Cipolla
2020-09-21, BMVC 2020
- [3D Human Pose Estimation]
Well-written and fairly interesting paper. I read it mainly as background for "Hierarchical Kinematic Probability Distributions for 3D Human Shape and Pose Estimation from Images in the Wild" which is written by exactly the same authors. In this paper, they just use direct regression. They also use silhouettes + 2D keypoint heatmaps as input (instead of edge-filters + 2D keypoint heatmaps).
- Learning Motion Priors for 4D Human Body Capture in 3D Scenes [pdf] [code] [annotated pdf]
- Siwei Zhang, Yan Zhang, Federica Bogo, Marc Pollefeys, Siyu Tang
2021-08-23, ICCV 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. I didn't fully understand everything though, and it feels like I probably don't know this specific setting/problem well enough to fully appreciate the paper.
- Hierarchical Kinematic Probability Distributions for 3D Human Shape and Pose Estimation from Images in the Wild [pdf] [code] [annotated pdf]
- Akash Sengupta, Ignas Budvytis, Roberto Cipolla
2021-10-03, ICCV 2021
- [3D Human Pose Estimation]
Well-written and very interesting paper, I enjoyed reading it. The hierarchical distribution prediction approach makes sense and consistently outperforms the independent baseline. Using matrix-Fisher distributions makes sense. The synthetic training framework and the input representation of edge-filters + 2D keypoint heatmaps are both interesting.
- SMD-Nets: Stereo Mixture Density Networks [pdf] [code] [annotated pdf]
- Fabio Tosi, Yiyi Liao, Carolin Schmitt, Andreas Geiger
2021-04-08, CVPR 2021
- [Uncertainty Estimation]
Well-written and interesting paper. Quite easy to read and follow, the method is clearly explained and makes intuitive sense.
- We are More than Our Joints: Predicting how 3D Bodies Move [pdf] [code] [annotated pdf]
- Yan Zhang, Michael J. Black, Siyu Tang
2020-12-01, CVPR 2021
- [3D Human Pose Estimation]
Well-written and fairly interesting paper. The marker-based representation, instead of using skeleton joints, makes sense. The recursive projection scheme also makes sense, but seems very slow (2.27 sec/frame)? I didn't quite get all the details for their DCT representation of the latent space.
- imGHUM: Implicit Generative Models of 3D Human Shape and Articulated Pose [pdf] [code] [annotated pdf]
- Thiemo Alldieck, Hongyi Xu, Cristian Sminchisescu
2021-08-24, ICCV 2021
- [3D Human Pose Estimation] [Implicit Neural Representations]
Interesting and very well-written paper, I really enjoyed reading it. Interesting combination of implicit representations and 3D human modelling. The "inclusive human modelling" application is neat and important.
- DI-Fusion: Online Implicit 3D Reconstruction with Deep Priors [pdf] [code] [annotated pdf]
- Jiahui Huang, Shi-Sheng Huang, Haoxuan Song, Shi-Min Hu
2020-12-10, CVPR 2021
- [Implicit Neural Representations]
Well-written and interesting paper, I enjoyed reading it. Neat application of implicit representations. The paper also gives a quite good overview of online 3D reconstruction in general.
- Contextually Plausible and Diverse 3D Human Motion Prediction [pdf] [annotated pdf]
- Sadegh Aliakbarian, Fatemeh Sadat Saleh, Lars Petersson, Stephen Gould, Mathieu Salzmann
2019-12-18, ICCV 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. The main idea, using a learned conditional prior p(z|c) instead of just p(z), makes sense and was shown beneficial also in "HuMoR: 3D Human Motion Model for Robust Pose Estimation". I'm however somewhat confused by their specific implementation in Section 4, doesn't seem like a standard cVAE implementation?
- Local Implicit Grid Representations for 3D Scenes [pdf] [code] [annotated pdf]
- Chiyu Max Jiang, Avneesh Sud, Ameesh Makadia, Jingwei Huang, Matthias Nießner, Thomas Funkhouser
2020-03-19, CVPR 2020
- [Implicit Neural Representations]
Well-written and quite interesting paper. Interesting application, being able to reconstruct full 3D scenes from sparse point clouds. I didn't fully understand everything, as I don't have a particularly strong graphics background.
- Information Dropout: Learning Optimal Representations Through Noisy Computation [pdf] [annotated pdf]
- Alessandro Achille, Stefano Soatto
2016-11-04
Well-written and somewhat interesting paper overall. I'm not overly familiar with the topics of the paper, and didn't fully understand everything. Some results and insights seem quite interesting/neat, but I'm not sure exactly what the main takeaways should be, or how significant they actually are.
- Encoder-decoder with Multi-level Attention for 3D Human Shape and Pose Estimation [pdf] [code] [annotated pdf]
- Ziniu Wan, Zhengjia Li, Maoqing Tian, Jianbo Liu, Shuai Yi, Hongsheng Li
2021-09-06, ICCV 2021
- [3D Human Pose Estimation]
Well-written and fairly interesting paper. Quite a lot of details on the attention architecture, which I personally don't find overly interesting. The experimental results are quite impressive, but I would like to see a comparison in terms of computational cost at test-time. It sounds like their method is rather slow.
- Physics-based Human Motion Estimation and Synthesis from Videos [pdf] [annotated pdf]
- Kevin Xie, Tingwu Wang, Umar Iqbal, Yunrong Guo, Sanja Fidler, Florian Shkurti
2021-09-21, ICCV 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. The general idea, refining frame-by-frame pose estimates via physical constraints, intuitively makes a lot of sense. I did however find it quite difficult to understand all the details in Section 3.
- Hierarchical VAEs Know What They Don't Know [pdf] [code] [annotated pdf]
- Jakob D. Havtorn, Jes Frellsen, Søren Hauberg, Lars Maaløe
2021-02-16, ICML 2021
- [Uncertainty Estimation] [VAEs]
Very well-written and quite interesting paper, I enjoyed reading it. Everything is quite well-explained, it's relatively easy to follow. The paper provides a good overview of the out-of-distribution detection problem and current methods.
- Human Pose Regression with Residual Log-likelihood Estimation [pdf] [code] [annotated pdf]
- Jiefeng Li, Siyuan Bian, Ailing Zeng, Can Wang, Bo Pang, Wentao Liu, Cewu Lu
2021-07-23, ICCV 2021
- [3D Human Pose Estimation]
Quite interesting paper, but also quite strange/confusing. I don't think the proposed method is explained particularly well; at least I found it quite difficult to properly understand what they are actually doing.
In the end it seems like they are learning a global loss function that is very similar to doing probabilistic regression with a Gauss/Laplace model of p(y|x) (with learned mean and variance)? See Figure 4 in the Appendix.
And while it's true that their performance is much better than for direct regression with an L2/L1 loss (see e.g. Table 1), they only compare with Gauss/Laplace probabilistic regression once (Table 7) and in that case the Laplace model is actually quite competitive?
- NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis [pdf] [code] [annotated pdf]
- Ben Mildenhall, Pratul P. Srinivasan, Matthew Tancik, Jonathan T. Barron, Ravi Ramamoorthi, Ren Ng
2020-03-19, ECCV 2020
- [Implicit Neural Representations]
Extremely well-written and interesting paper. I really enjoyed reading it, and I would recommend anyone interested in computer vision to read it as well.
All parts of the proposed method are clearly explained and relatively easy to understand, including the volume rendering techniques which I was unfamiliar with.
- Revisiting the Calibration of Modern Neural Networks [pdf] [code] [annotated pdf]
- Matthias Minderer, Josip Djolonga, Rob Romijnders, Frances Hubis, Xiaohua Zhai, Neil Houlsby, Dustin Tran, Mario Lucic
2021-06-15, NeurIPS 2021
- [Uncertainty Estimation]
Well-written paper. Everything is quite clearly explained and easy to understand. Quite enjoyable to read overall.
Thorough experimental evaluation. Quite interesting findings.
- Differentiable Particle Filtering via Entropy-Regularized Optimal Transport [pdf] [code] [annotated pdf]
- Adrien Corenflos, James Thornton, George Deligiannidis, Arnaud Doucet
2021-02-15, ICML 2021
- Character Controllers Using Motion VAEs [pdf] [code] [annotated pdf]
- Hung Yu Ling, Fabio Zinno, George Cheng, Michiel van de Panne
2021-03-26, SIGGRAPH 2020
- [3D Human Pose Estimation]
- DeepSDF: Learning Continuous Signed Distance Functions for Shape Representation [pdf] [code] [annotated pdf]
- Jeong Joon Park, Peter Florence, Julian Straub, Richard Newcombe, Steven Lovegrove
2019-01-16, CVPR 2019
- [Implicit Neural Representations]
- Generating Multiple Hypotheses for 3D Human Pose Estimation with Mixture Density Network [pdf] [code] [annotated pdf]
- Chen Li, Gim Hee Lee
2019-04-11, CVPR 2019
- [3D Human Pose Estimation]
- Expressive Body Capture: 3D Hands, Face, and Body from a Single Image [pdf] [code] [annotated pdf]
- Georgios Pavlakos, Vasileios Choutas, Nima Ghorbani, Timo Bolkart, Ahmed A. A. Osman, Dimitrios Tzionas, Michael J. Black
2019-04-11, CVPR 2019
- [3D Human Pose Estimation]
Very well-written and quite interesting paper. Gives a good understanding of the SMPL model and the SMPLify method.
- Keep it SMPL: Automatic Estimation of 3D Human Pose and Shape from a Single Image [pdf] [annotated pdf]
- Federica Bogo, Angjoo Kanazawa, Christoph Lassner, Peter Gehler, Javier Romero, Michael J. Black
2016-07-27, ECCV 2016
- [3D Human Pose Estimation]
- Beyond Static Features for Temporally Consistent 3D Human Pose and Shape from a Video [pdf] [code] [annotated pdf]
- Hongsuk Choi, Gyeongsik Moon, Ju Yong Chang, Kyoung Mu Lee
2020-11-17, CVPR 2021
- [3D Human Pose Estimation]
- Exemplar Fine-Tuning for 3D Human Model Fitting Towards In-the-Wild 3D Human Pose Estimation [pdf] [code] [annotated pdf]
- Hanbyul Joo, Natalia Neverova, Andrea Vedaldi
2020-04-07
- [3D Human Pose Estimation]
- Learning to Reconstruct 3D Human Pose and Shape via Model-fitting in the Loop [pdf] [code] [annotated pdf]
- Nikos Kolotouros, Georgios Pavlakos, Michael J. Black, Kostas Daniilidis
2019-09-27, ICCV 2019
- [3D Human Pose Estimation]
- A simple yet effective baseline for 3d human pose estimation [pdf] [code] [annotated pdf]
- Julieta Martinez, Rayat Hossain, Javier Romero, James J. Little
2017-05-08, ICCV 2017
- [3D Human Pose Estimation]
- Estimating Egocentric 3D Human Pose in Global Space [pdf] [annotated pdf]
- Jian Wang, Lingjie Liu, Weipeng Xu, Kripasindhu Sarkar, Christian Theobalt
2021-04-27, ICCV 2021
- [3D Human Pose Estimation]
- End-to-end Recovery of Human Shape and Pose [pdf] [code] [annotated pdf]
- Angjoo Kanazawa, Michael J. Black, David W. Jacobs, Jitendra Malik
2017-12-18, CVPR 2018
- [3D Human Pose Estimation]
- 3D Multi-bodies: Fitting Sets of Plausible 3D Human Models to Ambiguous Image Data [pdf] [annotated pdf]
- Benjamin Biggs, Sébastien Ehrhadt, Hanbyul Joo, Benjamin Graham, Andrea Vedaldi, David Novotny
2020-11-02, NeurIPS 2020
- [3D Human Pose Estimation]
- HuMoR: 3D Human Motion Model for Robust Pose Estimation [pdf] [code] [annotated pdf]
- Davis Rempe, Tolga Birdal, Aaron Hertzmann, Jimei Yang, Srinath Sridhar, Leonidas J. Guibas
2021-05-10, ICCV 2021
- [3D Human Pose Estimation]
- PixelTransformer: Sample Conditioned Signal Generation [pdf] [code] [annotated pdf]
- Shubham Tulsiani, Abhinav Gupta
2021-03-29, ICML 2021
- [Neural Processes] [Transformers]
- Stiff Neural Ordinary Differential Equations [pdf] [annotated pdf]
- Suyong Kim, Weiqi Ji, Sili Deng, Yingbo Ma, Christopher Rackauckas
2021-03-29
- [Neural ODEs]
- Learning Mesh-Based Simulation with Graph Networks [pdf] [code] [annotated pdf]
- Tobias Pfaff, Meire Fortunato, Alvaro Sanchez-Gonzalez, Peter W. Battaglia
2020-10-07, ICLR 2021
- Q-Learning in enormous action spaces via amortized approximate maximization [pdf] [annotated pdf]
- Tom Van de Wiele, David Warde-Farley, Andriy Mnih, Volodymyr Mnih
2020-01-22
- [Reinforcement Learning]
- Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling [pdf] [code] [annotated pdf]
- Gregory W. Benton, Wesley J. Maddox, Sanae Lotfi, Andrew Gordon Wilson
2021-02-25, ICML 2021
- [Uncertainty Estimation] [Ensembling]
- Your GAN is Secretly an Energy-based Model and You Should use Discriminator Driven Latent Sampling [pdf] [pdf with comments]
- Tong Che, Ruixiang Zhang, Jascha Sohl-Dickstein, Hugo Larochelle, Liam Paull, Yuan Cao, Yoshua Bengio
2020-03-12, NeurIPS 2020
- [Energy-Based Models]
- Gradient Descent on Neural Networks Typically Occurs at the Edge of Stability [pdf] [pdf with comments]
- Jeremy M. Cohen, Simran Kaur, Yuanzhi Li, J. Zico Kolter, Ameet Talwalkar
2021-02-26, ICLR 2021
- [Theoretical Properties of Deep Learning]
- Unsupervised Learning of Visual Features by Contrasting Cluster Assignments [pdf] [code] [pdf with comments]
- Mathilde Caron, Ishan Misra, Julien Mairal, Priya Goyal, Piotr Bojanowski, Armand Joulin
2020-06-17, NeurIPS 2020
- Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations [pdf] [pdf with comments]
- Winnie Xu, Ricky T.Q. Chen, Xuechen Li, David Duvenaud
2021-02-12
- [Neural ODEs] [Uncertainty Estimation]
- Neural Relational Inference for Interacting Systems [pdf] [code] [pdf with comments]
- Thomas Kipf, Ethan Fetaya, Kuan-Chieh Wang, Max Welling, Richard Zemel
2018-02-13, ICML 2018
- Scaling Up Visual and Vision-Language Representation Learning With Noisy Text Supervision [pdf] [pdf with comments]
- Chao Jia, Yinfei Yang, Ye Xia, Yi-Ting Chen, Zarana Parekh, Hieu Pham, Quoc V. Le, Yunhsuan Sung, Zhen Li, Tom Duerig
2021-02-11, ICML 2021
- On the Origin of Implicit Regularization in Stochastic Gradient Descent [pdf] [pdf with comments]
- Samuel L. Smith, Benoit Dherin, David G. T. Barrett, Soham De
2021-01-28, ICLR 2021
- [Theoretical Properties of Deep Learning]
- Meta Pseudo Labels [pdf] [code] [pdf with comments]
- Hieu Pham, Zihang Dai, Qizhe Xie, Minh-Thang Luong, Quoc V. Le
2020-03-23, CVPR 2021
- No MCMC for Me: Amortized Sampling for Fast and Stable Training of Energy-Based Models [pdf] [code] [pdf with comments]
- Will Grathwohl, Jacob Kelly, Milad Hashemi, Mohammad Norouzi, Kevin Swersky, David Duvenaud
2020-10-08, ICLR 2021
- [Energy-Based Models]
- Getting a CLUE: A Method for Explaining Uncertainty Estimates [pdf] [pdf with comments]
- Javier Antorán, Umang Bhatt, Tameem Adel, Adrian Weller, José Miguel Hernández-Lobato
2020-06-11, ICLR 2021
- [Uncertainty Estimation]
- Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention [pdf] [pdf with comments]
- Angelos Katharopoulos, Apoorv Vyas, Nikolaos Pappas, François Fleuret
2020-06-29, ICML 2020
- [Transformers]
- Score-Based Generative Modeling through Stochastic Differential Equations [pdf] [code] [pdf with comments]
- Yang Song, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole
2020-11-26, ICLR 2021
- [Neural ODEs]
- Dissecting Neural ODEs [pdf] [pdf with comments]
- Stefano Massaroli, Michael Poli, Jinkyoo Park, Atsushi Yamashita, Hajime Asama
2020-02-19, NeurIPS 2020
- [Neural ODEs]
- Rethinking Attention with Performers [pdf] [pdf with comments]
- Krzysztof Choromanski, Valerii Likhosherstov, David Dohan, Xingyou Song, Andreea Gane, Tamas Sarlos, Peter Hawkins, Jared Davis, Afroz Mohiuddin, Lukasz Kaiser, David Belanger, Lucy Colwell, Adrian Weller
2020-10-30, ICLR 2021
- [Transformers]
- Very Deep VAEs Generalize Autoregressive Models and Can Outperform Them on Images [pdf] [code] [pdf with comments]
- Rewon Child
2020-11-20, ICLR 2021
- [VAEs]
- VAEBM: A Symbiosis between Variational Autoencoders and Energy-based Models [pdf] [pdf with comments]
- Zhisheng Xiao, Karsten Kreis, Jan Kautz, Arash Vahdat
2020-10-01, ICLR 2021
- [Energy-Based Models] [VAEs]
- Approximate Inference Turns Deep Networks into Gaussian Processes [pdf] [pdf with comments]
- Mohammad Emtiyaz Khan, Alexander Immer, Ehsan Abedi, Maciej Korzepa
2019-06-05, NeurIPS 2019
- [Theoretical Properties of Deep Learning]
- Implicit Gradient Regularization [pdf] [pdf with comments] [comments]
- David G.T. Barrett, Benoit Dherin
2020-09-23
- [Theoretical Properties of Deep Learning]
- Satellite Conjunction Analysis and the False Confidence Theorem [pdf] [pdf with comments] [comments]
- Michael Scott Balch, Ryan Martin, Scott Ferson
2018-03-21
- Simple and Principled Uncertainty Estimation with Deterministic Deep Learning via Distance Awareness [pdf] [pdf with comments] [comments]
- Jeremiah Zhe Liu, Zi Lin, Shreyas Padhy, Dustin Tran, Tania Bedrax-Weiss, Balaji Lakshminarayanan
2020-06-17, NeurIPS 2020
- [Uncertainty Estimation]
- Uncertainty Estimation Using a Single Deep Deterministic Neural Network [pdf] [code] [pdf with comments] [comments]
- Joost van Amersfoort, Lewis Smith, Yee Whye Teh, Yarin Gal
2020-03-04, ICML 2020
- [Uncertainty Estimation]
- Gated Linear Networks [pdf] [pdf with comments] [comments]
- Joel Veness, Tor Lattimore, David Budden, Avishkar Bhoopchand, Christopher Mattern, Agnieszka Grabska-Barwinska, Eren Sezener, Jianan Wang, Peter Toth, Simon Schmitt, Marcus Hutter
2020-06-11
- Denoising Diffusion Probabilistic Models [pdf] [code] [pdf with comments] [comments]
- Jonathan Ho, Ajay Jain, Pieter Abbeel
2020-06-19
- [Energy-Based Models]
- Joint Training of Variational Auto-Encoder and Latent Energy-Based Model [pdf] [code] [pdf with comments] [comments]
- Tian Han, Erik Nijkamp, Linqi Zhou, Bo Pang, Song-Chun Zhu, Ying Nian Wu
2020-06-10, CVPR 2020
- [VAEs] [Energy-Based Models]
- End-to-End Object Detection with Transformers [pdf] [code] [pdf with comments] [comments]
- Nicolas Carion, Francisco Massa, Gabriel Synnaeve, Nicolas Usunier, Alexander Kirillov, Sergey Zagoruyko
2020-05-26, ECCV 2020
- [Object Detection]
- Efficient and Scalable Bayesian Neural Nets with Rank-1 Factors [pdf] [code] [pdf with comments] [comments]
- Michael W. Dusenberry, Ghassen Jerfel, Yeming Wen, Yi-an Ma, Jasper Snoek, Katherine Heller, Balaji Lakshminarayanan, Dustin Tran
2020-05-14, ICML 2020
- [Uncertainty Estimation] [Variational Inference]
- BatchEnsemble: An Alternative Approach to Efficient Ensemble and Lifelong Learning [pdf] [code] [video] [pdf with comments] [comments]
- Yeming Wen, Dustin Tran, Jimmy Ba
2020-02-17, ICLR 2020
- [Uncertainty Estimation] [Ensembling]
- Stable Neural Flows [pdf] [pdf with comments] [comments]
- Stefano Massaroli, Michael Poli, Michelangelo Bin, Jinkyoo Park, Atsushi Yamashita, Hajime Asama
2020-03-18
- How Good is the Bayes Posterior in Deep Neural Networks Really? [pdf] [pdf with comments] [comments]
- Florian Wenzel, Kevin Roth, Bastiaan S. Veeling, Jakub Świątkowski, Linh Tran, Stephan Mandt, Jasper Snoek, Tim Salimans, Rodolphe Jenatton, Sebastian Nowozin
2020-02-06
- [Uncertainty Estimation] [Stochastic Gradient MCMC]
- Beyond temperature scaling: Obtaining well-calibrated multiclass probabilities with Dirichlet calibration [pdf] [code] [poster] [slides] [video] [pdf with comments] [comments]
- Meelis Kull, Miquel Perello-Nieto, Markus Kängsepp, Telmo Silva Filho, Hao Song, Peter Flach
2019-10-28, NeurIPS 2019
- [Uncertainty Estimation]
- Normalizing Flows: An Introduction and Review of Current Methods [pdf] [pdf with comments] [comments]
- Ivan Kobyzev, Simon Prince, Marcus A. Brubaker
2019-08-25
- [Normalizing Flows]
- Pitfalls of In-Domain Uncertainty Estimation and Ensembling in Deep Learning [pdf] [code] [pdf with comments] [comments]
- Arsenii Ashukha, Alexander Lyzhov, Dmitry Molchanov, Dmitry Vetrov
2020-02-15, ICLR 2020
- [Uncertainty Estimation] [Ensembling] [Stochastic Gradient MCMC]
- Conservative Uncertainty Estimation By Fitting Prior Networks [pdf] [pdf with comments] [comments]
- Kamil Ciosek, Vincent Fortuin, Ryota Tomioka, Katja Hofmann, Richard Turner
2019-10-25, ICLR 2020
- [Uncertainty Estimation]
- Batch Normalization Biases Deep Residual Networks Towards Shallow Paths [pdf] [pdf with comments] [comments]
- Soham De, Samuel L. Smith
2020-02-24
- [Theoretical Properties of Deep Learning]
- Bayesian Deep Learning and a Probabilistic Perspective of Generalization [pdf] [code] [pdf with comments] [comments]
- Andrew Gordon Wilson, Pavel Izmailov
2020-02-20
- [Uncertainty Estimation] [Ensembling]
- Convolutional Conditional Neural Processes [pdf] [code] [pdf with comments] [comments]
- Jonathan Gordon, Wessel P. Bruinsma, Andrew Y. K. Foong, James Requeima, Yann Dubois, Richard E. Turner
2019-10-29, ICLR 2020
- [Neural Processes]
- Probabilistic 3D Multi-Object Tracking for Autonomous Driving [pdf] [code] [pdf with comments] [comments]
- Hsu-kuang Chiu, Antonio Prioletti, Jie Li, Jeannette Bohg
2020-01-16
- [3D Multi-Object Tracking]
- A Baseline for 3D Multi-Object Tracking [pdf] [code] [pdf with comments] [comments]
- Xinshuo Weng, Kris Kitani
2019-07-09
- [3D Multi-Object Tracking]
- A Contrastive Divergence for Combining Variational Inference and MCMC [pdf] [code] [slides] [pdf with comments] [comments]
- Francisco J. R. Ruiz, Michalis K. Titsias
2019-05-10, ICML 2019
- [VAEs]
- Decomposition of Uncertainty in Bayesian Deep Learning for Efficient and Risk-sensitive Learning [pdf] [pdf with comments] [comments]
- Stefan Depeweg, José Miguel Hernández-Lobato, Finale Doshi-Velez, Steffen Udluft
2017-10-19, ICML 2018
- [Uncertainty Estimation] [Reinforcement Learning]
- Uncertainty Decomposition in Bayesian Neural Networks with Latent Variables [pdf] [pdf with comments] [comments]
- Stefan Depeweg, José Miguel Hernández-Lobato, Finale Doshi-Velez, Steffen Udluft
2017-06-26
- [Uncertainty Estimation] [Reinforcement Learning]
- Modelling heterogeneous distributions with an Uncountable Mixture of Asymmetric Laplacians [pdf] [code] [video] [pdf with comments] [comments]
- Axel Brando, Jose A. Rodríguez-Serrano, Jordi Vitrià, Alberto Rubio
2019-10-27, NeurIPS 2019
- [Uncertainty Estimation]
- A Primal-Dual link between GANs and Autoencoders [pdf] [poster] [pdf with comments] [comments]
- Hisham Husain, Richard Nock, Robert C. Williamson
2019-04-26, NeurIPS 2019
- [Theoretical Properties of Deep Learning]
- A Connection Between Score Matching and Denoising Autoencoders [pdf] [pdf with comments] [comments]
- Pascal Vincent
2010-12
- [Energy-Based Models]
- Multiplicative Interactions and Where to Find Them [pdf] [pdf with comments] [comments]
- Siddhant M. Jayakumar, Jacob Menick, Wojciech M. Czarnecki, Jonathan Schwarz, Jack Rae, Simon Osindero, Yee Whye Teh, Tim Harley, Razvan Pascanu
2019-09-25, ICLR 2020
- [Theoretical Properties of Deep Learning] [Sequence Modeling]
- Estimation of Non-Normalized Statistical Models by Score Matching [pdf] [pdf with comments] [comments]
- Aapo Hyvärinen
2004-11, JMLR 6
- [Energy-Based Models]
- Generative Modeling by Estimating Gradients of the Data Distribution [pdf] [code] [poster] [pdf with comments] [comments]
- Yang Song, Stefano Ermon
2019-07-12, NeurIPS 2019
- [Energy-Based Models]
- Noise-contrastive estimation: A new estimation principle for unnormalized statistical models [pdf] [pdf with comments] [comments]
- Michael Gutmann, Aapo Hyvärinen
2009, AISTATS 2010
- [Energy-Based Models]
- Z-Forcing: Training Stochastic Recurrent Networks [pdf] [code] [pdf with comments] [comments]
- Anirudh Goyal, Alessandro Sordoni, Marc-Alexandre Côté, Nan Rosemary Ke, Yoshua Bengio
2017-11-15, NeurIPS 2017
- [VAEs] [Sequence Modeling]
- Practical Deep Learning with Bayesian Principles [pdf] [code] [pdf with comments] [comments]
- Kazuki Osawa, Siddharth Swaroop, Anirudh Jain, Runa Eschenhagen, Richard E. Turner, Rio Yokota, Mohammad Emtiyaz Khan
2019-06-06, NeurIPS 2019
- [Uncertainty Estimation] [Variational Inference]
- Maximum Entropy Generators for Energy-Based Models [pdf] [code] [pdf with comments] [comments]
- Rithesh Kumar, Sherjil Ozair, Anirudh Goyal, Aaron Courville, Yoshua Bengio
2019-01-24
- [Energy-Based Models]
- Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One [pdf] [pdf with comments] [comments]
- Will Grathwohl, Kuan-Chieh Wang, Jörn-Henrik Jacobsen, David Duvenaud, Mohammad Norouzi, Kevin Swersky
2019-12-06, ICLR 2020
- [Energy-Based Models]
- Noise Contrastive Estimation and Negative Sampling for Conditional Models: Consistency and Statistical Efficiency [pdf] [pdf with comments] [comments]
- Zhuang Ma, Michael Collins
2018-09-06, EMNLP 2018
- [Energy-Based Models]
- Flow Contrastive Estimation of Energy-Based Models [pdf] [pdf with comments] [comments]
- Ruiqi Gao, Erik Nijkamp, Diederik P. Kingma, Zhen Xu, Andrew M. Dai, Ying Nian Wu
2019-12-02, CVPR 2020
- [Energy-Based Models] [Normalizing Flows]
- On the Anatomy of MCMC-Based Maximum Likelihood Learning of Energy-Based Models [pdf] [code] [pdf with comments] [comments]
- Erik Nijkamp, Mitch Hill, Tian Han, Song-Chun Zhu, Ying Nian Wu
2019-04-29, AAAI 2020
- [Energy-Based Models]
- Implicit Generation and Generalization in Energy-Based Models [pdf] [code] [blog] [pdf with comments] [comments]
- Yilun Du, Igor Mordatch
2019-04-20, NeurIPS 2019
- [Energy-Based Models]
- Learning Non-Convergent Non-Persistent Short-Run MCMC Toward Energy-Based Model [pdf] [poster] [pdf with comments] [comments]
- Erik Nijkamp, Mitch Hill, Song-Chun Zhu, Ying Nian Wu
2019-04-22, NeurIPS 2019
- [Energy-Based Models]
- A Tutorial on Energy-Based Learning [pdf] [pdf with comments] [comments]
- Yann LeCun, Sumit Chopra, Raia Hadsell, Marc Aurelio Ranzato, Fu Jie Huang
2006-08-19
- [Energy-Based Models]
- Dream to Control: Learning Behaviors by Latent Imagination [pdf] [webpage] [pdf with comments] [comments]
- Anonymous
2019-09
- Deep Latent Variable Models for Sequential Data [pdf] [pdf with comments] [comments]
- Marco Fraccaro
2018-04-13, PhD Thesis
- Learning Latent Dynamics for Planning from Pixels [pdf] [code] [blog] [pdf with comments] [comments]
- Danijar Hafner, Timothy Lillicrap, Ian Fischer, Ruben Villegas, David Ha, Honglak Lee, James Davidson
2018-11-12, ICML2019
- Learning nonlinear state-space models using deep autoencoders [pdf] [pdf with comments] [comments]
- Daniele Masti, Alberto Bemporad
2018, CDC2018
- Improving Variational Inference with Inverse Autoregressive Flow [pdf] [code] [pdf with comments] [comments]
- Diederik P. Kingma, Tim Salimans, Rafal Jozefowicz, Xi Chen, Ilya Sutskever, Max Welling
2016-06-15, NeurIPS2016
- Variational Inference with Normalizing Flows [pdf] [pdf with comments] [comments]
- Danilo Jimenez Rezende, Shakir Mohamed
2015-05-21, ICML2015
- Trellis Networks for Sequence Modeling [pdf] [code] [pdf with comments] [comments]
- Shaojie Bai, J. Zico Kolter, Vladlen Koltun
2018-10-15, ICLR2019
- Part-A^2 Net: 3D Part-Aware and Aggregation Neural Network for Object Detection from Point Cloud [pdf] [pdf with comments] [comments]
- Shaoshuai Shi, Zhe Wang, Xiaogang Wang, Hongsheng Li
2019-07-08
- PointRCNN: 3D Object Proposal Generation and Detection from Point Cloud [pdf] [code] [pdf with comments] [comments]
- Shaoshuai Shi, Xiaogang Wang, Hongsheng Li
2018-12-11, CVPR2019
- Objects as Points [pdf] [code] [pdf with comments] [comments]
- Xingyi Zhou, Dequan Wang, Philipp Krähenbühl
2019-04-16
- ATOM: Accurate Tracking by Overlap Maximization [pdf] [code] [pdf with comments] [comments]
- Martin Danelljan, Goutam Bhat, Fahad Shahbaz Khan, Michael Felsberg
2018-11-19, CVPR2019
- Acquisition of Localization Confidence for Accurate Object Detection [pdf] [code] [oral presentation] [pdf with comments] [comments]
- Borui Jiang, Ruixuan Luo, Jiayuan Mao, Tete Xiao, Yuning Jiang
2018-07-30, ECCV2018
- LaserNet: An Efficient Probabilistic 3D Object Detector for Autonomous Driving [pdf] [pdf with comments] [comments]
- Gregory P. Meyer, Ankit Laddha, Eric Kee, Carlos Vallespi-Gonzalez, Carl K. Wellington
2019-03-20, CVPR2019
- Attention Is All You Need [pdf] [pdf with comments] [comments]
- Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin
2017-06-12, NeurIPS2017
- Stochastic Gradient Descent as Approximate Bayesian Inference [pdf] [pdf with comments] [comments]
- Stephan Mandt, Matthew D. Hoffman, David M. Blei
2017-04-13, Journal of Machine Learning Research 18 (2017)
- Generating High Fidelity Images with Subscale Pixel Networks and Multidimensional Upscaling [pdf] [pdf with comments] [comments]
- Jacob Menick, Nal Kalchbrenner
2018-12-04, ICLR2019
- A recurrent neural network without chaos [pdf] [pdf with comments] [comments]
- Thomas Laurent, James von Brecht
2016-12-19, ICLR2017
- Auto-Encoding Variational Bayes [pdf] [pdf with comments (TODO!)] [comments (TODO!)]
- Diederik P Kingma, Max Welling
2014-05-01, ICLR2014
- Coupled Variational Bayes via Optimization Embedding [pdf] [poster] [code] [pdf with comments] [comments]
- Bo Dai, Hanjun Dai, Niao He, Weiyang Liu, Zhen Liu, Jianshu Chen, Lin Xiao, Le Song
NeurIPS2018
- Language Models are Unsupervised Multitask Learners [pdf] [blog post] [code] [pdf with comments] [comments]
- Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever
2019-02-14
- Predictive Uncertainty Estimation via Prior Networks [pdf] [pdf with comments] [comments]
- Andrey Malinin, Mark Gales
2018-02-28, NeurIPS2018
- Evaluating model calibration in classification [pdf] [code] [pdf with comments] [comments]
- Juozas Vaicenavicius, David Widmann, Carl Andersson, Fredrik Lindsten, Jacob Roll, Thomas B. Schön
2019-02-19, AISTATS2019
- Fine-Grained Analysis of Optimization and Generalization for Overparameterized Two-Layer Neural Networks [pdf] [pdf with comments] [comments]
- Sanjeev Arora, Simon S. Du, Wei Hu, Zhiyuan Li, Ruosong Wang
2019-01-24
- Visualizing the Loss Landscape of Neural Nets [pdf] [code] [pdf with comments] [comments]
- Hao Li, Zheng Xu, Gavin Taylor, Christoph Studer, Tom Goldstein
2017-12-28, NeurIPS2018
- A Simple Baseline for Bayesian Uncertainty in Deep Learning [pdf] [code] [pdf with comments] [comments]
- Wesley Maddox, Timur Garipov, Pavel Izmailov, Dmitry Vetrov, Andrew Gordon Wilson
2019-02-07
- Cyclical Stochastic Gradient MCMC for Bayesian Deep Learning [pdf] [code] [pdf with comments] [comments]
- Ruqi Zhang, Chunyuan Li, Jianyi Zhang, Changyou Chen, Andrew Gordon Wilson
2019-02-11
- Bayesian Dark Knowledge [pdf] [pdf with comments] [comments]
- Anoop Korattikara, Vivek Rathod, Kevin Murphy, Max Welling
2015-06-07, NeurIPS2015
- Noisy Natural Gradient as Variational Inference [pdf] [video] [code] [pdf with comments] [comments]
- Guodong Zhang, Shengyang Sun, David Duvenaud, Roger Grosse
2017-12-06, ICML2018
- Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks [pdf] [pdf with comments] [comments]
- José Miguel Hernández-Lobato, Ryan P. Adams
2015-07-15, ICML2015
- Deep Reinforcement Learning in a Handful of Trials using Probabilistic Dynamics Models [pdf] [poster] [video] [code] [pdf with comments] [summary]
- Kurtland Chua, Roberto Calandra, Rowan McAllister, Sergey Levine
2018-05-30, NeurIPS2018
- Practical Variational Inference for Neural Networks [pdf] [pdf with comments] [comments]
- Alex Graves
NeurIPS2011
- Weight Uncertainty in Neural Networks [pdf] [pdf with comments] [comments]
- Charles Blundell, Julien Cornebise, Koray Kavukcuoglu, Daan Wierstra
2015-05-20, ICML2015
- Learning Weight Uncertainty with Stochastic Gradient MCMC for Shape Classification [pdf] [poster] [pdf with comments] [comments]
- Chunyuan Li, Andrew Stevens, Changyou Chen, Yunchen Pu, Zhe Gan, Lawrence Carin
CVPR2016
- Meta-Learning For Stochastic Gradient MCMC [pdf] [code] [slides] [pdf with comments] [summary (TODO!)]
- Wenbo Gong, Yingzhen Li, José Miguel Hernández-Lobato
2018-10-28, ICLR2019
- A Complete Recipe for Stochastic Gradient MCMC [pdf] [pdf with comments] [summary]
- Yi-An Ma, Tianqi Chen, Emily B. Fox
2015-06-15, NeurIPS2015
- Tutorial: Introduction to Stochastic Gradient Markov Chain Monte Carlo Methods [pdf] [pdf with comments]
- Changyou Chen
2016-08-10
- An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling [pdf] [code] [pdf with comments] [summary]
- Shaojie Bai, J. Zico Kolter, Vladlen Koltun
2018-04-19
- Stochastic Gradient Hamiltonian Monte Carlo [pdf] [pdf with comments] [summary (TODO!)]
- Tianqi Chen, Emily B. Fox, Carlos Guestrin
2014-05-12, ICML2014
- Bayesian Learning via Stochastic Gradient Langevin Dynamics [pdf] [pdf with comments] [summary (TODO!)]
- Max Welling, Yee Whye Teh
ICML2011
- How Does Batch Normalization Help Optimization? [pdf] [poster] [video] [pdf with comments] [summary]
- Shibani Santurkar, Dimitris Tsipras, Andrew Ilyas, Aleksander Madry
2018-10-27, NeurIPS2018
- Relaxed Softmax: Efficient Confidence Auto-Calibration for Safe Pedestrian Detection [pdf] [poster] [pdf with comments] [summary]
- Lukas Neumann, Andrew Zisserman, Andrea Vedaldi
2018-11-29, NeurIPS2018 Workshop
- Neural Ordinary Differential Equations [pdf] [code] [slides] [pdf with comments] [summary]
- Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt, David Duvenaud
2018-10-22, NeurIPS2018
- [Neural ODEs]
- Evaluating Bayesian Deep Learning Methods for Semantic Segmentation [pdf] [pdf with comments] [summary]
- Jishnu Mukhoti, Yarin Gal
2018-11-30
- On Calibration of Modern Neural Networks [pdf] [code] [pdf with comments] [summary]
- Chuan Guo, Geoff Pleiss, Yu Sun, Kilian Q. Weinberger
2017-08-03, ICML2017
- Evidential Deep Learning to Quantify Classification Uncertainty [pdf] [poster] [code example] [pdf with comments] [summary]
- Murat Sensoy, Lance Kaplan, Melih Kandemir
2018-10-31, NeurIPS2018
- A Probabilistic U-Net for Segmentation of Ambiguous Images [pdf] [code] [pdf with comments] [summary]
- Simon A. A. Kohl, Bernardino Romera-Paredes, Clemens Meyer, Jeffrey De Fauw, Joseph R. Ledsam, Klaus H. Maier-Hein, S. M. Ali Eslami, Danilo Jimenez Rezende, Olaf Ronneberger
2018-10-29, NeurIPS2018
- When Recurrent Models Don't Need To Be Recurrent (a.k.a. Stable Recurrent Models) [pdf] [pdf with comments] [summary]
- John Miller, Moritz Hardt
2018-05-29, ICLR2019
- Uncertainty Estimates and Multi-Hypotheses Networks for Optical Flow [pdf] [pdf with comments] [summary]
- Eddy Ilg, Özgün Çiçek, Silvio Galesso, Aaron Klein, Osama Makansi, Frank Hutter, Thomas Brox
2018-08-06, ECCV2018
- Interpretability Beyond Feature Attribution: Quantitative Testing with Concept Activation Vectors (TCAV) [pdf] [pdf with comments] [summary]
- Been Kim, Martin Wattenberg, Justin Gilmer, Carrie Cai, James Wexler, Fernanda Viegas, Rory Sayres
2018-06-07, ICML2018
- Large-Scale Visual Active Learning with Deep Probabilistic Ensembles [pdf] [pdf with comments] [summary]
- Kashyap Chitta, Jose M. Alvarez, Adam Lesnikowski
2018-11-08
- The Lottery Ticket Hypothesis: Finding Small, Trainable Neural Networks [pdf] [pdf with comments] [summary]
- Jonathan Frankle, Michael Carbin
2018-03-09, ICLR2019
- Towards Safe Autonomous Driving: Capture Uncertainty in the Deep Neural Network For Lidar 3D Vehicle Detection [pdf] [pdf with comments] [summary]
- Di Feng, Lars Rosenbaum, Klaus Dietmayer
2018-09-08, ITSC2018
- Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes [pdf] [pdf with comments] [summary]
- Roman Novak, Lechao Xiao, Jaehoon Lee, Yasaman Bahri, Daniel A. Abolafia, Jeffrey Pennington, Jascha Sohl-Dickstein
2018-10-11, ICLR2019
- Uncertainty in Neural Networks: Bayesian Ensembling [pdf] [pdf with comments] [summary]
- Tim Pearce, Mohamed Zaki, Alexandra Brintrup, Andy Neel
2018-10-12, AISTATS2019 submission
- Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles [pdf] [pdf with comments] [summary]
- Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell
2017-11-17, NeurIPS2017
- Reliable Uncertainty Estimates in Deep Neural Networks using Noise Contrastive Priors [pdf] [pdf with comments] [summary]
- Danijar Hafner, Dustin Tran, Alex Irpan, Timothy Lillicrap, James Davidson
2018-07-24, ICML2018 Workshop
- VoxelNet: End-to-End Learning for Point Cloud Based 3D Object Detection [pdf] [pdf with comments] [summary]
- Yin Zhou, Oncel Tuzel
2017-11-17, CVPR2018
- PIXOR: Real-time 3D Object Detection from Point Clouds [pdf] [pdf with comments] [summary]
- Bin Yang, Wenjie Luo, Raquel Urtasun
CVPR2018
- On gradient regularizers for MMD GANs [pdf] [pdf with comments] [summary]
- Michael Arbel, Dougal J. Sutherland, Mikołaj Bińkowski, Arthur Gretton
2018-05-29, NeurIPS2018
- Neural Processes [pdf] [pdf with comments] [summary]
- Marta Garnelo, Jonathan Schwarz, Dan Rosenbaum, Fabio Viola, Danilo J. Rezende, S.M. Ali Eslami, Yee Whye Teh
2018-07-04, ICML2018 Workshop
- Conditional Neural Processes [pdf] [pdf with comments] [summary]
- Marta Garnelo, Dan Rosenbaum, Chris J. Maddison, Tiago Ramalho, David Saxton, Murray Shanahan, Yee Whye Teh, Danilo J. Rezende, S. M. Ali Eslami
2018-07-04, ICML2018
- Neural Autoregressive Flows [pdf] [pdf with comments] [summary]
- Chin-Wei Huang, David Krueger, Alexandre Lacoste, Aaron Courville
2018-04-03, ICML2018
- Deep Confidence: A Computationally Efficient Framework for Calculating Reliable Errors for Deep Neural Networks [pdf] [pdf with comments] [summary]
- Isidro Cortes-Ciriano, Andreas Bender
2018-09-24
- Leveraging Heteroscedastic Aleatoric Uncertainties for Robust Real-Time LiDAR 3D Object Detection [pdf] [pdf with comments] [summary]
- Di Feng, Lars Rosenbaum, Fabian Timm, Klaus Dietmayer
2018-09-14
- Lightweight Probabilistic Deep Networks [pdf] [pdf with comments] [summary]
- Jochen Gast, Stefan Roth
2018-05-29, CVPR2018
- What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision? [pdf] [pdf with comments] [summary]
- Alex Kendall, Yarin Gal
2017-10-05, NeurIPS2017
- Gaussian Process Behaviour in Wide Deep Neural Networks [pdf] [pdf with comments] [summary]
- Alexander G. de G. Matthews, Mark Rowland, Jiri Hron, Richard E. Turner, Zoubin Ghahramani
2018-08-16, ICLR2018
- A Deep Bayesian Neural Network for Cardiac Arrhythmia Classification with Rejection from ECG Recordings [pdf] [code] [annotated pdf]
- Wenrui Zhang, Xinxin Di, Guodong Wei, Shijia Geng, Zhaoji Fu, Shenda Hong
2022-02-26
- [Uncertainty Estimation] [Medical ML]
Somewhat interesting paper. They use a softmax model with MC-dropout to compute uncertainty estimates. The evaluation is not very extensive, they mostly just check that the classification accuracy improves as they reject more and more samples based on an uncertainty threshold.
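As a rough illustration of this kind of MC-dropout + rejection pipeline (my own minimal sketch, not the paper's actual code; it assumes a PyTorch classifier `model` containing dropout layers and uses predictive entropy as the uncertainty score):

```python
import torch
import torch.nn.functional as F

def mc_dropout_predict(model, x, n_samples=20):
    """Keep dropout active at test time and average the softmax outputs."""
    model.train()  # enables dropout (note: also affects batch-norm layers, if any)
    with torch.no_grad():
        probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    mean_probs = probs.mean(dim=0)
    # Predictive entropy of the averaged distribution as the uncertainty score.
    entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(dim=-1)
    return mean_probs.argmax(dim=-1), entropy

def reject_by_threshold(preds, uncertainty, threshold):
    """Abstain on samples whose uncertainty exceeds the threshold."""
    keep = uncertainty <= threshold
    return preds[keep], keep
```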
- UncertaINR: Uncertainty Quantification of End-to-End Implicit Neural Representations for Computed Tomography [pdf] [annotated pdf]
- Francisca Vasconcelos, Bobby He, Nalini Singh, Yee Whye Teh
2022-02-22
- [Implicit Neural Representations] [Uncertainty Estimation] [Medical ML]
Interesting and well-written paper. I wasn't very familiar with CT image reconstruction, but they do a good job explaining everything. Interesting that MC-dropout seems important for getting well-calibrated predictions.
- Can You Trust Predictive Uncertainty Under Real Dataset Shifts in Digital Pathology? [pdf] [annotated pdf]
- Jeppe Thagaard, Søren Hauberg, Bert van der Vegt, Thomas Ebstrup, Johan D. Hansen, Anders B. Dahl
2020-09, MICCAI 2020
- [Uncertainty Estimation] [Out-of-Distribution Detection] [Medical ML]
Quite interesting and well-written paper. They compare MC-dropout, ensembling and mixup (and with a standard softmax classifier as the baseline). Nothing groundbreaking, but the studied application (classification of pathology slides for cancer) is very interesting. The FPR95 metrics for OOD detection in Table 4 are terrible for ensembling, but the classification accuracy (89.7) is also pretty much the same as for D_test_int in Table 3 (90.1)? So, it doesn't really matter that the model isn't capable of distinguishing this "OOD" data from in-distribution?
- Robust Uncertainty Estimates with Out-of-Distribution Pseudo-Inputs Training [pdf] [annotated pdf]
- Pierre Segonne, Yevgen Zainchkovskyy, Søren Hauberg
2022-01-15
- [Uncertainty Estimation]
Somewhat interesting paper. I didn't quite understand everything, so it could be more interesting than I think. The fact that their pseudo-input generation process "relies on the availability of a differentiable density estimate of the data" seems like a big limitation? For regression, they only applied their method to very low-dimensional input data (1D toy regression and UCI benchmarks), but would this work for image-based tasks?
- Noise Contrastive Priors for Functional Uncertainty [pdf] [code] [annotated pdf]
- Danijar Hafner, Dustin Tran, Timothy Lillicrap, Alex Irpan, James Davidson
2018-07-24, UAI 2019
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Quite interesting and well-written paper. Only experiments on a toy 1D regression problem, and flight delay prediction in which the input is 8D. The approach of just adding noise to the input x to get OOD samples would probably not work very well e.g. for image-based problems?
- Being a Bit Frequentist Improves Bayesian Neural Networks [pdf] [code] [annotated pdf]
- Agustinus Kristiadi, Matthias Hein, Philipp Hennig
2021-06-18, AISTATS 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Interesting and well-written paper. The proposed method makes intuitive sense, trying to incorporate the "OOD training" method (i.e., to use some kind of OOD data during training, similar to e.g. the "Deep Anomaly Detection with Outlier Exposure" paper) into the Bayesian deep learning approach. The experimental results do seem quite promising.
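For reference, a generic sketch of what this kind of "OOD training" typically looks like (in the style of Outlier Exposure, not the Bayesian variant proposed in this paper); `model`, the outlier batch `x_out` and the weight `lam` are just illustrative:

```python
import torch.nn.functional as F

def ood_training_step(model, optimizer, x_in, y_in, x_out, lam=0.5):
    """Standard cross-entropy on in-distribution data, plus a term pushing
    the predictions on outlier inputs towards the uniform distribution."""
    optimizer.zero_grad()
    loss_in = F.cross_entropy(model(x_in), y_in)
    # Cross-entropy to the uniform distribution (up to an additive constant).
    loss_out = -F.log_softmax(model(x_out), dim=-1).mean()
    loss = loss_in + lam * loss_out
    loss.backward()
    optimizer.step()
    return loss.item()
```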
- Mixtures of Laplace Approximations for Improved Post-Hoc Uncertainty in Deep Learning [pdf] [code] [annotated pdf]
- Runa Eschenhagen, Erik Daxberger, Philipp Hennig, Agustinus Kristiadi
2021-11-05, NeurIPS Workshops 2021
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Well-written and interesting paper. Short paper of just 3 pages, but with an extensive appendix which I definitely recommend going through. The method, training an ensemble and then applying the Laplace approximation to each network, is very simple and intuitively makes a lot of sense. I didn't realize that this would have basically the same test-time speed as ensembling (since they utilize that probit approximation), that's very neat. It also seems to consistently outperform ensembling a bit across almost all tasks and metrics.
- Pessimistic Bootstrapping for Uncertainty-Driven Offline Reinforcement Learning [pdf] [annotated pdf]
- Chenjia Bai, Lingxiao Wang, Zhuoran Yang, Zhi-Hong Deng, Animesh Garg, Peng Liu, Zhaoran Wang
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Reinforcement Learning]
Well-written and somewhat interesting paper. I'm not overly familiar with RL, which makes it a bit difficult for me to properly evaluate the paper's contributions. They use standard ensembles for uncertainty estimation combined with an OOD sampling regularization. I thought that the OOD sampling could be interesting, but it seems very specific to RL. I'm sure this paper is quite interesting for people doing RL, but I don't think it's overly useful for me.
- On the Pitfalls of Heteroscedastic Uncertainty Estimation with Probabilistic Neural Networks [pdf] [code] [annotated pdf]
- Maximilian Seitzer, Arash Tavakoli, Dimitrije Antic, Georg Martius
2021-09-29, ICLR 2022
- [Uncertainty Estimation]
Quite interesting and very well-written paper, I enjoyed reading it. Their analysis of fitting Gaussian regression models via the NLL is quite interesting, I didn't really expect to learn something new about this. I've seen Gaussian models outperform standard regression (L2 loss) w.r.t. accuracy in some applications/datasets, and the other way around in others. In the first case, I've then attributed the success of the Gaussian model to the "learned loss attenuation". The analysis in this paper could perhaps explain why you get this performance boost only in certain applications. Their beta-NLL loss could probably be quite useful, seems like a convenient tool to have.
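The beta-NLL loss itself is simple; a small sketch of how I understand it (beta = 0 recovers the standard Gaussian NLL, beta = 1 weights samples roughly like an MSE loss), check the authors' code for the exact form:

```python
import torch

def beta_nll_loss(mu, var, y, beta=0.5):
    """Gaussian NLL per sample, re-weighted by a stop-gradient factor var**beta.

    beta = 0: standard NLL. beta = 1: each sample is weighted roughly as in an
    MSE loss, trading off the "learned loss attenuation" effect.
    """
    nll = 0.5 * (torch.log(var) + (y - mu) ** 2 / var)
    weight = var.detach() ** beta  # stop-gradient: the weighting itself is not optimized
    return (weight * nll).mean()
```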
- Sample Efficient Deep Reinforcement Learning via Uncertainty Estimation [pdf] [annotated pdf]
- Vincent Mai, Kaustubh Mani, Liam Paull
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Reinforcement Learning]
Well-written and somewhat interesting paper. I'm not overly familiar with reinforcement learning, which makes it a bit difficult for me to properly evaluate the paper's contributions, but to me they seem like fairly straightforward method modifications? Using ensembles of Gaussian models (instead of ensembles of models trained using the L2 loss) makes sense. I didn't quite get the BIV method, it seems rather ad hoc? I also don't quite get exactly how it's used in equation (10), is the ensemble of Gaussian models trained _jointly_ using this loss? I don't really know if this could be useful outside of RL.
- Laplace Redux -- Effortless Bayesian Deep Learning [pdf] [code] [annotated pdf]
- Erik Daxberger, Agustinus Kristiadi, Alexander Immer, Runa Eschenhagen, Matthias Bauer, Philipp Hennig
2021-06-28, NeurIPS 2021
- [Uncertainty Estimation]
Interesting and very well-written paper, I enjoyed reading it. I still think that ensembling probably is quite difficult to beat purely in terms of uncertainty estimation quality, but this definitely seems like a useful tool in many situations. It's not clear to me if the analytical expression for regression in "4. Approximate Predictive Distribution" is applicable also if the variance is input-dependent?
- Benchmarking Uncertainty Quantification on Biosignal Classification Tasks under Dataset Shift [pdf] [annotated pdf]
- Tong Xia, Jing Han, Cecilia Mascolo
2021-12-16, AAAI Workshops 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection] [Medical ML]
Well-written and interesting paper. They synthetically create dataset shifts (e.g. by adding Gaussian noise to the data) of increasing intensity and study whether or not the uncertainty increases as the accuracy degrades. They compare regular softmax, temperature scaling, MC-dropout, ensembling and a simple variational inference method. Their conclusion is basically that ensembling slightly outperforms the other methods, but that no method performs overly well. I think these types of studies are really useful.
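A rough sketch of this kind of shift study (my own simplification, not the authors' protocol); `predict_fn` is assumed to return class predictions and a per-sample uncertainty score:

```python
import numpy as np

def evaluate_under_shift(predict_fn, x, y, noise_levels=(0.0, 0.1, 0.2, 0.5, 1.0)):
    """Corrupt the inputs with Gaussian noise of increasing intensity and record
    accuracy and mean uncertainty; ideally uncertainty grows as accuracy drops."""
    results = []
    for sigma in noise_levels:
        x_shifted = x + np.random.normal(0.0, sigma, size=x.shape)
        preds, uncertainty = predict_fn(x_shifted)
        results.append({
            "sigma": sigma,
            "accuracy": float((preds == y).mean()),
            "mean_uncertainty": float(np.mean(uncertainty)),
        })
    return results
```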
- Deep Evidential Regression [pdf] [code] [annotated pdf]
- Alexander Amini, Wilko Schwarting, Ava Soleimany, Daniela Rus
2019-10-07, NeurIPS 2020
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Well-written and interesting paper. This is a good paper to read before "Natural Posterior Network: Deep Bayesian Predictive Uncertainty for Exponential Family Distributions". Their proposed method seems to have similar / slightly worse performance than a small ensemble, so the only real advantage is that it's faster at test-time? This is of course very important in many applications, but not in all. The performance also seems quite sensitive to the choice of lambda in the combined loss function (Equation (10)), according to Figure S2 in the appendix?
- Natural Posterior Network: Deep Bayesian Predictive Uncertainty for Exponential Family Distributions [pdf] [annotated pdf]
- Bertrand Charpentier, Oliver Borchert, Daniel Zügner, Simon Geisler, Stephan Günnemann
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Interesting and well-written paper. I didn't quite understand all the details, I'll have to read a couple of related/background papers to be able to properly appreciate and evaluate the proposed method. I definitely feel like I would like to read up on this family of methods. Extensive experimental evaluation, and the results seem promising overall.
- Periodic Activation Functions Induce Stationarity [pdf] [code] [annotated pdf]
- Lassi Meronen, Martin Trapp, Arno Solin
2021-10-26, NeurIPS 2021
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Quite interesting and well-written paper. Quite a heavy read, probably need to be rather familiar with GPs to properly understand/appreciate everything. Definitely check Appendix D, it gives a better understanding of how the proposed method is applied in practice. I'm not quite sure how strong/impressive the experimental results actually are. Also seems like the method could be a bit inconvenient to implement/use?
- Deep Classifiers with Label Noise Modeling and Distance Awareness [pdf] [annotated pdf]
- Vincent Fortuin, Mark Collier, Florian Wenzel, James Allingham, Jeremiah Liu, Dustin Tran, Balaji Lakshminarayanan, Jesse Berent, Rodolphe Jenatton, Effrosyni Kokiopoulou
2021-10-06
- [Uncertainty Estimation]
Quite interesting and well-written paper. I find the distance-awareness property more interesting than modelling of input/class-dependent label noise, so the proposed method (HetSNGP) is perhaps not overly interesting compared to the SNGP baseline.
- SMD-Nets: Stereo Mixture Density Networks [pdf] [code] [annotated pdf]
- Fabio Tosi, Yiyi Liao, Carolin Schmitt, Andreas Geiger
2021-04-08, CVPR 2021
- [Uncertainty Estimation]
Well-written and interesting paper. Quite easy to read and follow, the method is clearly explained and makes intuitive sense.
- Hierarchical VAEs Know What They Don't Know [pdf] [code] [annotated pdf]
- Jakob D. Havtorn, Jes Frellsen, Søren Hauberg, Lars Maaløe
2021-02-16, ICML 2021
- [Uncertainty Estimation] [VAEs]
Very well-written and quite interesting paper, I enjoyed reading it. Everything is quite well-explained, it's relatively easy to follow. The paper provides a good overview of the out-of-distribution detection problem and current methods.
- Revisiting the Calibration of Modern Neural Networks [pdf] [code] [annotated pdf]
- Matthias Minderer, Josip Djolonga, Rob Romijnders, Frances Hubis, Xiaohua Zhai, Neil Houlsby, Dustin Tran, Mario Lucic
2021-06-15, NeurIPS 2021
- [Uncertainty Estimation]
Well-written paper. Everything is quite clearly explained and easy to understand. Quite enjoyable to read overall.
Thorough experimental evaluation. Quite interesting findings.
- Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling [pdf] [code] [annotated pdf]
- Gregory W. Benton, Wesley J. Maddox, Sanae Lotfi, Andrew Gordon Wilson
2021-02-25, ICML 2021
- [Uncertainty Estimation] [Ensembling]
- Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations [pdf] [pdf with comments]
- Winnie Xu, Ricky T.Q. Chen, Xuechen Li, David Duvenaud
2021-02-12
- [Neural ODEs] [Uncertainty Estimation]
- Getting a CLUE: A Method for Explaining Uncertainty Estimates [pdf] [pdf with comments]
- Javier Antorán, Umang Bhatt, Tameem Adel, Adrian Weller, José Miguel Hernández-Lobato
2020-06-11, ICLR 2021
- [Uncertainty Estimation]
- Simple and Principled Uncertainty Estimation with Deterministic Deep Learning via Distance Awareness [pdf] [pdf with comments] [comments]
- Jeremiah Zhe Liu, Zi Lin, Shreyas Padhy, Dustin Tran, Tania Bedrax-Weiss, Balaji Lakshminarayanan
2020-06-17, NeurIPS 2020
- [Uncertainty Estimation]
- Uncertainty Estimation Using a Single Deep Deterministic Neural Network [pdf] [code] [pdf with comments] [comments]
- Joost van Amersfoort, Lewis Smith, Yee Whye Teh, Yarin Gal
2020-03-04, ICML 2020
- [Uncertainty Estimation]
- Efficient and Scalable Bayesian Neural Nets with Rank-1 Factors [pdf] [code] [pdf with comments] [comments]
- Michael W. Dusenberry, Ghassen Jerfel, Yeming Wen, Yi-an Ma, Jasper Snoek, Katherine Heller, Balaji Lakshminarayanan, Dustin Tran
2020-05-14, ICML 2020
- [Uncertainty Estimation] [Variational Inference]
- BatchEnsemble: An Alternative Approach to Efficient Ensemble and Lifelong Learning [pdf] [code] [video] [pdf with comments] [comments]
- Yeming Wen, Dustin Tran, Jimmy Ba
2020-02-17, ICLR 2020
- [Uncertainty Estimation] [Ensembling]
- How Good is the Bayes Posterior in Deep Neural Networks Really? [pdf] [pdf with comments] [comments]
- Florian Wenzel, Kevin Roth, Bastiaan S. Veeling, Jakub Świątkowski, Linh Tran, Stephan Mandt, Jasper Snoek, Tim Salimans, Rodolphe Jenatton, Sebastian Nowozin
2020-02-06
- [Uncertainty Estimation] [Stochastic Gradient MCMC]
- Beyond temperature scaling: Obtaining well-calibrated multiclass probabilities with Dirichlet calibration [pdf] [code] [poster] [slides] [video] [pdf with comments] [comments]
- Meelis Kull, Miquel Perello-Nieto, Markus Kängsepp, Telmo Silva Filho, Hao Song, Peter Flach
2019-10-28, NeurIPS 2019
- [Uncertainty Estimation]
- Pitfalls of In-Domain Uncertainty Estimation and Ensembling in Deep Learning [pdf] [code] [pdf with comments] [comments]
- Arsenii Ashukha, Alexander Lyzhov, Dmitry Molchanov, Dmitry Vetrov
2020-02-15, ICLR 2020
- [Uncertainty Estimation] [Ensembling] [Stochastic Gradient MCMC]
- Conservative Uncertainty Estimation By Fitting Prior Networks [pdf] [pdf with comments] [comments]
- Kamil Ciosek, Vincent Fortuin, Ryota Tomioka, Katja Hofmann, Richard Turner
2019-10-25, ICLR 2020
- [Uncertainty Estimation]
- Bayesian Deep Learning and a Probabilistic Perspective of Generalization [pdf] [code] [pdf with comments] [comments]
- Andrew Gordon Wilson, Pavel Izmailov
2020-02-20
- [Uncertainty Estimation] [Ensembling]
- Decomposition of Uncertainty in Bayesian Deep Learning for Efficient and Risk-sensitive Learning [pdf] [pdf with comments] [comments]
- Stefan Depeweg, José Miguel Hernández-Lobato, Finale Doshi-Velez, Steffen Udluft
2017-10-19, ICML 2018
- [Uncertainty Estimation] [Reinforcement Learning]
- Uncertainty Decomposition in Bayesian Neural Networks with Latent Variables [pdf] [pdf with comments] [comments]
- Stefan Depeweg, José Miguel Hernández-Lobato, Finale Doshi-Velez, Steffen Udluft
2017-06-26
- [Uncertainty Estimation] [Reinforcement Learning]
- Modelling heterogeneous distributions with an Uncountable Mixture of Asymmetric Laplacians [pdf] [code] [video] [pdf with comments] [comments]
- Axel Brando, Jose A. Rodríguez-Serrano, Jordi Vitrià, Alberto Rubio
2019-10-27, NeurIPS 2019
- [Uncertainty Estimation]
- Practical Deep Learning with Bayesian Principles [pdf] [code] [pdf with comments] [comments]
- Kazuki Osawa, Siddharth Swaroop, Anirudh Jain, Runa Eschenhagen, Richard E. Turner, Rio Yokota, Mohammad Emtiyaz Khan
2019-06-06, NeurIPS 2019
- [Uncertainty Estimation] [Variational Inference]
- Acquisition of Localization Confidence for Accurate Object Detection [pdf] [code] [oral presentation] [pdf with comments] [comments]
- Borui Jiang, Ruixuan Luo, Jiayuan Mao, Tete Xiao, Yuning Jiang
2018-07-30, ECCV2018
- LaserNet: An Efficient Probabilistic 3D Object Detector for Autonomous Driving [pdf] [pdf with comments] [comments]
- Gregory P. Meyer, Ankit Laddha, Eric Kee, Carlos Vallespi-Gonzalez, Carl K. Wellington
2019-03-20, CVPR2019
- Stochastic Gradient Descent as Approximate Bayesian Inference [pdf] [pdf with comments] [comments]
- Stephan Mandt, Matthew D. Hoffman, David M. Blei
2017-04-13, Journal of Machine Learning Research 18 (2017)
- Predictive Uncertainty Estimation via Prior Networks [pdf] [pdf with comments] [comments]
- Andrey Malinin, Mark Gales
2018-02-28, NeurIPS2018
- Evaluating model calibration in classification [pdf] [code] [pdf with comments] [comments]
- Juozas Vaicenavicius, David Widmann, Carl Andersson, Fredrik Lindsten, Jacob Roll, Thomas B. Schön
2019-02-19, AISTATS2019
- A Simple Baseline for Bayesian Uncertainty in Deep Learning [pdf] [code] [pdf with comments] [comments]
- Wesley Maddox, Timur Garipov, Pavel Izmailov, Dmitry Vetrov, Andrew Gordon Wilson
2019-02-07
- Cyclical Stochastic Gradient MCMC for Bayesian Deep Learning [pdf] [code] [pdf with comments] [comments]
- Ruqi Zhang, Chunyuan Li, Jianyi Zhang, Changyou Chen, Andrew Gordon Wilson
2019-02-11
- Bayesian Dark Knowledge [pdf] [pdf with comments] [comments]
- Anoop Korattikara, Vivek Rathod, Kevin Murphy, Max Welling
2015-06-07, NeurIPS2015
- Noisy Natural Gradient as Variational Inference [pdf] [video] [code] [pdf with comments] [comments]
- Guodong Zhang, Shengyang Sun, David Duvenaud, Roger Grosse
2017-12-06, ICML2018
- Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks [pdf] [pdf with comments] [comments]
- José Miguel Hernández-Lobato, Ryan P. Adams
2015-07-15, ICML2015
- Deep Reinforcement Learning in a Handful of Trials using Probabilistic Dynamics Models [pdf] [poster] [video] [code] [pdf with comments] [summary]
- Kurtland Chua, Roberto Calandra, Rowan McAllister, Sergey Levine
2018-05-30, NeurIPS2018
- Practical Variational Inference for Neural Networks [pdf] [pdf with comments] [comments]
- Alex Graves
NeurIPS2011
- Weight Uncertainty in Neural Networks [pdf] [pdf with comments] [comments]
- Charles Blundell, Julien Cornebise, Koray Kavukcuoglu, Daan Wierstra
2015-05-20, ICML2015
- Learning Weight Uncertainty with Stochastic Gradient MCMC for Shape Classification [pdf] [poster] [pdf with comments] [comments]
- Chunyuan Li, Andrew Stevens, Changyou Chen, Yunchen Pu, Zhe Gan, Lawrence Carin
CVPR2016
- Meta-Learning For Stochastic Gradient MCMC [pdf] [code] [slides] [pdf with comments] [summary (TODO!)]
- Wenbo Gong, Yingzhen Li, José Miguel Hernández-Lobato
2018-10-28, ICLR2019
- A Complete Recipe for Stochastic Gradient MCMC [pdf] [pdf with comments] [summary]
- Yi-An Ma, Tianqi Chen, Emily B. Fox
2015-06-15, NeurIPS2015
- Tutorial: Introduction to Stochastic Gradient Markov Chain Monte Carlo Methods [pdf] [pdf with comments]
- Changyou Chen
2016-08-10
- Stochastic Gradient Hamiltonian Monte Carlo [pdf] [pdf with comments] [summary (TODO!)]
- Tianqi Chen, Emily B. Fox, Carlos Guestrin
2014-05-12, ICML2014
- Bayesian Learning via Stochastic Gradient Langevin Dynamics [pdf] [pdf with comments] [summary (TODO!)]
- Max Welling, Yee Whye Teh
ICML2011
- Relaxed Softmax: Efficient Confidence Auto-Calibration for Safe Pedestrian Detection [pdf] [poster] [pdf with comments] [summary]
- Lukas Neumann, Andrew Zisserman, Andrea Vedaldi
2018-11-29, NeurIPS2018 Workshop
- Evaluating Bayesian Deep Learning Methods for Semantic Segmentation [pdf] [pdf with comments] [summary]
- Jishnu Mukhoti, Yarin Gal
2018-11-30
- On Calibration of Modern Neural Networks [pdf] [code] [pdf with comments] [summary]
- Chuan Guo, Geoff Pleiss, Yu Sun, Kilian Q. Weinberger
2017-08-03, ICML2017
- Evidential Deep Learning to Quantify Classification Uncertainty [pdf] [poster] [code example] [pdf with comments] [summary]
- Murat Sensoy, Lance Kaplan, Melih Kandemir
2018-10-31, NeurIPS2018
- A Probabilistic U-Net for Segmentation of Ambiguous Images [pdf] [code] [pdf with comments] [summary]
- Simon A. A. Kohl, Bernardino Romera-Paredes, Clemens Meyer, Jeffrey De Fauw, Joseph R. Ledsam, Klaus H. Maier-Hein, S. M. Ali Eslami, Danilo Jimenez Rezende, Olaf Ronneberger
2018-10-29, NeurIPS2018
- Uncertainty Estimates and Multi-Hypotheses Networks for Optical Flow [pdf] [pdf with comments] [summary]
- Eddy Ilg, Özgün Çiçek, Silvio Galesso, Aaron Klein, Osama Makansi, Frank Hutter, Thomas Brox
2018-08-06, ECCV2018
- Large-Scale Visual Active Learning with Deep Probabilistic Ensembles [pdf] [pdf with comments] [summary]
- Kashyap Chitta, Jose M. Alvarez, Adam Lesnikowski
2018-11-08
- Towards Safe Autonomous Driving: Capture Uncertainty in the Deep Neural Network For Lidar 3D Vehicle Detection [pdf] [pdf with comments] [summary]
- Di Feng, Lars Rosenbaum, Klaus Dietmayer
2018-09-08, ITSC2018
- Uncertainty in Neural Networks: Bayesian Ensembling [pdf] [pdf with comments] [summary]
- Tim Pearce, Mohamed Zaki, Alexandra Brintrup, Andy Neel
2018-10-12, AISTATS2019 submission
- Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles [pdf] [pdf with comments] [summary]
- Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell
2017-11-17, NeurIPS2017
- Reliable Uncertainty Estimates in Deep Neural Networks using Noise Contrastive Priors [pdf] [pdf with comments] [summary]
- Danijar Hafner, Dustin Tran, Alex Irpan, Timothy Lillicrap, James Davidson
2018-07-24, ICML2018 Workshop
- Deep Confidence: A Computationally Efficient Framework for Calculating Reliable Errors for Deep Neural Networks [pdf] [pdf with comments] [summary]
- Isidro Cortes-Ciriano, Andreas Bender
2018-09-24
- Leveraging Heteroscedastic Aleatoric Uncertainties for Robust Real-Time LiDAR 3D Object Detection [pdf] [pdf with comments] [summary]
- Di Feng, Lars Rosenbaum, Fabian Timm, Klaus Dietmayer
2018-09-14
- Lightweight Probabilistic Deep Networks [pdf] [pdf with comments] [summary]
- Jochen Gast, Stefan Roth
2018-05-29, CVPR2018
- What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision? [pdf] [pdf with comments] [summary]
- Alex Kendall, Yarin Gal
2017-10-05, NeurIPS2017
- Out of Distribution Data Detection Using Dropout Bayesian Neural Networks [pdf] [annotated pdf]
- Andre T. Nguyen, Fred Lu, Gary Lopez Munoz, Edward Raff, Charles Nicholas, James Holt
2022-02-18, AAAI 2022
- [Out-of-Distribution Detection]
Quite interesting and well-written paper. It seemed quite niche at first, but I think their analysis could potentially be useful.
- Enhancing The Reliability of Out-of-distribution Image Detection in Neural Networks [pdf] [code] [annotated pdf]
- Shiyu Liang, Yixuan Li, R. Srikant
2017-06-08, ICLR 2018
- [Out-of-Distribution Detection]
Quite interesting and well-written paper. Two simple modifications of the "maximum softmax score" baseline, and the performance is consistently improved. The input perturbation method is quite interesting. Intuitively, it's not entirely clear to me why it actually works.
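As far as I understand it, the two modifications are temperature scaling and a small input perturbation that increases the (scaled) softmax score of the predicted class; a minimal sketch of that idea (hyperparameter values are just placeholders and would need tuning per dataset):

```python
import torch
import torch.nn.functional as F

def odin_score(model, x, temperature=1000.0, epsilon=0.001):
    """Temperature-scaled max-softmax score computed on a slightly perturbed input."""
    x = x.clone().requires_grad_(True)
    log_probs = F.log_softmax(model(x) / temperature, dim=-1)
    # Perturb x in the direction that increases the predicted class' (scaled) softmax score.
    grad = torch.autograd.grad(log_probs.max(dim=-1).values.sum(), x)[0]
    x_perturbed = x + epsilon * grad.sign()
    with torch.no_grad():
        probs = F.softmax(model(x_perturbed) / temperature, dim=-1)
    return probs.max(dim=-1).values  # higher score = more likely in-distribution
```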
- Confidence-based Out-of-Distribution Detection: A Comparative Study and Analysis [pdf] [code] [annotated pdf]
- Christoph Berger, Magdalini Paschali, Ben Glocker, Konstantinos Kamnitsas
2021-07-06, MICCAI Workshops 2021
- [Out-of-Distribution Detection] [Medical ML]
Interesting and well-written paper. Interesting that Mahalanobis works very well on the CIFAR10 vs SVHN but not on the medical imaging dataset. I don't quite get how/why the ODIN method works, I'll probably have to read that paper.
- Can You Trust Predictive Uncertainty Under Real Dataset Shifts in Digital Pathology? [pdf] [annotated pdf]
- Jeppe Thagaard, Søren Hauberg, Bert van der Vegt, Thomas Ebstrup, Johan D. Hansen, Anders B. Dahl
2020-09, MICCAI 2020
- [Uncertainty Estimation] [Out-of-Distribution Detection] [Medical ML]
Quite interesting and well-written paper. They compare MC-dropout, ensembling and mixup (and with a standard softmax classifier as the baseline). Nothing groundbreaking, but the studied application (classification of pathology slides for cancer) is very interesting. The FPR95 metrics for OOD detection in Table 4 are terrible for ensembling, but the classification accuracy (89.7) is also pretty much the same as for D_test_int in Table 3 (90.1)? So, it doesn't really matter that the model isn't capable of distinguishing this "OOD" data from in-distribution?
- Contrastive Training for Improved Out-of-Distribution Detection [pdf] [annotated pdf]
- Jim Winkens, Rudy Bunel, Abhijit Guha Roy, Robert Stanforth, Vivek Natarajan, Joseph R. Ledsam, Patricia MacWilliams, Pushmeet Kohli, Alan Karthikesalingam, Simon Kohl, Taylan Cemgil, S. M. Ali Eslami, Olaf Ronneberger
2020-07-10
- [Out-of-Distribution Detection]
Quite interesting and very well-written paper. They take the method from the Mahalanobis paper ("A Simple Unified Framework for Detecting Out-of-Distribution Samples and Adversarial Attacks") (however, they fit Gaussians only to the features at the second-to-last network layer, and they don't use the input pre-processing either) and consistently improve OOD detection performance by incorporating contrastive training. Specifically, they first train the network using just the SimCLR loss for a large number of epochs, and then also add the standard classification loss. I didn't quite get why the label smoothing is necessary, but according to Table 2 it's responsible for a large portion of the performance gain.
- A Simple Unified Framework for Detecting Out-of-Distribution Samples and Adversarial Attacks [pdf] [code] [annotated pdf]
- Kimin Lee, Kibok Lee, Honglak Lee, Jinwoo Shin
2018-07-10, NeurIPS 2018
- [Out-of-Distribution Detection]
Well-written and interesting paper. The proposed method is simple and really neat: fit class-conditional Gaussians in the feature space of a pre-trained classifier (basically just LDA on the feature vectors), and then use the Mahalanobis distance to these Gaussians as the confidence score for input x. They then also do this for the features at multiple levels of the network and combine these confidence scores into one. I don't quite get why the "input pre-processing" in Section 2.2 (adding noise to test samples) works, in Table 1 it significantly improves the performance.
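A minimal numpy sketch of the core idea as I understand it (a single feature layer only, no input pre-processing and no combination of scores across layers):

```python
import numpy as np

def fit_class_gaussians(features, labels, num_classes):
    """Class-conditional Gaussians with a shared (tied) covariance,
    i.e. essentially LDA on the feature vectors."""
    means = np.stack([features[labels == c].mean(axis=0) for c in range(num_classes)])
    centered = features - means[labels]
    cov = centered.T @ centered / len(features)
    return means, np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))

def mahalanobis_confidence(feature, means, cov_inv):
    """Confidence score = negative Mahalanobis distance to the closest class Gaussian."""
    diffs = means - feature  # (num_classes, feature_dim)
    dists = np.einsum('ij,jk,ik->i', diffs, cov_inv, diffs)
    return -dists.min()
```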
- Noise Contrastive Priors for Functional Uncertainty [pdf] [code] [annotated pdf]
- Danijar Hafner, Dustin Tran, Timothy Lillicrap, Alex Irpan, James Davidson
2018-07-24, UAI 2019
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Quite interesting and well-written paper. Only experiments on a toy 1D regression problem, and flight delay prediction in which the input is 8D. The approach of just adding noise to the input x to get OOD samples would probably not work very well e.g. for image-based problems?
- Does Your Dermatology Classifier Know What It Doesn't Know? Detecting the Long-Tail of Unseen Conditions [pdf] [annotated pdf]
- Abhijit Guha Roy, Jie Ren, Shekoofeh Azizi, Aaron Loh, Vivek Natarajan, Basil Mustafa, Nick Pawlowski, Jan Freyberg, Yuan Liu, Zach Beaver, Nam Vo, Peggy Bui, Samantha Winter, Patricia MacWilliams, Greg S. Corrado, Umesh Telang, Yun Liu, Taylan Cemgil, Alan Karthikesalingam, Balaji Lakshminarayanan, Jim Winkens
2021-04-08, Medical Image Analysis (January 2022)
- [Out-of-Distribution Detection] [Medical ML]
Well-written and interesting paper. Quite long, so it took a bit longer than usual to read it. Sections 1 and 2 give a great overview of OOD detection in general, and how it can be used specifically in this dermatology setting. I can definitely recommend reading Section 2 (Related work). They assume access to some outlier data during training, so their approach is similar to the "Outlier exposure" method (specifically in this dermatology setting, they say that this is a fair assumption). Their method is an improvement of the "reject bucket" approach (adding an extra class which is assigned to all outlier training data points): in their proposed method, they also use fine-grained classification of the outlier skin conditions. They then use an ensemble of 5 models, as well as a more diverse ensemble (in which they combine models trained with different representation learning techniques). This diverse ensemble obtains the best performance.
- Being a Bit Frequentist Improves Bayesian Neural Networks [pdf] [code] [annotated pdf]
- Agustinus Kristiadi, Matthias Hein, Philipp Hennig
2021-06-18, AISTATS 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Interesting and well-written paper. The proposed method makes intuitive sense, trying to incorporate the "OOD training" method (i.e., to use some kind of OOD data during training, similar to e.g. the "Deep Anomaly Detection with Outlier Exposure" paper) into the Bayesian deep learning approach. The experimental results do seem quite promising.
- Mixtures of Laplace Approximations for Improved Post-Hoc Uncertainty in Deep Learning [pdf] [code] [annotated pdf]
- Runa Eschenhagen, Erik Daxberger, Philipp Hennig, Agustinus Kristiadi
2021-11-05, NeurIPS Workshops 2021
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Well-written and interesting paper. Short paper of just 3 pages, but with an extensive appendix which I definitely recommend going through. The method, training an ensemble and then applying the Laplace approximation to each network, is very simple and intuitively makes a lot of sense. I didn't realize that this would have basically the same test-time speed as ensembling (since they utilize that probit approximation), that's very neat. It also seems to consistently outperform ensembling a bit across almost all tasks and metrics.
- Benchmarking Uncertainty Quantification on Biosignal Classification Tasks under Dataset Shift [pdf] [annotated pdf]
- Tong Xia, Jing Han, Cecilia Mascolo
2021-12-16, AAAI Workshops 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection] [Medical ML]
Well-written and interesting paper. They synthetically create dataset shifts (e.g. by adding Gaussian noise to the data) of increasing intensity and study whether or not the uncertainty increases as the accuracy degrades. They compare regular softmax, temperature scaling, MC-dropout, ensembling and a simple variational inference method. Their conclusion is basically that ensembling slightly outperforms the other methods, but that no method performs overly well. I think these types of studies are really useful.
- Deep Evidential Regression [pdf] [code] [annotated pdf]
- Alexander Amini, Wilko Schwarting, Ava Soleimany, Daniela Rus
2019-10-07, NeurIPS 2020
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Well-written and interesting paper. This is a good paper to read before "Natural Posterior Network: Deep Bayesian Predictive Uncertainty for Exponential Family Distributions". Their proposed method seems to have similar / slightly worse performance than a small ensemble, so the only real advantage is that it's faster at test-time? This is of course very important in many applications, but not in all. The performance also seems quite sensitive to the choice of lambda in the combined loss function (Equation (10)), according to Figure S2 in the appendix?
- On Out-of-distribution Detection with Energy-based Models [pdf] [code] [annotated pdf]
- Sven Elflein, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann
2021-07-03, ICML Workshops 2021
- [Out-of-Distribution Detection] [Energy-Based Models]
Well-written and quite interesting paper. A short paper, just 4 pages. They don't study the method from the "Energy-based Out-of-distribution Detection" paper as I had expected, but it was still a quite interesting read. The results in Section 4.2 seem interesting, especially for experiment 3, but I'm not sure that I properly understand everything.
- Natural Posterior Network: Deep Bayesian Predictive Uncertainty for Exponential Family Distributions [pdf] [annotated pdf]
- Bertrand Charpentier, Oliver Borchert, Daniel Zügner, Simon Geisler, Stephan Günnemann
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Interesting and well-written paper. I didn't quite understand all the details, I'll have to read a couple of related/background papers to be able to properly appreciate and evaluate the proposed method. I definitely feel like I would like to read up on this family of methods. Extensive experimental evaluation, and the results seem promising overall.
- Energy-based Out-of-distribution Detection [pdf] [code] [annotated pdf]
- Weitang Liu, Xiaoyun Wang, John D. Owens, Yixuan Li
2020-10-08, NeurIPS 2020
- [Out-of-Distribution Detection]
Interesting and well-written paper. The proposed method is quite clearly explained and makes intuitive sense (at least if you're familiar with EBMs). Compared to using the softmax score, the performance does seem to improve consistently. Seems like fine-tuning on an "auxiliary outlier dataset" is required to get really good performance though, which you can't really assume to have access to in real-world problems, I suppose?
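The energy score itself is essentially a one-liner on top of the classifier logits (this is just the basic score; the fine-tuning on auxiliary outliers is a separate loss term):

```python
import torch

def energy_score(logits, temperature=1.0):
    """Negative free energy: T * logsumexp(logits / T).
    Higher values indicate more in-distribution; thresholding this score
    replaces the max-softmax baseline."""
    return temperature * torch.logsumexp(logits / temperature, dim=-1)
```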
- VOS: Learning What You Don't Know by Virtual Outlier Synthesis [pdf] [code] [annotated pdf]
- Xuefeng Du, Zhaoning Wang, Mu Cai, Yixuan Li
2022-02-02, ICLR 2022
- [Out-of-Distribution Detection]
Interesting and quite well-written paper. I did find it somewhat difficult to understand certain parts though, they could perhaps be explained more clearly. The results seem quite impressive (they do consistently outperform all baselines), but I find it interesting that the "Gaussian noise" baseline in Table 2 performs that well? I should probably have read "Energy-based Out-of-distribution Detection" before reading this paper.
- Periodic Activation Functions Induce Stationarity [pdf] [code] [annotated pdf]
- Lassi Meronen, Martin Trapp, Arno Solin
2021-10-26, NeurIPS 2021
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Quite interesting and well-written paper. Quite a heavy read, probably need to be rather familiar with GPs to properly understand/appreciate everything. Definitely check Appendix D, it gives a better understanding of how the proposed method is applied in practice. I'm not quite sure how strong/impressive the experimental results actually are. Also seems like the method could be a bit inconvenient to implement/use?
- Reliable and Trustworthy Machine Learning for Health Using Dataset Shift Detection [pdf] [annotated pdf]
- Chunjong Park, Anas Awadalla, Tadayoshi Kohno, Shwetak Patel
2021-10-26, NeurIPS 2021
- [Out-of-Distribution Detection]
Interesting and very well-written paper. Gives a good overview of the field and contains a lot of seemingly useful references. The evaluation is very comprehensive. The user study is quite neat.
- On the Importance of Gradients for Detecting Distributional Shifts in the Wild [pdf] [code] [annotated pdf]
- Rui Huang, Andrew Geng, Yixuan Li
2021-10-01, NeurIPS 2021
- [Out-of-Distribution Detection]
Quite interesting and well-written paper. The experimental results do seem promising. However, I don't quite get why the proposed method intuitively makes sense, why is it better to only use the parameters of the final network layer?
- Hierarchical VAEs Know What They Don't Know [pdf] [code] [annotated pdf]
- Jakob D. Havtorn, Jes Frellsen, Søren Hauberg, Lars Maaløe
2021-02-16, ICML 2021
- [Uncertainty Estimation] [VAEs]
Very well-written and quite interesting paper, I enjoyed reading it. Everything is quite well-explained, it's relatively easy to follow. The paper provides a good overview of the out-of-distribution detection problem and current methods.
- Deep Learning Through the Lens of Example Difficulty [pdf] [annotated pdf]
- Robert John Nicholas Baldock, Hartmut Maennel, Behnam Neyshabur
2021-05-21, NeurIPS 2021
- [Theoretical Properties of Deep Learning]
Quite interesting and well-written paper. The definition of "prediction depth" in Section 2.1 makes sense, and it definitely seems reasonable that this could correlate with example difficulty / prediction confidence in some way. Section 3 and 4, and all the figures, contain a lot of info it seems, I'd probably need to read the paper again to properly understand/appreciate everything.
- An Information-theoretic Approach to Distribution Shifts [pdf] [code] [annotated pdf]
- Marco Federici, Ryota Tomioka, Patrick Forré
2021-06-07, NeurIPS 2021
- [Theoretical Properties of Deep Learning]
Quite well-written paper overall that seemed interesting, but I found it very difficult to properly understand everything. Thus, I can't really tell how interesting/significant their analysis actually is.
- Transferring Inductive Biases through Knowledge Distillation [pdf] [code] [annotated pdf]
- Samira Abnar, Mostafa Dehghani, Willem Zuidema
2020-05-31
- [Theoretical Properties of Deep Learning]
Quite well-written and somewhat interesting paper. I'm not very familiar with this area. I didn't spend too much time trying to properly evaluate the significance of the findings.
- Grokking: Generalization Beyond Overfitting on Small Algorithmic Datasets [pdf] [code] [annotated pdf]
- Alethea Power, Yuri Burda, Harri Edwards, Igor Babuschkin, Vedant Misra
2021-05, ICLR Workshops 2021
- [Theoretical Properties of Deep Learning]
Somewhat interesting paper. The phenomenon observed in Figure 1, that validation accuracy suddenly increases long after the training data has been fit almost perfectly, is quite interesting. I didn't quite understand the datasets they use (binary operation tables).
- Gradient Descent on Neural Networks Typically Occurs at the Edge of Stability [pdf] [pdf with comments]
- Jeremy M. Cohen, Simran Kaur, Yuanzhi Li, J. Zico Kolter, Ameet Talwalkar
2021-02-26, ICLR 2021
- [Theoretical Properties of Deep Learning]
- On the Origin of Implicit Regularization in Stochastic Gradient Descent [pdf] [pdf with comments]
- Samuel L. Smith, Benoit Dherin, David G. T. Barrett, Soham De
2021-01-28, ICLR 2021
- [Theoretical Properties of Deep Learning]
- Approximate Inference Turns Deep Networks into Gaussian Processes [pdf] [pdf with comments]
- Mohammad Emtiyaz Khan, Alexander Immer, Ehsan Abedi, Maciej Korzepa
2019-06-05, NeurIPS 2019
- [Theoretical Properties of Deep Learning]
- Implicit Gradient Regularization [pdf] [pdf with comments] [comments]
- David G.T. Barrett, Benoit Dherin
2020-09-23
- [Theoretical Properties of Deep Learning]
- Batch Normalization Biases Deep Residual Networks Towards Shallow Paths [pdf] [pdf with comments] [comments]
- Soham De, Samuel L. Smith
2020-02-24
- [Theoretical Properties of Deep Learning]
- A Primal-Dual link between GANs and Autoencoders [pdf] [poster] [pdf with comments] [comments]
- Hisham Husain, Richard Nock, Robert C. Williamson
2019-04-26, NeurIPS 2019
- [Theoretical Properties of Deep Learning]
- Multiplicative Interactions and Where to Find Them [pdf] [pdf with comments] [comments]
- Siddhant M. Jayakumar, Jacob Menick, Wojciech M. Czarnecki, Jonathan Schwarz, Jack Rae, Simon Osindero, Yee Whye Teh, Tim Harley, Razvan Pascanu
2019-09-25, ICLR 2020
- [Theoretical Properties of Deep Learning] [Sequence Modeling]
- Fine-Grained Analysis of Optimization and Generalization for Overparameterized Two-Layer Neural Networks [pdf] [pdf with comments] [comments]
- Sanjeev Arora, Simon S. Du, Wei Hu, Zhiyuan Li, Ruosong Wang
2019-01-24
- Visualizing the Loss Landscape of Neural Nets [pdf] [code] [pdf with comments] [comments]
- Hao Li, Zheng Xu, Gavin Taylor, Christoph Studer, Tom Goldstein
2017-12-28, NeurIPS2018
- How Does Batch Normalization Help Optimization? [pdf] [poster] [video] [pdf with comments] [summary]
- Shibani Santurkar, Dimitris Tsipras, Andrew Ilyas, Aleksander Madry
2018-10-27, NeurIPS2018
- The Lottery Ticket Hypothesis: Finding Small, Trainable Neural Networks [pdf] [pdf with comments] [summary]
- Jonathan Frankle, Michael Carbin
2018-03-09, ICLR2019
- Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes [pdf] [pdf with comments] [summary]
- Roman Novak, Lechao Xiao, Jaehoon Lee, Yasaman Bahri, Daniel A. Abolafia, Jeffrey Pennington, Jascha Sohl-Dickstein
2018-10-11, ICLR2019
- Gaussian Process Behaviour in Wide Deep Neural Networks [pdf] [pdf with comments] [summary]
- Alexander G. de G. Matthews, Mark Rowland, Jiri Hron, Richard E. Turner, Zoubin Ghahramani
2018-08-16, ICLR2018
- Hierarchical VAEs Know What They Don't Know [pdf] [code] [annotated pdf]
- Jakob D. Havtorn, Jes Frellsen, Søren Hauberg, Lars Maaløe
2021-02-16, ICML 2021
- [Uncertainty Estimation] [VAEs]
Very well-written and quite interesting paper, I enjoyed reading it. Everything is quite well-explained, it's relatively easy to follow. The paper provides a good overview of the out-of-distribution detection problem and current methods.
- Very Deep VAEs Generalize Autoregressive Models and Can Outperform Them on Images [pdf] [code] [pdf with comments]
- Rewon Child
2020-11-20, ICLR 2021
- [VAEs]
- VAEBM: A Symbiosis between Variational Autoencoders and Energy-based Models [pdf] [pdf with comments]
- Zhisheng Xiao, Karsten Kreis, Jan Kautz, Arash Vahdat
2020-10-01, ICLR 2021
- [Energy-Based Models] [VAEs]
- Joint Training of Variational Auto-Encoder and Latent Energy-Based Model [pdf] [code] [pdf with comments] [comments]
- Tian Han, Erik Nijkamp, Linqi Zhou, Bo Pang, Song-Chun Zhu, Ying Nian Wu
2020-06-10, CVPR 2020
- [VAEs] [Energy-Based Models]
- A Contrastive Divergence for Combining Variational Inference and MCMC [pdf] [code] [slides] [pdf with comments] [comments]
- Francisco J. R. Ruiz, Michalis K. Titsias
2019-05-10, ICML 2019
- [VAEs]
- Z-Forcing: Training Stochastic Recurrent Networks [pdf] [code] [pdf with comments] [comments]
- Anirudh Goyal, Alessandro Sordoni, Marc-Alexandre Côté, Nan Rosemary Ke, Yoshua Bengio
2017-11-15, NeurIPS 2017
- [VAEs] [Sequence Modeling]
- Deep Latent Variable Models for Sequential Data [pdf] [pdf with comments] [comments]
- Marco Fraccaro
2018-04-13, PhD Thesis
- Auto-Encoding Variational Bayes [pdf] [pdf with comments] [comments]
- Diederik P Kingma, Max Welling
2014-05-01, ICLR2014
- Coupled Variational Bayes via Optimization Embedding [pdf] [poster] [code] [pdf with comments] [comments]
- Bo Dai, Hanjun Dai, Niao He, Weiyang Liu, Zhen Liu, Jianshu Chen, Lin Xiao, Le Song
NeurIPS2018
- Normalizing Flows: An Introduction and Review of Current Methods [pdf] [pdf with comments] [comments]
- Ivan Kobyzev, Simon Prince, Marcus A. Brubaker
2019-08-25
- [Normalizing Flows]
- Flow Contrastive Estimation of Energy-Based Models [pdf] [pdf with comments] [comments]
- Ruiqi Gao, Erik Nijkamp, Diederik P. Kingma, Zhen Xu, Andrew M. Dai, Ying Nian Wu
2019-12-02, CVPR 2020
- [Energy-Based Models] [Normalizing Flows]
- Deep Latent Variable Models for Sequential Data [pdf] [pdf with comments] [comments]
- Marco Fraccaro
2018-04-13, PhD Thesis
- Improving Variational Inference with Inverse Autoregressive Flow [pdf] [code] [pdf with comments] [comments]
- Diederik P. Kingma, Tim Salimans, Rafal Jozefowicz, Xi Chen, Ilya Sutskever, Max Welling
2016-06-15, NeurIPS2016
- Variational Inference with Normalizing Flows [pdf] [pdf with comments] [comments]
- Danilo Jimenez Rezende, Shakir Mohamed
2015-05-21, ICML2015
- Neural Autoregressive Flows [pdf] [pdf with comments] [summary]
- Chin-Wei Huang, David Krueger, Alexandre Lacoste, Aaron Courville
2018-04-03, ICML2018
- Part-A^2 Net: 3D Part-Aware and Aggregation Neural Network for Object Detection from Point Cloud [pdf] [pdf with comments] [comments]
- Shaoshuai Shi, Zhe Wang, Xiaogang Wang, Hongsheng Li
2019-07-08
- PointRCNN: 3D Object Proposal Generation and Detection from Point Cloud [pdf] [code] [pdf with comments] [comments]
- Shaoshuai Shi, Xiaogang Wang, Hongsheng Li
2018-12-11, CVPR2019
- Objects as Points [pdf] [code] [pdf with comments] [comments]
- Xingyi Zhou, Dequan Wang, Philipp Krähenbühl
2019-04-16
- LaserNet: An Efficient Probabilistic 3D Object Detector for Autonomous Driving [pdf] [pdf with comments] [comments]
- Gregory P. Meyer, Ankit Laddha, Eric Kee, Carlos Vallespi-Gonzalez, Carl K. Wellington
2019-03-20, CVPR2019
- Relaxed Softmax: Efficient Confidence Auto-Calibration for Safe Pedestrian Detection [pdf] [poster] [pdf with comments] [summary]
- Lukas Neumann, Andrew Zisserman, Andrea Vedaldi
2018-11-29, NeurIPS2018 Workshop
- Evaluating Bayesian Deep Learning Methods for Semantic Segmentation [pdf] [pdf with comments] [summary]
- Jishnu Mukhoti, Yarin Gal
2018-11-30
- A Probabilistic U-Net for Segmentation of Ambiguous Images [pdf] [code] [pdf with comments] [summary]
- Simon A. A. Kohl, Bernardino Romera-Paredes, Clemens Meyer, Jeffrey De Fauw, Joseph R. Ledsam, Klaus H. Maier-Hein, S. M. Ali Eslami, Danilo Jimenez Rezende, Olaf Ronneberger
2018-10-29, NeurIPS2018
- Uncertainty Estimates and Multi-Hypotheses Networks for Optical Flow [pdf] [pdf with comments] [summary]
- Eddy Ilg, Özgün Çiçek, Silvio Galesso, Aaron Klein, Osama Makansi, Frank Hutter, Thomas Brox
2018-08-06, ECCV2018
- Towards Safe Autonomous Driving: Capture Uncertainty in the Deep Neural Network For Lidar 3D Vehicle Detection [pdf] [pdf with comments] [summary]
- Di Feng, Lars Rosenbaum, Klaus Dietmayer
2018-09-08, ITSC2018
- VoxelNet: End-to-End Learning for Point Cloud Based 3D Object Detection [pdf] [pdf with comments] [summary]
- Yin Zhou, Oncel Tuzel
2017-11-17, CVPR2018
- PIXOR: Real-time 3D Object Detection from Point Clouds [pdf] [pdf with comments] [summary]
- Bin Yang, Wenjie Luo, Raquel Urtasun
CVPR2018
- Leveraging Heteroscedastic Aleatoric Uncertainties for Robust Real-Time LiDAR 3D Object Detection [pdf] [pdf with comments] [summary]
- Di Feng, Lars Rosenbaum, Fabian Timm, Klaus Dietmayer
2018-09-14
- What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision? [pdf] [pdf with comments] [summary]
- Alex Kendall, Yarin Gal
2017-10-05, NeurIPS2017
- A Deep Bayesian Neural Network for Cardiac Arrhythmia Classification with Rejection from ECG Recordings [pdf] [code] [annotated pdf]
- Wenrui Zhang, Xinxin Di, Guodong Wei, Shijia Geng, Zhaoji Fu, Shenda Hong
2022-02-26
- [Uncertainty Estimation] [Medical ML]
Somewhat interesting paper. They use a softmax model with MC-dropout to compute uncertainty estimates. The evaluation is not very extensive, they mostly just check that the classification accuracy improves as they reject more and more samples based on an uncertainty threshold.
- Confidence-based Out-of-Distribution Detection: A Comparative Study and Analysis [pdf] [code] [annotated pdf]
- Christoph Berger, Magdalini Paschali, Ben Glocker, Konstantinos Kamnitsas
2021-07-06, MICCAI Workshops 2021
- [Out-of-Distribution Detection] [Medical ML]
Interesting and well-written paper. Interesting that Mahalanobis works very well on the CIFAR10 vs SVHN but not on the medical imaging dataset. I don't quite get how/why the ODIN method works, I'll probably have to read that paper.
- UncertaINR: Uncertainty Quantification of End-to-End Implicit Neural Representations for Computed Tomography [pdf] [annotated pdf]
- Francisca Vasconcelos, Bobby He, Nalini Singh, Yee Whye Teh
2022-02-22
- [Implicit Neural Representations] [Uncertainty Estimation] [Medical ML]
Interesting and well-written paper. I wasn't very familiar with CT image reconstruction, but they do a good job explaining everything. Interesting that MC-dropout seems important for getting well-calibrated predictions.
- Can You Trust Predictive Uncertainty Under Real Dataset Shifts in Digital Pathology? [pdf] [annotated pdf]
- Jeppe Thagaard, Søren Hauberg, Bert van der Vegt, Thomas Ebstrup, Johan D. Hansen, Anders B. Dahl
2020-09, MICCAI 2020
- [Uncertainty Estimation] [Out-of-Distribution Detection] [Medical ML]
Quite interesting and well-written paper. They compare MC-dropout, ensembling and mixup (and with a standard softmax classifier as the baseline). Nothing groundbreaking, but the studied application (classification of pathology slides for cancer) is very interesting. The FPR95 metrics for OOD detection in Table 4 are terrible for ensembling, but the classification accuracy (89.7) is also pretty much the same as for D_test_int in Table 3 (90.1)? So, it doesn't really matter that the model isn't capable of distinguishing this "OOD" data from in-distribution?
- Does Your Dermatology Classifier Know What It Doesn't Know? Detecting the Long-Tail of Unseen Conditions [pdf] [annotated pdf]
- Abhijit Guha Roy, Jie Ren, Shekoofeh Azizi, Aaron Loh, Vivek Natarajan, Basil Mustafa, Nick Pawlowski, Jan Freyberg, Yuan Liu, Zach Beaver, Nam Vo, Peggy Bui, Samantha Winter, Patricia MacWilliams, Greg S. Corrado, Umesh Telang, Yun Liu, Taylan Cemgil, Alan Karthikesalingam, Balaji Lakshminarayanan, Jim Winkens
2021-04-08, Medical Image Analysis (January 2022)
- [Out-of-Distribution Detection] [Medical ML]
Well-written and interesting paper. Quite long, so it took a bit longer than usual to read it. Sections 1 and 2 give a great overview of OOD detection in general, and how it can be used specifically in this dermatology setting. I can definitely recommend reading Section 2 (Related work). They assume access to some outlier data during training, so their approach is similar to the "Outlier exposure" method (specifically in this dermatology setting, they say that this is a fair assumption). Their method is an improvement of the "reject bucket" approach (adding an extra class which is assigned to all outlier training data points): in their proposed method, they also use fine-grained classification of the outlier skin conditions. They then use an ensemble of 5 models, as well as a more diverse ensemble (in which they combine models trained with different representation learning techniques). This diverse ensemble obtains the best performance.
- Benchmarking Uncertainty Quantification on Biosignal Classification Tasks under Dataset Shift [pdf] [annotated pdf]
- Tong Xia, Jing Han, Cecilia Mascolo
2021-12-16, AAAI Workshops 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection] [Medical ML]
Well-written and interesting paper. They synthetically create dataset shifts (e.g. by adding Gaussian noise to the data) of increasing intensity and study whether or not the uncertainty increases as the accuracy degrades. They compare regular softmax, temperature scaling, MC-dropout, ensembling and a simple variational inference method. Their conclusion is basically that ensembling slightly outperforms the other methods, but that no method performs overly well. I think these types of studies are really useful.
- Reliable and Trustworthy Machine Learning for Health Using Dataset Shift Detection [pdf] [annotated pdf]
- Chunjong Park, Anas Awadalla, Tadayoshi Kohno, Shwetak Patel
2021-10-26, NeurIPS 2021
- [Out-of-Distribution Detection] [Medical ML]
Interesting and very well-written paper. Gives a good overview of the field and contains a lot of seemingly useful references. The evaluation is very comprehensive. The user study is quite neat.
- A Probabilistic U-Net for Segmentation of Ambiguous Images [pdf] [code] [pdf with comments] [summary]
- Simon A. A. Kohl, Bernardino Romera-Paredes, Clemens Meyer, Jeffrey De Fauw, Joseph R. Ledsam, Klaus H. Maier-Hein, S. M. Ali Eslami, Danilo Jimenez Rezende, Olaf Ronneberger
2018-10-29, NeurIPS2018
- End-to-End Object Detection with Transformers [pdf] [code] [pdf with comments] [comments]
- Nicolas Carion, Francisco Massa, Gabriel Synnaeve, Nicolas Usunier, Alexander Kirillov, Sergey Zagoruyko
2020-05-26, ECCV 2020
- [Object Detection]
- Objects as Points [pdf] [code] [pdf with comments] [comments]
- Xingyi Zhou, Dequan Wang, Philipp Krähenbühl
2019-04-16
- Acquisition of Localization Confidence for Accurate Object Detection [pdf] [code] [oral presentation] [pdf with comments] [comments]
- Borui Jiang, Ruixuan Luo, Jiayuan Mao, Tete Xiao, Yuning Jiang
2018-07-30, ECCV2018
- Part-A^2 Net: 3D Part-Aware and Aggregation Neural Network for Object Detection from Point Cloud [pdf] [pdf with comments] [comments]
- Shaoshuai Shi, Zhe Wang, Xiaogang Wang, Hongsheng Li
2019-07-08
- PointRCNN: 3D Object Proposal Generation and Detection from Point Cloud [pdf] [code] [pdf with comments] [comments]
- Shaoshuai Shi, Xiaogang Wang, Hongsheng Li
2018-12-11, CVPR2019
- Objects as Points [pdf] [code] [pdf with comments] [comments]
- Xingyi Zhou, Dequan Wang, Philipp Krähenbühl
2019-04-16
- LaserNet: An Efficient Probabilistic 3D Object Detector for Autonomous Driving [pdf] [pdf with comments] [comments]
- Gregory P. Meyer, Ankit Laddha, Eric Kee, Carlos Vallespi-Gonzalez, Carl K. Wellington
2019-03-20, CVPR2019
- Towards Safe Autonomous Driving: Capture Uncertainty in the Deep Neural Network For Lidar 3D Vehicle Detection [pdf] [pdf with comments] [summary]
- Di Feng, Lars Rosenbaum, Klaus Dietmayer
2018-09-08, ITSC2018
- VoxelNet: End-to-End Learning for Point Cloud Based 3D Object Detection [pdf] [pdf with comments] [summary]
- Yin Zhou, Oncel Tuzel
2017-11-17, CVPR2018
- PIXOR: Real-time 3D Object Detection from Point Clouds [pdf] [pdf with comments] [summary]
- Bin Yang, Wenjie Luo, Raquel Urtasun
CVPR2018
- Leveraging Heteroscedastic Aleatoric Uncertainties for Robust Real-Time LiDAR 3D Object Detection [pdf] [pdf with comments] [summary]
- Di Feng, Lars Rosenbaum, Fabian Timm, Klaus Dietmayer
2018-09-14
- Probabilistic 3D Multi-Object Tracking for Autonomous Driving [pdf] [code] [pdf with comments] [comments]
- Hsu-kuang Chiu, Antonio Prioletti, Jie Li, Jeannette Bohg
2020-01-16
- [3D Multi-Object Tracking]
- A Baseline for 3D Multi-Object Tracking [pdf] [code] [pdf with comments] [comments]
- Xinshuo Weng, Kris Kitani
2019-07-09
- [3D Multi-Object Tracking]
- Probabilistic 3D Human Shape and Pose Estimation from Multiple Unconstrained Images in the Wild [pdf] [annotated pdf]
- Akash Sengupta, Ignas Budvytis, Roberto Cipolla
2021-03-19, CVPR 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. I read it mainly as background for "Hierarchical Kinematic Probability Distributions for 3D Human Shape and Pose Estimation from Images in the Wild" which is written by exactly the same authors. In this paper, they predict a single Gaussian distribution for the pose (instead of hierarchical matrix-Fisher distributions). Also, they mainly focus on the body shape. They also use silhouettes + 2D keypoint heatmaps as input (instead of edge-filters + 2D keypoint heatmaps).
- Synthetic Training for Accurate 3D Human Pose and Shape Estimation in the Wild [pdf] [code] [annotated pdf]
- Akash Sengupta, Ignas Budvytis, Roberto Cipolla
2020-09-21, BMVC 2020
- [3D Human Pose Estimation]
Well-written and fairly interesting paper. I read it mainly as background for "Hierarchical Kinematic Probability Distributions for 3D Human Shape and Pose Estimation from Images in the Wild", which is written by exactly the same authors. In this paper, they just use direct regression. They also use silhouettes + 2D keypoint heatmaps as input (instead of edge-filters + 2D keypoint heatmaps).
- Learning Motion Priors for 4D Human Body Capture in 3D Scenes [pdf] [code] [annotated pdf]
- Siwei Zhang, Yan Zhang, Federica Bogo, Marc Pollefeys, Siyu Tang
2021-08-23, ICCV 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. I didn't fully understand everything though, and it feels like I probably don't know this specific setting/problem well enough to fully appreciate the paper.
- Hierarchical Kinematic Probability Distributions for 3D Human Shape and Pose Estimation from Images in the Wild [pdf] [code] [annotated pdf]
- Akash Sengupta, Ignas Budvytis, Roberto Cipolla
2021-10-03, ICCV 2021
- [3D Human Pose Estimation]
Well-written and very interesting paper, I enjoyed reading it. The hierarchical distribution prediction approach makes sense and consistently outperforms the independent baseline. Using matrix-Fisher distributions makes sense. The synthetic training framework and the input representation of edge-filters + 2D keypoint heatmaps are both interesting.
- We are More than Our Joints: Predicting how 3D Bodies Move [pdf] [code] [annotated pdf]
- Yan Zhang, Michael J. Black, Siyu Tang
2020-12-01, CVPR 2021
- [3D Human Pose Estimation]
Well-written and fairly interesting paper. The marker-based representation, instead of using skeleton joints, makes sense. The recursive projection scheme also makes sense, but seems very slow (2.27 sec/frame)? I didn't quite get all the details for their DCT representation of the latent space.
- imGHUM: Implicit Generative Models of 3D Human Shape and Articulated Pose [pdf] [code] [annotated pdf]
- Thiemo Alldieck, Hongyi Xu, Cristian Sminchisescu
2021-08-24, ICCV 2021
- [3D Human Pose Estimation] [Implicit Neural Representations]
Interesting and very well-written paper, I really enjoyed reading it. Interesting combination of implicit representations and 3D human modelling. The "inclusive human modelling" application is neat and important.
- Contextually Plausible and Diverse 3D Human Motion Prediction [pdf] [annotated pdf]
- Sadegh Aliakbarian, Fatemeh Sadat Saleh, Lars Petersson, Stephen Gould, Mathieu Salzmann
2019-12-18, ICCV 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. The main idea, using a learned conditional prior p(z|c) instead of just p(z), makes sense and was shown beneficial also in "HuMoR: 3D Human Motion Model for Robust Pose Estimation". I'm however somewhat confused by their specific implementation in Section 4, doesn't seem like a standard cVAE implementation?
- Encoder-decoder with Multi-level Attention for 3D Human Shape and Pose Estimation [pdf] [code] [annotated pdf]
- Ziniu Wan, Zhengjia Li, Maoqing Tian, Jianbo Liu, Shuai Yi, Hongsheng Li
2021-09-06, ICCV 2021
- [3D Human Pose Estimation]
Well-written and fairly interesting paper. Quite a lot of details on the attention architecture, which I personally don't find overly interesting. The experimental results are quite impressive, but I would like to see a comparison in terms of computational cost at test-time. It sounds like their method is rather slow.
- Physics-based Human Motion Estimation and Synthesis from Videos [pdf] [annotated pdf]
- Kevin Xie, Tingwu Wang, Umar Iqbal, Yunrong Guo, Sanja Fidler, Florian Shkurti
2021-09-21, ICCV 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. The general idea, refining frame-by-frame pose estimates via physical constraints, intuitively makes a lot of sense. I did however find it quite difficult to understand all the details in Section 3.
- Human Pose Regression with Residual Log-likelihood Estimation [pdf] [code] [annotated pdf]
- Jiefeng Li, Siyuan Bian, Ailing Zeng, Can Wang, Bo Pang, Wentao Liu, Cewu Lu
2021-07-23, ICCV 2021
- [3D Human Pose Estimation]
Quite interesting paper, but also quite strange/confusing. I don't think the proposed method is explained particularly well, at least I found it quite difficult to properly understand what they actually are doing.
In the end it seems like they are learning a global loss function that is very similar to doing probabilistic regression with a Gauss/Laplace model of p(y|x) (with learned mean and variance)? See Figure 4 in the Appendix.
And while it's true that their performance is much better than for direct regression with an L2/L1 loss (see e.g. Table 1), they only compare with Gauss/Laplace probabilistic regression once (Table 7) and in that case the Laplace model is actually quite competitive?
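To make the comparison concrete, this is what I mean by Gauss probabilistic regression with learned mean and variance (a minimal sketch, not their method): the network outputs both mu(x) and log sigma^2(x) and is trained with the Gaussian negative log-likelihood (the Laplace version would instead predict a location and a scale b and use |y - mu| / b + log b).
```python
import torch
import torch.nn as nn

class GaussianRegressor(nn.Module):
    # Predicts a mean and a log-variance for each input x.
    def __init__(self, in_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2))

    def forward(self, x):
        out = self.net(x)
        return out[:, 0], out[:, 1]  # mu(x), log sigma^2(x)

def gaussian_nll(mu, log_var, y):
    # Negative log-likelihood of y under N(mu, exp(log_var)), up to a constant.
    return 0.5 * (log_var + (y - mu) ** 2 / log_var.exp()).mean()

# Dummy training step on random data, just to show the loss in use.
x, y = torch.randn(32, 8), torch.randn(32)
model = GaussianRegressor(8)
loss = gaussian_nll(*model(x), y)
loss.backward()
```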
- Character Controllers Using Motion VAEs [pdf] [code] [annotated pdf]
- Hung Yu Ling, Fabio Zinno, George Cheng, Michiel van de Panne
2021-03-26, SIGGRAPH 2020
- [3D Human Pose Estimation]
- Generating Multiple Hypotheses for 3D Human Pose Estimation with Mixture Density Network [pdf] [code] [annotated pdf]
- Chen Li, Gim Hee Lee
2019-04-11, CVPR 2019
- [3D Human Pose Estimation]
- Expressive Body Capture: 3D Hands, Face, and Body from a Single Image [pdf] [code] [annotated pdf]
- Georgios Pavlakos, Vasileios Choutas, Nima Ghorbani, Timo Bolkart, Ahmed A. A. Osman, Dimitrios Tzionas, Michael J. Black
2019-04-11, CVPR 2019
- [3D Human Pose Estimation]
Very well-written and quite interesting paper. Gives a good understanding of the SMPL model and the SMPLify method.
- Keep it SMPL: Automatic Estimation of 3D Human Pose and Shape from a Single Image [pdf] [annotated pdf]
- Federica Bogo, Angjoo Kanazawa, Christoph Lassner, Peter Gehler, Javier Romero, Michael J. Black
2016-07-27, ECCV 2016
- [3D Human Pose Estimation]
- Beyond Static Features for Temporally Consistent 3D Human Pose and Shape from a Video [pdf] [code] [annotated pdf]
- Hongsuk Choi, Gyeongsik Moon, Ju Yong Chang, Kyoung Mu Lee
2020-11-17, CVPR 2021
- [3D Human Pose Estimation]
- Exemplar Fine-Tuning for 3D Human Model Fitting Towards In-the-Wild 3D Human Pose Estimation [pdf] [code] [annotated pdf]
- Hanbyul Joo, Natalia Neverova, Andrea Vedaldi
2020-04-07
- [3D Human Pose Estimation]
- Learning to Reconstruct 3D Human Pose and Shape via Model-fitting in the Loop [pdf] [code] [annotated pdf]
- Nikos Kolotouros, Georgios Pavlakos, Michael J. Black, Kostas Daniilidis
2019-09-27, ICCV 2019
- [3D Human Pose Estimation]
- A simple yet effective baseline for 3d human pose estimation [pdf] [code] [annotated pdf]
- Julieta Martinez, Rayat Hossain, Javier Romero, James J. Little
2017-05-08, ICCV 2017
- [3D Human Pose Estimation]
- Estimating Egocentric 3D Human Pose in Global Space [pdf] [annotated pdf]
- Jian Wang, Lingjie Liu, Weipeng Xu, Kripasindhu Sarkar, Christian Theobalt
2021-04-27, ICCV 2021
- [3D Human Pose Estimation]
- End-to-end Recovery of Human Shape and Pose [pdf] [code] [annotated pdf]
- Angjoo Kanazawa, Michael J. Black, David W. Jacobs, Jitendra Malik
2017-12-18, CVPR 2018
- [3D Human Pose Estimation]
- 3D Multi-bodies: Fitting Sets of Plausible 3D Human Models to Ambiguous Image Data [pdf] [annotated pdf]
- Benjamin Biggs, Sébastien Ehrhadt, Hanbyul Joo, Benjamin Graham, Andrea Vedaldi, David Novotny
2020-11-02, NeurIPS 2020
- [3D Human Pose Estimation]
- HuMoR: 3D Human Motion Model for Robust Pose Estimation [pdf] [code] [annotated pdf]
- Davis Rempe, Tolga Birdal, Aaron Hertzmann, Jimei Yang, Srinath Sridhar, Leonidas J. Guibas
2021-05-10, ICCV 2021
- [3D Human Pose Estimation]
- ATOM: Accurate Tracking by Overlap Maximization [pdf] [code] [pdf with comments] [comments]
- Martin Danelljan, Goutam Bhat, Fahad Shahbaz Khan, Michael Felsberg
2018-11-19, CVPR2019
- Efficiently Modeling Long Sequences with Structured State Spaces [pdf] [code] [annotated pdf]
- Albert Gu, Karan Goel, Christopher Ré
2021-10-31, ICLR 2022
- [Sequence Modeling]
Very interesting and quite well-written paper. Kind of neat/fun to see state-space models being used. The experimental results seem very impressive!? I didn't fully understand everything in Section 3. I had to read Section 3.4 a couple of times to understand how the parameterization actually works in practice: you have H state-space models, one for each feature dimension, so that a sequence of feature vectors is mapped to another sequence of feature vectors, and you can then also stack multiple such layers of state-space models, mapping sequence --> sequence --> sequence --> .... (I wrote down a small conceptual sketch below.)
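A conceptual sketch of how I understood the parameterization (H independent linear state-space models, one per feature dimension, stacked into layers with a position-wise mixing in between). This is just a naive recurrent toy implementation in numpy, NOT the actual S4 parameterization (which uses a specially structured A matrix, a learned discretization and a convolutional view of the recurrence):
```python
import numpy as np

rng = np.random.default_rng(0)

def ssm_layer(u, A, B, C):
    # u: (L, H) input sequence with H feature dimensions.
    # A: (H, N, N), B: (H, N), C: (H, N) -- one small linear SSM per feature dim.
    L, H = u.shape
    N = A.shape[1]
    x = np.zeros((H, N))
    y = np.zeros((L, H))
    for t in range(L):
        for h in range(H):
            x[h] = A[h] @ x[h] + B[h] * u[t, h]  # state update, independently per feature dim
            y[t, h] = C[h] @ x[h]                # scalar output per feature dim
    return y

L, H, N = 100, 4, 8
u = rng.normal(size=(L, H))
layers = [(0.9 * np.stack([np.eye(N)] * H),      # stable A matrices (placeholder)
           rng.normal(size=(H, N)),              # B
           rng.normal(size=(H, N)),              # C
           rng.normal(size=(H, H)) / np.sqrt(H)) # position-wise mixing between layers
          for _ in range(3)]

# Stack several such layers: sequence --> sequence --> sequence --> ...
h = u
for A, B, C, W in layers:
    h = np.tanh(ssm_layer(h, A, B, C) @ W)
print(h.shape)  # (L, H)
```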
- Multiplicative Interactions and Where to Find Them [pdf] [pdf with comments] [comments]
- Siddhant M. Jayakumar, Jacob Menick, Wojciech M. Czarnecki, Jonathan Schwarz, Jack Rae, Simon Osindero, Yee Whye Teh, Tim Harley, Razvan Pascanu
2019-09-25, ICLR 2020
- [Theoretical Properties of Deep Learning] [Sequence Modeling]
- Z-Forcing: Training Stochastic Recurrent Networks [pdf] [code] [pdf with comments] [comments]
- Anirudh Goyal, Alessandro Sordoni, Marc-Alexandre Côté, Nan Rosemary Ke, Yoshua Bengio
2017-11-15, NeurIPS 2017
- [VAEs] [Sequence Modeling]
- Deep Latent Variable Models for Sequential Data [pdf] [pdf with comments] [comments]
- Marco Fraccaro
2018-04-13, PhD Thesis
- Trellis Networks for Sequence Modeling [pdf] [code] [pdf with comments] [comments]
- Shaojie Bai, J. Zico Kolter, Vladlen Koltun
2018-10-15, ICLR2019
- Attention Is All You Need [pdf] [pdf with comments] [comments]
- Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin
2017-06-12, NeurIPS2017
- A recurrent neural network without chaos [pdf] [pdf with comments] [comments]
- Thomas Laurent, James von Brecht
2016-12-19, ICLR2017
- An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling [pdf] [code] [pdf with comments] [summary]
- Shaojie Bai, J. Zico Kolter, Vladlen Koltun
2018-04-19
- When Recurrent Models Don't Need To Be Recurrent (a.k.a. Stable Recurrent Models) [pdf] [pdf with comments] [summary]
- John Miller, Moritz Hardt
2018-05-29, ICLR2019
- Pessimistic Bootstrapping for Uncertainty-Driven Offline Reinforcement Learning [pdf] [annotated pdf]
- Chenjia Bai, Lingxiao Wang, Zhuoran Yang, Zhi-Hong Deng, Animesh Garg, Peng Liu, Zhaoran Wang
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Reinforcement Learning]
Well-written and somewhat interesting paper. I'm not overly familiar with RL, which makes it a bit difficult for me to properly evaluate the paper's contributions. They use standard ensembles for uncertainty estimation combined with an OOD sampling regularization. I thought that the OOD sampling could be interesting, but it seems very specific to RL. I'm sure this paper is quite interesting for people doing RL, but I don't think it's overly useful for me.
- Sample Efficient Deep Reinforcement Learning via Uncertainty Estimation [pdf] [annotated pdf]
- Vincent Mai, Kaustubh Mani, Liam Paull
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Reinforcement Learning]
Well-written and somewhat interesting paper. I'm not overly familiar with reinforcement learning, which makes it a bit difficult for me to properly evaluate the paper's contributions, but to me it seems like fairly straightforward method modifications? Using ensembles of Gaussian models (instead of ensembles of models trained using the L2 loss) makes sense. I didn't quite get the BIV method, it seems rather ad hoc? I also don't quite get exactly how it's used in equation (10), is the ensemble of Gaussian models trained _jointly_ using this loss? I don't really know if this could be useful outside of RL.
- Q-Learning in enormous action spaces via amortized approximate maximization [pdf] [annotated pdf]
- Tom Van de Wiele, David Warde-Farley, Andriy Mnih, Volodymyr Mnih
2020-01-22
- [Reinforcement Learning]
- Decomposition of Uncertainty in Bayesian Deep Learning for Efficient and Risk-sensitive Learning [pdf] [pdf with comments] [comments]
- Stefan Depeweg, José Miguel Hernández-Lobato, Finale Doshi-Velez, Steffen Udluft
2017-10-19, ICML 2018
- [Uncertainty Estimation] [Reinforcement Learning]
- Uncertainty Decomposition in Bayesian Neural Networks with Latent Variables [pdf] [pdf with comments] [comments]
- Stefan Depeweg, José Miguel Hernández-Lobato, Finale Doshi-Velez, Steffen Udluft
2017-06-26
- [Uncertainty Estimation] [Reinforcement Learning]
- Dream to Control: Learning Behaviors by Latent Imagination [pdf] [webpage] [pdf with comments] [comments]
- Anonymous
2019-09
- Learning Latent Dynamics for Planning from Pixels [pdf] [code] [blog] [pdf with comments] [comments]
- Danijar Hafner, Timothy Lillicrap, Ian Fischer, Ruben Villegas, David Ha, Honglak Lee, James Davidson
2018-11-12, ICML2019
- Deep Reinforcement Learning in a Handful of Trials using Probabilistic Dynamics Models [pdf] [poster] [video] [code] [pdf with comments] [summary]
- Kurtland Chua, Roberto Calandra, Rowan McAllister, Sergey Levine
2018-05-30, NeurIPS2018
- Deep Latent Variable Models for Sequential Data [pdf] [pdf with comments] [comments]
- Marco Fraccaro
2018-04-13, PhD Thesis
- Learning nonlinear state-space models using deep autoencoders [pdf] [pdf with comments] [comments]
- Daniele Masti, Alberto Bemporad
2018, CDC2018
- On Out-of-distribution Detection with Energy-based Models [pdf] [code] [annotated pdf]
- Sven Elflein, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann
2021-07-03, ICML Workshops 2021
- [Out-of-Distribution Detection] [Energy-Based Models]
Well-written and quite interesting paper. A short paper, just 4 pages. They don't study the method from the "Energy-based Out-of-distribution Detection" paper as I had expected, but it was still a quite interesting read. The results in Section 4.2 seem interesting, especially for experiment 3, but I'm not sure that I properly understand everything.
- Energy-based Out-of-distribution Detection [pdf] [code] [annotated pdf]
- Weitang Liu, Xiaoyun Wang, John D. Owens, Yixuan Li
2020-10-08, NeurIPS 2020
- [Out-of-Distribution Detection] [Energy-Based Models]
Interesting and well-written paper. The proposed method is quite clearly explained and makes intuitive sense (at least if you're familiar with EBMs). Compared to using the softmax score, the performance does seem to improve consistently. Seems like fine-tuning on an "auxiliary outlier dataset" is required to get really good performance though, which you can't really assume to have access to in real-world problems, I suppose?
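For my own reference, a minimal sketch of the energy score as I understand it from the paper: the energy is a (negative, temperature-scaled) logsumexp over the logits, and its negation is used as the confidence score instead of the maximum softmax probability; the logits below are just dummy numbers:
```python
import numpy as np

def negative_energy_score(logits, T=1.0):
    # E(x) = -T * logsumexp(logits / T); lower energy <=> more in-distribution,
    # so the negative energy (returned here) plays the role of a confidence score.
    m = np.max(logits / T, axis=1, keepdims=True)
    return T * (m.squeeze(1) + np.log(np.sum(np.exp(logits / T - m), axis=1)))

def max_softmax_score(logits):
    # The usual "maximum softmax probability" baseline, for comparison.
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return (e / e.sum(axis=1, keepdims=True)).max(axis=1)

logits = np.random.default_rng(0).normal(size=(5, 10))  # dummy logits
print(negative_energy_score(logits))
print(max_softmax_score(logits))
```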
- Your GAN is Secretly an Energy-based Model and You Should use Discriminator Driven Latent Sampling [pdf] [pdf with comments]
- Tong Che, Ruixiang Zhang, Jascha Sohl-Dickstein, Hugo Larochelle, Liam Paull, Yuan Cao, Yoshua Bengio
2020-03-12, NeurIPS 2020
- [Energy-Based Models]
- No MCMC for Me: Amortized Sampling for Fast and Stable Training of Energy-Based Models [pdf] [code] [pdf with comments]
- Will Grathwohl, Jacob Kelly, Milad Hashemi, Mohammad Norouzi, Kevin Swersky, David Duvenaud
2020-10-08, ICLR 2021
- [Energy-Based Models]
- VAEBM: A Symbiosis between Variational Autoencoders and Energy-based Models [pdf] [pdf with comments]
- Zhisheng Xiao, Karsten Kreis, Jan Kautz, Arash Vahdat
2020-10-01, ICLR 2021
- [Energy-Based Models] [VAEs]
- Denoising Diffusion Probabilistic Models [pdf] [code] [pdf with comments] [comments]
- Jonathan Ho, Ajay Jain, Pieter Abbeel
2020-06-19
- [Energy-Based Models]
- Joint Training of Variational Auto-Encoder and Latent Energy-Based Model [pdf] [code] [pdf with comments] [comments]
- Tian Han, Erik Nijkamp, Linqi Zhou, Bo Pang, Song-Chun Zhu, Ying Nian Wu
2020-06-10, CVPR 2020
- [VAEs] [Energy-Based Models]
- A Connection Between Score Matching and Denoising Autoencoders [pdf] [pdf with comments] [comments]
- Pascal Vincent
2010-12
- [Energy-Based Models]
- Estimation of Non-Normalized Statistical Models by Score Matching [pdf] [pdf with comments] [comments]
- Aapo Hyvärinen
2004-11, JMLR 6
- [Energy-Based Models]
- Generative Modeling by Estimating Gradients of the Data Distribution [pdf] [code] [poster] [pdf with comments] [comments]
- Yang Song, Stefano Ermon
2019-07-12, NeurIPS 2019
- [Energy-Based Models]
- Noise-contrastive estimation: A new estimation principle for unnormalized statistical models [pdf] [pdf with comments] [comments]
- Michael Gutmann, Aapo Hyvärinen
2009, AISTATS 2010
- [Energy-Based Models]
- Maximum Entropy Generators for Energy-Based Models [pdf] [code] [pdf with comments] [comments]
- Rithesh Kumar, Sherjil Ozair, Anirudh Goyal, Aaron Courville, Yoshua Bengio
2019-01-24
- [Energy-Based Models]
- Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One [pdf] [pdf with comments] [comments]
- Will Grathwohl, Kuan-Chieh Wang, Jörn-Henrik Jacobsen, David Duvenaud, Mohammad Norouzi, Kevin Swersky
2019-12-06, ICLR 2020
- [Energy-Based Models]
- Noise Contrastive Estimation and Negative Sampling for Conditional Models: Consistency and Statistical Efficiency [pdf] [pdf with comments] [comments]
- Zhuang Ma, Michael Collins
2018-09-06, EMNLP 2018
- [Energy-Based Models]
- Flow Contrastive Estimation of Energy-Based Models [pdf] [pdf with comments] [comments]
- Ruiqi Gao, Erik Nijkamp, Diederik P. Kingma, Zhen Xu, Andrew M. Dai, Ying Nian Wu
2019-12-02, CVPR 2020
- [Energy-Based Models] [Normalizing Flows]
- On the Anatomy of MCMC-Based Maximum Likelihood Learning of Energy-Based Models [pdf] [code] [pdf with comments] [comments]
- Erik Nijkamp, Mitch Hill, Tian Han, Song-Chun Zhu, Ying Nian Wu
2019-04-29, AAAI 2020
- [Energy-Based Models]
- Implicit Generation and Generalization in Energy-Based Models [pdf] [code] [blog] [pdf with comments] [comments]
- Yilun Du, Igor Mordatch
2019-04-20, NeurIPS 2019
- [Energy-Based Models]
- Learning Non-Convergent Non-Persistent Short-Run MCMC Toward Energy-Based Model [pdf] [poster] [pdf with comments] [comments]
- Erik Nijkamp, Mitch Hill, Song-Chun Zhu, Ying Nian Wu
2019-04-22, NeurIPS 2019
- [Energy-Based Models]
- A Tutorial on Energy-Based Learning [pdf] [pdf with comments] [comments]
- Yann LeCun, Sumit Chopra, Raia Hadsell, Marc Aurelio Ranzato, Fu Jie Huang
2006-08-19
- [Energy-Based Models]
- Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling [pdf] [code] [annotated pdf]
- Gregory W. Benton, Wesley J. Maddox, Sanae Lotfi, Andrew Gordon Wilson
2021-02-25, ICML 2021
- [Uncertainty Estimation] [Ensembling]
- BatchEnsemble: An Alternative Approach to Efficient Ensemble and Lifelong Learning [pdf] [code] [video] [pdf with comments] [comments]
- Yeming Wen, Dustin Tran, Jimmy Ba
2020-02-17, ICLR 2020
- [Uncertainty Estimation] [Ensembling]
- Pitfalls of In-Domain Uncertainty Estimation and Ensembling in Deep Learning [pdf] [code] [pdf with comments] [comments]
- Arsenii Ashukha, Alexander Lyzhov, Dmitry Molchanov, Dmitry Vetrov
2020-02-15, ICLR 2020
- [Uncertainty Estimation] [Ensembling] [Stochastic Gradient MCMC]
- Bayesian Deep Learning and a Probabilistic Perspective of Generalization [pdf] [code] [pdf with comments] [comments]
- Andrew Gordon Wilson, Pavel Izmailov
2020-02-20
- [Uncertainty Estimation] [Ensembling]
- Deep Reinforcement Learning in a Handful of Trials using Probabilistic Dynamics Models [pdf] [poster] [video] [code] [pdf with comments] [summary]
- Kurtland Chua, Roberto Calandra, Rowan McAllister, Sergey Levine
2018-05-30, NeurIPS2018
- Uncertainty Estimates and Multi-Hypotheses Networks for Optical Flow [pdf] [pdf with comments] [summary]
- Eddy Ilg, Özgün Çiçek, Silvio Galesso, Aaron Klein, Osama Makansi, Frank Hutter, Thomas Brox
2018-08-06, ECCV2018
- Large-Scale Visual Active Learning with Deep Probabilistic Ensembles [pdf] [pdf with comments] [summary]
- Kashyap Chitta, Jose M. Alvarez, Adam Lesnikowski
2018-11-08
- Uncertainty in Neural Networks: Bayesian Ensembling [pdf] [pdf with comments] [summary]
- Tim Pearce, Mohamed Zaki, Alexandra Brintrup, Andy Neel
2018-10-12, AISTATS2019 submission
- Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles [pdf] [pdf with comments] [summary]
- Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell
2017-11-17, NeurIPS2017
- Deep Confidence: A Computationally Efficient Framework for Calculating Reliable Errors for Deep Neural Networks [pdf] [pdf with comments] [summary]
- Isidro Cortes-Ciriano, Andreas Bender
2018-09-24
- How Good is the Bayes Posterior in Deep Neural Networks Really? [pdf] [pdf with comments] [comments]
- Florian Wenzel, Kevin Roth, Bastiaan S. Veeling, Jakub Świątkowski, Linh Tran, Stephan Mandt, Jasper Snoek, Tim Salimans, Rodolphe Jenatton, Sebastian Nowozin
2020-02-06
- [Uncertainty Estimation] [Stochastic Gradient MCMC]
- Pitfalls of In-Domain Uncertainty Estimation and Ensembling in Deep Learning [pdf] [code] [pdf with comments] [comments]
- Arsenii Ashukha, Alexander Lyzhov, Dmitry Molchanov, Dmitry Vetrov
2020-02-15, ICLR 2020
- [Uncertainty Estimation] [Ensembling] [Stochastic Gradient MCMC]
- Stochastic Gradient Descent as Approximate Bayesian Inference [pdf] [pdf with comments] [comments]
- Stephan Mandt, Matthew D. Hoffman, David M. Blei
2017-04-13, Journal of Machine Learning Research 18 (2017)
- Cyclical Stochastic Gradient MCMC for Bayesian Deep Learning [pdf] [code] [pdf with comments] [comments]
- Ruqi Zhang, Chunyuan Li, Jianyi Zhang, Changyou Chen, Andrew Gordon Wilson
2019-02-11
- Learning Weight Uncertainty with Stochastic Gradient MCMC for Shape Classification [pdf] [poster] [pdf with comments] [comments]
- Chunyuan Li, Andrew Stevens, Changyou Chen, Yunchen Pu, Zhe Gan, Lawrence Carin
CVPR2016
- Meta-Learning For Stochastic Gradient MCMC [pdf] [code] [slides] [pdf with comments] [summary (TODO!)]
- Wenbo Gong, Yingzhen Li, José Miguel Hernández-Lobato
2018-10-28, ICLR2019
- A Complete Recipe for Stochastic Gradient MCMC [pdf] [pdf with comments] [summary]
- Yi-An Ma, Tianqi Chen, Emily B. Fox
2015-06-15, NeurIPS2015
- Tutorial: Introduction to Stochastic Gradient Markov Chain Monte Carlo Methods [pdf] [pdf with comments]
- Changyou Chen
2016-08-10
- Stochastic Gradient Hamiltonian Monte Carlo [pdf] [pdf with comments] [summary (TODO!)]
- Tianqi Chen, Emily B. Fox, Carlos Guestrin
2014-05-12, ICML2014
- Bayesian Learning via Stochastic Gradient Langevin Dynamics [pdf] [pdf with comments] [summary (TODO!)]
- Max Welling, Yee Whye Teh
ICML2011
- Efficient and Scalable Bayesian Neural Nets with Rank-1 Factors [pdf] [code] [pdf with comments] [comments]
- Michael W. Dusenberry, Ghassen Jerfel, Yeming Wen, Yi-an Ma, Jasper Snoek, Katherine Heller, Balaji Lakshminarayanan, Dustin Tran
2020-05-14, ICML 2020
- [Uncertainty Estimation] [Variational Inference]
- Practical Deep Learning with Bayesian Principles [pdf] [code] [pdf with comments] [comments]
- Kazuki Osawa, Siddharth Swaroop, Anirudh Jain, Runa Eschenhagen, Richard E. Turner, Rio Yokota, Mohammad Emtiyaz Khan
2019-06-06, NeurIPS 2019
- [Uncertainty Estimation] [Variational Inference]
- Noisy Natural Gradient as Variational Inference [pdf] [video] [code] [pdf with comments] [comments]
- Guodong Zhang, Shengyang Sun, David Duvenaud, Roger Grosse
2017-12-06, ICML2018
- Practical Variational Inference for Neural Networks [pdf] [pdf with comments] [comments]
- Alex Graves
NeurIPS2011
- Weight Uncertainty in Neural Networks [pdf] [pdf with comments] [comments]
- Charles Blundell, Julien Cornebise, Koray Kavukcuoglu, Daan Wierstra
2015-05-20, ICML2015
- PixelTransformer: Sample Conditioned Signal Generation [pdf] [code] [annotated pdf]
- Shubham Tulsiani, Abhinav Gupta
2021-03-29, ICML 2021
- [Neural Processes] [Transformers]
- Convolutional Conditional Neural Processes [pdf] [code] [pdf with comments] [comments]
- Jonathan Gordon, Wessel P. Bruinsma, Andrew Y. K. Foong, James Requeima, Yann Dubois, Richard E. Turner
2019-10-29, ICLR 2020
- [Neural Processes]
- Neural Processes [pdf] [pdf with comments] [summary]
- Marta Garnelo, Jonathan Schwarz, Dan Rosenbaum, Fabio Viola, Danilo J. Rezende, S.M. Ali Eslami, Yee Whye Teh
2018-07-04, ICML2018 Workshop
- Conditional Neural Processes [pdf] [pdf with comments] [summary]
- Marta Garnelo, Dan Rosenbaum, Chris J. Maddison, Tiago Ramalho, David Saxton, Murray Shanahan, Yee Whye Teh, Danilo J. Rezende, S. M. Ali Eslami
2018-07-04, ICML2018
- Stiff Neural Ordinary Differential Equations [pdf] [annotated pdf]
- Suyong Kim, Weiqi Ji, Sili Deng, Yingbo Ma, Christopher Rackauckas
2021-03-29
- [Neural ODEs]
- Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations [pdf] [pdf with comments]
- Winnie Xu, Ricky T.Q. Chen, Xuechen Li, David Duvenaud
2021-02-12
- [Neural ODEs] [Uncertainty Estimation]
- Score-Based Generative Modeling through Stochastic Differential Equations [pdf] [code] [pdf with comments]
- Yang Song, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole
2020-11-26, ICLR 2021
- [Neural ODEs]
- Dissecting Neural ODEs [pdf] [pdf with comments]
- Stefano Massaroli, Michael Poli, Jinkyoo Park, Atsushi Yamashita, Hajime Asama
2020-02-19, NeurIPS 2020
- [Neural ODEs]
- Neural Ordinary Differential Equations [pdf] [code] [slides] [pdf with comments] [summary]
- Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt, David Duvenaud
2018-10-22, NeurIPS2018
- [Neural ODEs]
- Transformers Can Do Bayesian Inference [pdf] [code] [annotated pdf]
- Samuel Müller, Noah Hollmann, Sebastian Pineda Arango, Josif Grabocka, Frank Hutter
2021-12-20, ICLR 2022
- [Transformers]
Quite interesting and well-written paper. I did however find it difficult to properly understand everything, it feels like a lot of details are omitted (I wouldn't really know how to actually implement this in practice). It's difficult for me to judge how impressive the results are or how practically useful this approach actually might be, what limitations are there? Overall though, it does indeed seem quite interesting.
- PixelTransformer: Sample Conditioned Signal Generation [pdf] [code] [annotated pdf]
- Shubham Tulsiani, Abhinav Gupta
2021-03-29, ICML 2021
- [Neural Processes] [Transformers]
- Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention [pdf] [pdf with comments]
- Angelos Katharopoulos, Apoorv Vyas, Nikolaos Pappas, François Fleuret
2020-06-29, ICML 2020
- [Transformers]
- Rethinking Attention with Performers [pdf] [pdf with comments]
- Krzysztof Choromanski, Valerii Likhosherstov, David Dohan, Xingyou Song, Andreea Gane, Tamas Sarlos, Peter Hawkins, Jared Davis, Afroz Mohiuddin, Lukasz Kaiser, David Belanger, Lucy Colwell, Adrian Weller
2020-10-30, ICLR 2021
- [Transformers]
- UncertaINR: Uncertainty Quantification of End-to-End Implicit Neural Representations for Computed Tomography [pdf] [annotated pdf]
- Francisca Vasconcelos, Bobby He, Nalini Singh, Yee Whye Teh
2022-02-22
- [Implicit Neural Representations] [Uncertainty Estimation] [Medical ML]
Interesting and well-written paper. I wasn't very familiar with CT image reconstruction, but they do a good job explaining everything. Interesting that MC-dropout seems important for getting well-calibrated predictions.
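For reference, a minimal sketch of generic MC-dropout at test time (keep dropout active and average over a number of stochastic forward passes); this is just the standard recipe on a toy network, not their specific INR setup:
```python
import torch
import torch.nn as nn

def mc_dropout_predict(model, x, num_samples=50):
    # Keep dropout active at test time by staying in train mode,
    # then average the stochastic forward passes.
    model.train()
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(num_samples)])
    return preds.mean(dim=0), preds.var(dim=0)  # predictive mean and variance

# Toy regression network with dropout, dummy inputs.
model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Dropout(p=0.2), nn.Linear(64, 1))
mean, var = mc_dropout_predict(model, torch.randn(8, 16))
print(mean.shape, var.shape)
```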
- Neural Unsigned Distance Fields for Implicit Function Learning [pdf] [code] [annotated pdf]
- Julian Chibane, Aymen Mir, Gerard Pons-Moll
2020-10-26, NeurIPS 2020
- [Implicit Neural Representations]
Interesting and very well-written paper, I really enjoyed reading it! The paper also gives a good understanding of neural implicit representations in general.
- imGHUM: Implicit Generative Models of 3D Human Shape and Articulated Pose [pdf] [code] [annotated pdf]
- Thiemo Alldieck, Hongyi Xu, Cristian Sminchisescu
2021-08-24, ICCV 2021
- [3D Human Pose Estimation] [Implicit Neural Representations]
Interesting and very well-written paper, I really enjoyed reading it. Interesting combination of implicit representations and 3D human modelling. The "inclusive human modelling" application is neat and important.
- DI-Fusion: Online Implicit 3D Reconstruction with Deep Priors [pdf] [code] [annotated pdf]
- Jiahui Huang, Shi-Sheng Huang, Haoxuan Song, Shi-Min Hu
2020-12-10, CVPR 2021
- [Implicit Neural Representations]
Well-written and interesting paper, I enjoyed reading it. Neat application of implicit representations. The paper also gives a quite good overview of online 3D reconstruction in general.
- Local Implicit Grid Representations for 3D Scenes [pdf] [code] [annotated pdf]
- Chiyu Max Jiang, Avneesh Sud, Ameesh Makadia, Jingwei Huang, Matthias Nießner, Thomas Funkhouser
2020-03-19, CVPR 2020
- [Implicit Neural Representations]
Well-written and quite interesting paper. Interesting application, being able to reconstruct full 3D scenes from sparse point clouds. I didn't fully understand everything, as I don't have a particularly strong graphics background.
- NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis [pdf] [code] [annotated pdf]
- Ben Mildenhall, Pratul P. Srinivasan, Matthew Tancik, Jonathan T. Barron, Ravi Ramamoorthi, Ren Ng
2020-03-19, ECCV 2020
- [Implicit Neural Representations]
Extremely well-written and interesting paper. I really enjoyed reading it, and I would recommend anyone interested in computer vision to read it as well.
All parts of the proposed method are clearly explained and relatively easy to understand, including the volume rendering techniques which I was unfamiliar with.
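For my own reference, the volume rendering quadrature from the paper: each ray is composited from the sampled colors c_i and densities sigma_i (with delta_i the distance between adjacent samples) as
```latex
\hat{C}(\mathbf{r}) = \sum_{i=1}^{N} T_i \left(1 - e^{-\sigma_i \delta_i}\right) \mathbf{c}_i,
\qquad
T_i = \exp\left(-\sum_{j=1}^{i-1} \sigma_j \delta_j\right)
```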
- DeepSDF: Learning Continuous Signed Distance Functions for Shape Representation [pdf] [code] [annotated pdf]
- Jeong Joon Park, Peter Florence, Julian Straub, Richard Newcombe, Steven Lovegrove
2019-01-16, CVPR 2019
- [Implicit Neural Representations]
- Implicit Gradient Regularization [pdf] [pdf with comments] [comments]
- David G.T. Barrett, Benoit Dherin
2020-09-23
- [Theoretical Properties of Deep Learning]
- Satellite Conjunction Analysis and the False Confidence Theorem [pdf] [pdf with comments] [comments]
- Michael Scott Balch, Ryan Martin, Scott Ferson
2018-03-21
- Simple and Principled Uncertainty Estimation with Deterministic Deep Learning via Distance Awareness [pdf] [pdf with comments] [comments]
- Jeremiah Zhe Liu, Zi Lin, Shreyas Padhy, Dustin Tran, Tania Bedrax-Weiss, Balaji Lakshminarayanan
2020-06-17, NeurIPS 2020
- [Uncertainty Estimation]
- Uncertainty Estimation Using a Single Deep Deterministic Neural Network [pdf] [code] [pdf with comments] [comments]
- Joost van Amersfoort, Lewis Smith, Yee Whye Teh, Yarin Gal
2020-03-04, ICML 2020
- [Uncertainty Estimation]
- Gated Linear Networks [pdf] [pdf with comments] [comments]
- Joel Veness, Tor Lattimore, David Budden, Avishkar Bhoopchand, Christopher Mattern, Agnieszka Grabska-Barwinska, Eren Sezener, Jianan Wang, Peter Toth, Simon Schmitt, Marcus Hutter
2020-06-11
- Denoising Diffusion Probabilistic Models [pdf] [code] [pdf with comments] [comments]
- Jonathan Ho, Ajay Jain, Pieter Abbeel
2020-06-19
- [Energy-Based Models]
- Joint Training of Variational Auto-Encoder and Latent Energy-Based Model [pdf] [code] [pdf with comments] [comments]
- Tian Han, Erik Nijkamp, Linqi Zhou, Bo Pang, Song-Chun Zhu, Ying Nian Wu
2020-06-10, CVPR 2020
- [VAEs] [Energy-Based Models]
- End-to-End Object Detection with Transformers [pdf] [code] [pdf with comments] [comments]
- Nicolas Carion, Francisco Massa, Gabriel Synnaeve, Nicolas Usunier, Alexander Kirillov, Sergey Zagoruyko
2020-05-26, ECCV 2020
- [Object Detection]
- Efficient and Scalable Bayesian Neural Nets with Rank-1 Factors [pdf] [code] [pdf with comments] [comments]
- Michael W. Dusenberry, Ghassen Jerfel, Yeming Wen, Yi-an Ma, Jasper Snoek, Katherine Heller, Balaji Lakshminarayanan, Dustin Tran
2020-05-14, ICML 2020
- [Uncertainty Estimation] [Variational Inference]
- BatchEnsemble: An Alternative Approach to Efficient Ensemble and Lifelong Learning [pdf] [code] [video] [pdf with comments] [comments]
- Yeming Wen, Dustin Tran, Jimmy Ba
2020-02-17, ICLR 2020
- [Uncertainty Estimation] [Ensembling]
- Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One [pdf] [pdf with comments] [comments]
- Will Grathwohl, Kuan-Chieh Wang, Jörn-Henrik Jacobsen, David Duvenaud, Mohammad Norouzi, Kevin Swersky
2019-12-06, ICLR 2020
- [Energy-Based Models]
- Stable Neural Flows [pdf] [pdf with comments] [comments]
- Stefano Massaroli, Michael Poli, Michelangelo Bin, Jinkyoo Park, Atsushi Yamashita, Hajime Asama
2020-03-18
- How Good is the Bayes Posterior in Deep Neural Networks Really? [pdf] [pdf with comments] [comments]
- Florian Wenzel, Kevin Roth, Bastiaan S. Veeling, Jakub Świątkowski, Linh Tran, Stephan Mandt, Jasper Snoek, Tim Salimans, Rodolphe Jenatton, Sebastian Nowozin
2020-02-06
- [Uncertainty Estimation] [Stochastic Gradient MCMC]
- Beyond temperature scaling: Obtaining well-calibrated multiclass probabilities with Dirichlet calibration [pdf] [code] [poster] [slides] [video] [pdf with comments] [comments]
- Meelis Kull, Miquel Perello-Nieto, Markus Kängsepp, Telmo Silva Filho, Hao Song, Peter Flach
2019-10-28, NeurIPS 2019
- [Uncertainty Estimation]
- Normalizing Flows: An Introduction and Review of Current Methods [pdf] [pdf with comments] [comments]
- Ivan Kobyzev, Simon Prince, Marcus A. Brubaker
2019-08-25
- [Normalizing Flows]
- Pitfalls of In-Domain Uncertainty Estimation and Ensembling in Deep Learning [pdf] [code] [pdf with comments] [comments]
- Arsenii Ashukha, Alexander Lyzhov, Dmitry Molchanov, Dmitry Vetrov
2020-02-15, ICLR 2020
- [Uncertainty Estimation] [Ensembling] [Stochastic Gradient MCMC]
- Conservative Uncertainty Estimation By Fitting Prior Networks [pdf] [pdf with comments] [comments]
- Kamil Ciosek, Vincent Fortuin, Ryota Tomioka, Katja Hofmann, Richard Turner
2019-10-25, ICLR 2020
- [Uncertainty Estimation]
- Batch Normalization Biases Deep Residual Networks Towards Shallow Paths [pdf] [pdf with comments] [comments]
- Soham De, Samuel L. Smith
2020-02-24
- [Theoretical Properties of Deep Learning]
- Bayesian Deep Learning and a Probabilistic Perspective of Generalization [pdf] [code] [pdf with comments] [comments]
- Andrew Gordon Wilson, Pavel Izmailov
2020-02-20
- [Uncertainty Estimation] [Ensembling]
- Convolutional Conditional Neural Processes [pdf] [code] [pdf with comments] [comments]
- Jonathan Gordon, Wessel P. Bruinsma, Andrew Y. K. Foong, James Requeima, Yann Dubois, Richard E. Turner
2019-10-29, ICLR 2020
- [Neural Processes]
- Decomposition of Uncertainty in Bayesian Deep Learning for Efficient and Risk-sensitive Learning [pdf] [pdf with comments] [comments]
- Stefan Depeweg, José Miguel Hernández-Lobato, Finale Doshi-Velez, Steffen Udluft
2017-10-19, ICML 2018
- [Uncertainty Estimation] [Reinforcement Learning]
- Uncertainty Decomposition in Bayesian Neural Networks with Latent Variables [pdf] [pdf with comments] [comments]
- Stefan Depeweg, José Miguel Hernández-Lobato, Finale Doshi-Velez, Steffen Udluft
2017-06-26
- [Uncertainty Estimation] [Reinforcement Learning]
- Modelling heterogeneous distributions with an Uncountable Mixture of Asymmetric Laplacians [pdf] [code] [video] [pdf with comments] [comments]
- Axel Brando, Jose A. Rodríguez-Serrano, Jordi Vitrià, Alberto Rubio
2019-10-27, NeurIPS 2019
- [Uncertainty Estimation]
- A Primal-Dual link between GANs and Autoencoders [pdf] [poster] [pdf with comments] [comments]
- Hisham Husain, Richard Nock, Robert C. Williamson
2019-04-26, NeurIPS 2019
- [Theoretical Properties of Deep Learning]
- Multiplicative Interactions and Where to Find Them [pdf] [pdf with comments] [comments]
- Siddhant M. Jayakumar, Jacob Menick, Wojciech M. Czarnecki, Jonathan Schwarz, Jack Rae, Simon Osindero, Yee Whye Teh, Tim Harley, Razvan Pascanu
2019-09-25, ICLR 2020
- [Theoretical Properties of Deep Learning] [Sequence Modeling]
- Z-Forcing: Training Stochastic Recurrent Networks [pdf] [code] [pdf with comments] [comments]
- Anirudh Goyal, Alessandro Sordoni, Marc-Alexandre Côté, Nan Rosemary Ke, Yoshua Bengio
2017-11-15, NeurIPS 2017
- [VAEs] [Sequence Modeling]
- Dream to Control: Learning Behaviors by Latent Imagination [pdf] [webpage] [pdf with comments] [comments]
- Anonymous
2019-09
- Learning Latent Dynamics for Planning from Pixels [pdf] [code] [blog] [pdf with comments] [comments]
- Danijar Hafner, Timothy Lillicrap, Ian Fischer, Ruben Villegas, David Ha, Honglak Lee, James Davidson
2018-11-12, ICML2019
- Learning nonlinear state-space models using deep autoencoders [pdf] [pdf with comments] [comments]
- Daniele Masti, Alberto Bemporad
2018, CDC2018
- Weight Uncertainty in Neural Networks [pdf] [pdf with comments] [comments]
- Charles Blundell, Julien Cornebise, Koray Kavukcuoglu, Daan Wierstra
2015-05-20, ICML2015
- Neural Autoregressive Flows [pdf] [pdf with comments] [summary]
- Chin-Wei Huang, David Krueger, Alexandre Lacoste, Aaron Courville
2018-04-03, ICML2018
- Improving Variational Inference with Inverse Autoregressive Flow [pdf] [code] [pdf with comments] [comments]
- Diederik P. Kingma, Tim Salimans, Rafal Jozefowicz, Xi Chen, Ilya Sutskever, Max Welling
2016-06-15, NeurIPS2016
- Variational Inference with Normalizing Flows [pdf] [pdf with comments] [comments]
- Danilo Jimenez Rezende, Shakir Mohamed
2015-05-21, ICML2015
- Trellis Networks for Sequence Modeling [pdf] [code] [pdf with comments] [comments]
- Shaojie Bai, J. Zico Kolter, Vladlen Koltun
2018-10-15, ICLR2019
- LaserNet: An Efficient Probabilistic 3D Object Detector for Autonomous Driving [pdf] [pdf with comments] [comments]
- Gregory P. Meyer, Ankit Laddha, Eric Kee, Carlos Vallespi-Gonzalez, Carl K. Wellington
2019-03-20, CVPR2019
- Attention Is All You Need [pdf] [pdf with comments] [comments]
- Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin
2017-06-12, NeurIPS2017
- Visualizing the Loss Landscape of Neural Nets [pdf] [code] [pdf with comments] [comments]
- Hao Li, Zheng Xu, Gavin Taylor, Christoph Studer, Tom Goldstein
2017-12-28, NeurIPS2018
- Stochastic Gradient Descent as Approximate Bayesian Inference [pdf] [pdf with comments] [comments]
- Stephan Mandt, Matthew D. Hoffman, David M. Blei
2017-04-13, Journal of Machine Learning Research 18 (2017)
- Generating High Fidelity Images with Subscale Pixel Networks and Multidimensional Upscaling [pdf] [pdf with comments] [comments]
- Jacob Menick, Nal Kalchbrenner
2018-12-04, ICLR2019
- Evaluating model calibration in classification [pdf] [code] [pdf with comments] [comments]
- Juozas Vaicenavicius, David Widmann, Carl Andersson, Fredrik Lindsten, Jacob Roll, Thomas B. Schön
2019-02-19, AISTATS2019
- A recurrent neural network without chaos [pdf] [pdf with comments] [comments]
- Thomas Laurent, James von Brecht
2016-12-19, ICLR2017
- Coupled Variational Bayes via Optimization Embedding [pdf] [poster] [code] [pdf with comments] [comments]
- Bo Dai, Hanjun Dai, Niao He, Weiyang Liu, Zhen Liu, Jianshu Chen, Lin Xiao, Le Song
NeurIPS2018
- Language Models are Unsupervised Multitask Learners [pdf] [blog post] [code] [pdf with comments] [comments]
- Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever
2019-02-14
- Fine-Grained Analysis of Optimization and Generalization for Overparameterized Two-Layer Neural Networks [pdf] [pdf with comments] [comments]
- Sanjeev Arora, Simon S. Du, Wei Hu, Zhiyuan Li, Ruosong Wang
2019-01-24
- A Simple Baseline for Bayesian Uncertainty in Deep Learning [pdf] [code] [pdf with comments] [comments]
- Wesley Maddox, Timur Garipov, Pavel Izmailov, Dmitry Vetrov, Andrew Gordon Wilson
2019-02-07
- Deep Reinforcement Learning in a Handful of Trials using Probabilistic Dynamics Models [pdf] [poster] [video] [code] [pdf with comments] [summary]
- Kurtland Chua, Roberto Calandra, Rowan McAllister, Sergey Levine
2018-05-30, NeurIPS2018
- A Complete Recipe for Stochastic Gradient MCMC [pdf] [pdf with comments] [summary]
- Yi-An Ma, Tianqi Chen, Emily B. Fox
2015-06-15, NeurIPS2015
- An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling [pdf] [code] [pdf with comments] [summary]
- Shaojie Bai, J. Zico Kolter, Vladlen Koltun
2018-04-19
- How Does Batch Normalization Help Optimization? [pdf] [poster] [video] [pdf with comments] [summary]
- Shibani Santurkar, Dimitris Tsipras, Andrew Ilyas, Aleksander Madry
2018-10-27, NeurIPS2018
- Neural Processes [pdf] [pdf with comments] [summary]
- Marta Garnelo, Jonathan Schwarz, Dan Rosenbaum, Fabio Viola, Danilo J. Rezende, S.M. Ali Eslami, Yee Whye Teh
2018-07-04, ICML2018 Workshop
- Neural Ordinary Differential Equations [pdf] [code] [slides] [pdf with comments] [summary]
- Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt, David Duvenaud
2018-10-22, NeurIPS2018
- Evidential Deep Learning to Quantify Classification Uncertainty [pdf] [poster] [code example] [pdf with comments] [summary]
- Murat Sensoy, Lance Kaplan, Melih Kandemir
2018-10-31, NeurIPS2018
- A Probabilistic U-Net for Segmentation of Ambiguous Images [pdf] [code] [pdf with comments] [summary]
- Simon A. A. Kohl, Bernardino Romera-Paredes, Clemens Meyer, Jeffrey De Fauw, Joseph R. Ledsam, Klaus H. Maier-Hein, S. M. Ali Eslami, Danilo Jimenez Rezende, Olaf Ronneberger
2018-10-29, NeurIPS2018
- When Recurrent Models Don't Need To Be Recurrent (a.k.a. Stable Recurrent Models) [pdf] [pdf with comments] [summary]
- John Miller, Moritz Hardt
2018-05-29, ICLR2019
- Interpretability Beyond Feature Attribution: Quantitative Testing with Concept Activation Vectors (TCAV) [pdf] [pdf with comments] [summary]
- Been Kim, Martin Wattenberg, Justin Gilmer, Carrie Cai, James Wexler, Fernanda Viegas, Rory Sayres
2018-06-07, ICML2018
- The Lottery Ticket Hypothesis: Finding Small, Trainable Neural Networks [pdf] [pdf with comments] [summary]
- Jonathan Frankle, Michael Carbin
2018-03-09, ICLR2019
- Conditional Neural Processes [pdf] [pdf with comments] [summary]
- Marta Garnelo, Dan Rosenbaum, Chris J. Maddison, Tiago Ramalho, David Saxton, Murray Shanahan, Yee Whye Teh, Danilo J. Rezende, S. M. Ali Eslami
2018-07-04, ICML2018
- Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes [pdf] [pdf with comments] [summary]
- Roman Novak, Lechao Xiao, Jaehoon Lee, Yasaman Bahri, Daniel A. Abolafia, Jeffrey Pennington, Jascha Sohl-Dickstein
2018-10-11, ICLR2019
- On gradient regularizers for MMD GANs [pdf] [pdf with comments] [summary]
- Michael Arbel, Dougal J. Sutherland, Mikołaj Bińkowski, Arthur Gretton
2018-05-29, NeurIPS2018
- Neural Autoregressive Flows [pdf] [pdf with comments] [summary]
- Chin-Wei Huang, David Krueger, Alexandre Lacoste, Aaron Courville
2018-04-03, ICML2018
- Gaussian Process Behaviour in Wide Deep Neural Networks [pdf] [pdf with comments] [summary]
- Alexander G. de G. Matthews, Mark Rowland, Jiri Hron, Richard E. Turner, Zoubin Ghahramani
2018-08-16, ICLR2018
- The Continuous-Discrete Time Feedback Particle Filter [pdf]
- Tao Yang, Henk A. P. Blom, Prashant G. Mehta
2014, American Control Conference
- Feedback Particle Filter [pdf]
- Tao Yang, Prashant G. Mehta, Sean P. Meyn
2013, IEEE Transactions on Automatic Control
- Markov Chains for Exploring Posterior Distributions [pdf] [pdf with comments]
- Luke Tierney
1994-12, The Annals of Statistics
- Particle Gibbs with Ancestor Sampling [pdf]
- Fredrik Lindsten, Michael I. Jordan, Thomas B. Schön
2014-06-14, Journal of Machine Learning Research
- Particle Markov chain Monte Carlo methods [pdf]
- Christophe Andrieu, Arnaud Doucet, Roman Holenstein
2010, Journal of the Royal Statistical Society
- State Space LSTM Models with Particle MCMC Inference [pdf]
- Xun Zheng, Manzil Zaheer, Amr Ahmed, Yuan Wang, Eric P Xing, Alexander J Smola
2017-11-30
- Rethinking the Effective Sample Size [pdf]
- Víctor Elvira, Luca Martino, Christian P. Robert
2018-09-11
- NeurIPS 2021
- NeurIPS 2020
- NeurIPS 2019
- NeurIPS 2018
- NeurIPS 2017
- NeurIPS 2016
- NeurIPS 2015
- NeurIPS 2011
- Deep Learning Through the Lens of Example Difficulty [pdf] [annotated pdf]
- Robert John Nicholas Baldock, Hartmut Maennel, Behnam Neyshabur
2021-05-21, NeurIPS 2021
- [Theoretical Properties of Deep Learning]
Quite interesting and well-written paper. The definition of "prediction depth" in Section 2.1 makes sense, and it definitely seems reasonable that this could correlate with example difficulty / prediction confidence in some way. Section 3 and 4, and all the figures, contain a lot of info it seems, I'd probably need to read the paper again to properly understand/appreciate everything.
- Laplace Redux -- Effortless Bayesian Deep Learning [pdf] [code] [annotated pdf]
- Erik Daxberger, Agustinus Kristiadi, Alexander Immer, Runa Eschenhagen, Matthias Bauer, Philipp Hennig
2021-06-28, NeurIPS 2021
- [Uncertainty Estimation]
Interesting and very well-written paper, I enjoyed reading it. I still think that ensembling probably is quite difficult to beat purely in terms of uncertainty estimation quality, but this definitely seems like a useful tool in many situations. It's not clear to me if the analytical expression for regression in "4. Approximate Predictive Distribution" is applicable also if the variance is input-dependent?
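For reference, the analytical regression predictive in the linearized Laplace setting, as I understand it (J is the Jacobian of the network output w.r.t. the parameters at the MAP estimate, Sigma the Laplace posterior covariance, and sigma^2 a fixed observation noise variance; my question above is whether this still holds when sigma^2 is replaced by a predicted, input-dependent variance):
```latex
p(y^\star \mid x^\star, \mathcal{D}) \approx
\mathcal{N}\!\left(y^\star;\; f_{\hat{\theta}}(x^\star),\;
J(x^\star)^\top \Sigma\, J(x^\star) + \sigma^2\right)
```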
- Periodic Activation Functions Induce Stationarity [pdf] [code] [annotated pdf]
- Lassi Meronen, Martin Trapp, Arno Solin
2021-10-26, NeurIPS 2021
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Quite interesting and well-written paper. Quite a heavy read, probably need to be rather familiar with GPs to properly understand/appreciate everything. Definitely check Appendix D, it gives a better understanding of how the proposed method is applied in practice. I'm not quite sure how strong/impressive the experimental results actually are. Also seems like the method could be a bit inconvenient to implement/use?
- Reliable and Trustworthy Machine Learning for Health Using Dataset Shift Detection [pdf] [annotated pdf]
- Chunjong Park, Anas Awadalla, Tadayoshi Kohno, Shwetak Patel
2021-10-26, NeurIPS 2021
- [Out-of-Distribution Detection]
Interesting and very well-written paper. Gives a good overview of the field and contains a lot of seemingly useful references. The evaluation is very comprehensive. The user study is quite neat.
- An Information-theoretic Approach to Distribution Shifts [pdf] [code] [annotated pdf]
- Marco Federici, Ryota Tomioka, Patrick Forré
2021-06-07, NeurIPS 2021
- [Theoretical Properties of Deep Learning]
Quite well-written paper overall that seemed interesting, but I found it very difficult to properly understand everything. Thus, I can't really tell how interesting/significant their analysis actually is.
- On the Importance of Gradients for Detecting Distributional Shifts in the Wild [pdf] [code] [annotated pdf]
- Rui Huang, Andrew Geng, Yixuan Li
2021-10-01, NeurIPS 2021
- [Out-of-Distribution Detection]
Quite interesting and well-written paper. The experimental results do seem promising. However, I don't quite get why the proposed method intuitively makes sense, why is it better to only use the parameters of the final network layer?
- Revisiting the Calibration of Modern Neural Networks [pdf] [code] [annotated pdf]
- Matthias Minderer, Josip Djolonga, Rob Romijnders, Frances Hubis, Xiaohua Zhai, Neil Houlsby, Dustin Tran, Mario Lucic
2021-06-15, NeurIPS 2021
- [Uncertainty Estimation]
Well-written paper. Everything is quite clearly explained and easy to understand. Quite enjoyable to read overall.
Thorough experimental evaluation. Quite interesting findings.
- Deep Evidential Regression [pdf] [code] [annotated pdf]
- Alexander Amini, Wilko Schwarting, Ava Soleimany, Daniela Rus
2019-10-07, NeurIPS 2020
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Well-written and interesting paper. This is a good paper to read before "Natural Posterior Network: Deep Bayesian Predictive Uncertainty for Exponential Family Distributions". Their proposed method seems to have similar / slightly worse performance than a small ensemble, so the only real advantage is that it's faster at test-time? This is of course very important in many applications, but not in all. The performance also seems quite sensitive to the choice of lambda in the combined loss function (Equation (10)), according to Figure S2 in the appendix?
- Energy-based Out-of-distribution Detection [pdf] [code] [annotated pdf]
- Weitang Liu, Xiaoyun Wang, John D. Owens, Yixuan Li
2020-10-08, NeurIPS 2020
- [Out-of-Distribution Detection]
Interesting and well-written paper. The proposed method is quite clearly explained and makes intuitive sense (at least if you're familiar with EBMs). Compared to using the softmax score, the performance does seem to improve consistently. Seems like fine-tuning on an "auxiliary outlier dataset" is required to get really good performance though, which you can't really assume to have access to in real-world problems, I suppose?
- Neural Unsigned Distance Fields for Implicit Function Learning [pdf] [code] [annotated pdf]
- Julian Chibane, Aymen Mir, Gerard Pons-Moll
2020-10-26, NeurIPS 2020
- [Implicit Neural Representations]
Interesting and very well-written paper, I really enjoyed reading it! The paper also gives a good understanding of neural implicit representations in general.
- 3D Multi-bodies: Fitting Sets of Plausible 3D Human Models to Ambiguous Image Data [pdf] [annotated pdf]
- Benjamin Biggs, Sébastien Ehrhadt, Hanbyul Joo, Benjamin Graham, Andrea Vedaldi, David Novotny
2020-11-02, NeurIPS 2020
- [3D Human Pose Estimation]
- Your GAN is Secretly an Energy-based Model and You Should use Discriminator Driven Latent Sampling [pdf] [pdf with comments]
- Tong Che, Ruixiang Zhang, Jascha Sohl-Dickstein, Hugo Larochelle, Liam Paull, Yuan Cao, Yoshua Bengio
2020-03-12, NeurIPS 2020
- [Energy-Based Models]
- Unsupervised Learning of Visual Features by Contrasting Cluster Assignments [pdf] [code] [pdf with comments]
- Mathilde Caron, Ishan Misra, Julien Mairal, Priya Goyal, Piotr Bojanowski, Armand Joulin
2020-06-17, NeurIPS 2020
- Dissecting Neural ODEs [pdf] [pdf with comments]
- Stefano Massaroli, Michael Poli, Jinkyoo Park, Atsushi Yamashita, Hajime Asama
2020-02-19, NeurIPS 2020
- Simple and Principled Uncertainty Estimation with Deterministic Deep Learning via Distance Awareness [pdf] [pdf with comments] [comments]
- Jeremiah Zhe Liu, Zi Lin, Shreyas Padhy, Dustin Tran, Tania Bedrax-Weiss, Balaji Lakshminarayanan
2020-06-17, NeurIPS 2020
- [Uncertainty Estimation]
- Approximate Inference Turns Deep Networks into Gaussian Processes [pdf] [pdf with comments]
- Mohammad Emtiyaz Khan, Alexander Immer, Ehsan Abedi, Maciej Korzepa
2019-06-05, NeurIPS 2019
- [Theoretical Properties of Deep Learning]
- Beyond temperature scaling: Obtaining well-calibrated multiclass probabilities with Dirichlet calibration [pdf] [code] [poster] [slides] [video] [pdf with comments] [comments]
- Meelis Kull, Miquel Perello-Nieto, Markus Kängsepp, Telmo Silva Filho, Hao Song, Peter Flach
2019-10-28, NeurIPS 2019
- [Uncertainty Estimation]
- Modelling heterogeneous distributions with an Uncountable Mixture of Asymmetric Laplacians [pdf] [code] [video] [pdf with comments] [comments]
- Axel Brando, Jose A. Rodríguez-Serrano, Jordi Vitrià, Alberto Rubio
2019-10-27, NeurIPS 2019
- [Uncertainty Estimation]
- A Primal-Dual link between GANs and Autoencoders [pdf] [poster] [pdf with comments] [comments]
- Hisham Husain, Richard Nock, Robert C. Williamson
2019-04-26, NeurIPS 2019
- [Theoretical Properties of Deep Learning]
- Generative Modeling by Estimating Gradients of the Data Distribution [pdf] [code] [poster] [pdf with comments] [comments]
- Yang Song, Stefano Ermon
2019-07-12, NeurIPS 2019
- [Energy-Based Models]
- Practical Deep Learning with Bayesian Principles [pdf] [code] [pdf with comments] [comments]
- Kazuki Osawa, Siddharth Swaroop, Anirudh Jain, Runa Eschenhagen, Richard E. Turner, Rio Yokota, Mohammad Emtiyaz Khan
2019-06-06, NeurIPS 2019
- [Uncertainty Estimation] [Variational Inference]
- Implicit Generation and Generalization in Energy-Based Models [pdf] [code] [blog] [pdf with comments] [comments]
- Yilun Du, Igor Mordatch
2019-04-20, NeurIPS 2019
- [Energy-Based Models]
- Learning Non-Convergent Non-Persistent Short-Run MCMC Toward Energy-Based Model [pdf] [poster] [pdf with comments] [comments]
- Erik Nijkamp, Mitch Hill, Song-Chun Zhu, Ying Nian Wu
2019-04-22, NeurIPS 2019
- [Energy-Based Models]
- A Simple Unified Framework for Detecting Out-of-Distribution Samples and Adversarial Attacks [pdf] [code] [annotated pdf]
- Kimin Lee, Kibok Lee, Honglak Lee, Jinwoo Shin
2018-07-10, NeurIPS 2018
- [Out-of-Distribution Detection]
Well-written and interesting paper. The proposed method is simple and really neat: fit class-conditional Gaussians in the feature space of a pre-trained classifier (basically just LDA on the feature vectors), and then use the Mahalanobis distance to these Gaussians as the confidence score for input x. They then also do this for the features at multiple levels of the network and combine these confidence scores into one. I don't quite get why the "input pre-processing" in Section 2.2 (adding noise to test samples) works, in Table 1 it significantly improves the performance.
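A minimal sketch of the basic version of the confidence score as I understand it (single feature layer, no input pre-processing): fit per-class means and a shared covariance on the training features, then score a test feature by its negative Mahalanobis distance to the closest class mean. The features below are dummy data, just to illustrate the computation:
```python
import numpy as np

def fit_class_gaussians(feats, labels, num_classes):
    # Class-conditional Gaussians with a single shared ("tied") covariance, as in LDA.
    means = np.stack([feats[labels == c].mean(axis=0) for c in range(num_classes)])
    centered = feats - means[labels]
    cov = centered.T @ centered / len(feats)
    precision = np.linalg.inv(cov + 1e-6 * np.eye(feats.shape[1]))
    return means, precision

def mahalanobis_confidence(feats, means, precision):
    # Confidence = negative squared Mahalanobis distance to the closest class mean.
    diffs = feats[:, None, :] - means[None, :, :]              # (N, C, D)
    d2 = np.einsum('ncd,de,nce->nc', diffs, precision, diffs)  # (N, C)
    return -d2.min(axis=1)

rng = np.random.default_rng(0)
train_feats = rng.normal(size=(1000, 32))      # dummy penultimate-layer features
train_labels = rng.integers(0, 10, size=1000)
means, precision = fit_class_gaussians(train_feats, train_labels, 10)
print(mahalanobis_confidence(rng.normal(size=(5, 32)), means, precision))
```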
- Coupled Variational Bayes via Optimization Embedding [pdf] [poster] [code] [pdf with comments] [comments]
- Bo Dai, Hanjun Dai, Niao He, Weiyang Liu, Zhen Liu, Jianshu Chen, Lin Xiao, Le Song
NeurIPS2018
- Predictive Uncertainty Estimation via Prior Networks [pdf] [pdf with comments] [comments]
- Andrey Malinin, Mark Gales
2018-02-28, NeurIPS2018
- Visualizing the Loss Landscape of Neural Nets [pdf] [code] [pdf with comments] [comments]
- Hao Li, Zheng Xu, Gavin Taylor, Christoph Studer, Tom Goldstein
2017-12-28, NeurIPS2018
- Deep Reinforcement Learning in a Handful of Trials using Probabilistic Dynamics Models [pdf] [poster] [video] [code] [pdf with comments] [summary]
- Kurtland Chua, Roberto Calandra, Rowan McAllister, Sergey Levine
2018-05-30, NeurIPS2018
- How Does Batch Normalization Help Optimization? [pdf] [poster] [video] [pdf with comments] [summary]
- Shibani Santurkar, Dimitris Tsipras, Andrew Ilyas, Aleksander Madry
2018-10-27, NeurIPS2018
- Relaxed Softmax: Efficient Confidence Auto-Calibration for Safe Pedestrian Detection [pdf] [poster] [pdf with comments] [summary]
- Lukas Neumann, Andrew Zisserman, Andrea Vedaldi
2018-11-29, NeurIPS2018 Workshop
- Neural Ordinary Differential Equations [pdf] [code] [slides] [pdf with comments] [summary]
- Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt, David Duvenaud
2018-10-22, NeurIPS2018
- Evidential Deep Learning to Quantify Classification Uncertainty [pdf] [poster] [code example] [pdf with comments] [summary]
- Murat Sensoy, Lance Kaplan, Melih Kandemir
2018-10-31, NeurIPS2018
- A Probabilistic U-Net for Segmentation of Ambiguous Images [pdf] [code] [pdf with comments] [summary]
- Simon A. A. Kohl, Bernardino Romera-Paredes, Clemens Meyer, Jeffrey De Fauw, Joseph R. Ledsam, Klaus H. Maier-Hein, S. M. Ali Eslami, Danilo Jimenez Rezende, Olaf Ronneberger
2018-10-29, NeurIPS2018
- On gradient regularizers for MMD GANs [pdf] [pdf with comments] [summary]
- Michael Arbel, Dougal J. Sutherland, Mikołaj Bińkowski, Arthur Gretton
2018-05-29, NeurIPS2018
- Z-Forcing: Training Stochastic Recurrent Networks [pdf] [code] [pdf with comments] [comments]
- Anirudh Goyal, Alessandro Sordoni, Marc-Alexandre Côté, Nan Rosemary Ke, Yoshua Bengio
2017-11-15, NeurIPS 2017
- [VAEs] [Sequence Modeling]
- Attention Is All You Need [pdf] [pdf with comments] [comments]
- Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin
2017-06-12, NeurIPS2017
- Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles [pdf] [pdf with comments] [summary]
- Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell
2017-11-17, NeurIPS2017
- What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision? [pdf] [pdf with comments] [summary]
- Alex Kendall, Yarin Gal
2017-10-05, NeurIPS2017
- Improving Variational Inference with Inverse Autoregressive Flow [pdf] [code] [pdf with comments] [comments]
- Diederik P. Kingma, Tim Salimans, Rafal Jozefowicz, Xi Chen, Ilya Sutskever, Max Welling
2016-06-15, NeurIPS2016
- Bayesian Dark Knowledge [pdf] [pdf with comments] [comments]
- Anoop Korattikara, Vivek Rathod, Kevin Murphy, Max Welling
2015-06-07, NeurIPS2015
- A Complete Recipe for Stochastic Gradient MCMC [pdf] [pdf with comments] [summary]
- Yi-An Ma, Tianqi Chen, Emily B. Fox
2015-06-15, NeurIPS2015
- Practical Variational Inference for Neural Networks [pdf] [pdf with comments] [comments]
- Alex Graves
NeurIPS2011
- Hierarchical VAEs Know What They Don't Know [pdf] [code] [annotated pdf]
- Jakob D. Havtorn, Jes Frellsen, Søren Hauberg, Lars Maaløe
2021-02-16, ICML 2021
- [Uncertainty Estimation] [VAEs]
Very well-written and quite interesting paper, I enjoyed reading it. Everything is quite well-explained, it's relatively easy to follow. The paper provides a good overview of the out-of-distribution detection problem and current methods.
- Differentiable Particle Filtering via Entropy-Regularized Optimal Transport [pdf] [code] [annotated pdf]
- Adrien Corenflos, James Thornton, George Deligiannidis, Arnaud Doucet
2021-02-15, ICML 2021
- PixelTransformer: Sample Conditioned Signal Generation [pdf] [code] [annotated pdf]
- Shubham Tulsiani, Abhinav Gupta
2021-03-29, ICML 2021
- [Neural Processes] [Transformers]
- Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling [pdf] [code] [annotated pdf]
- Gregory W. Benton, Wesley J. Maddox, Sanae Lotfi, Andrew Gordon Wilson
2021-02-25, ICML 2021
- [Uncertainty Estimation] [Ensembling]
- Scaling Up Visual and Vision-Language Representation Learning With Noisy Text Supervision [pdf] [pdf with comments]
- Chao Jia, Yinfei Yang, Ye Xia, Yi-Ting Chen, Zarana Parekh, Hieu Pham, Quoc V. Le, Yunhsuan Sung, Zhen Li, Tom Duerig
2021-02-11, ICML 2021
- Learning to Simulate Complex Physics with Graph Networks [pdf] [code] [annotated pdf]
- Alvaro Sanchez-Gonzalez, Jonathan Godwin, Tobias Pfaff, Rex Ying, Jure Leskovec, Peter W. Battaglia
2020-02-21, ICML 2020
Quite well-written and somewhat interesting paper. Cool application and a bunch of neat videos. This is not really my area, so I didn't spend too much time/energy trying to fully understand everything.
- Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention [pdf] [pdf with comments]
- Angelos Katharopoulos, Apoorv Vyas, Nikolaos Pappas, François Fleuret
2020-06-29, ICML 2020
- [Transformers]
- Uncertainty Estimation Using a Single Deep Deterministic Neural Network [pdf] [code] [pdf with comments] [comments]
- Joost van Amersfoort, Lewis Smith, Yee Whye Teh, Yarin Gal
2020-03-04, ICML 2020
- [Uncertainty Estimation]
- Efficient and Scalable Bayesian Neural Nets with Rank-1 Factors [pdf] [code] [pdf with comments] [comments]
- Michael W. Dusenberry, Ghassen Jerfel, Yeming Wen, Yi-an Ma, Jasper Snoek, Katherine Heller, Balaji Lakshminarayanan, Dustin Tran
2020-05-14, ICML 2020
- [Uncertainty Estimation] [Variational Inference]
- A Contrastive Divergence for Combining Variational Inference and MCMC [pdf] [code] [slides] [pdf with comments] [comments]
- Francisco J. R. Ruiz, Michalis K. Titsias
2019-05-10, ICML 2019
- [VAEs]
- Learning Latent Dynamics for Planning from Pixels [pdf] [code] [blog] [pdf with comments] [comments]
- Danijar Hafner, Timothy Lillicrap, Ian Fischer, Ruben Villegas, David Ha, Honglak Lee, James Davidson
2018-11-12, ICML2019
- Neural Relational Inference for Interacting Systems [pdf] [code] [pdf with comments]
- Thomas Kipf, Ethan Fetaya, Kuan-Chieh Wang, Max Welling, Richard Zemel
2018-02-13, ICML 2018
- Decomposition of Uncertainty in Bayesian Deep Learning for Efficient and Risk-sensitive Learning [pdf] [pdf with comments] [comments]
- Stefan Depeweg, José Miguel Hernández-Lobato, Finale Doshi-Velez, Steffen Udluft
2017-10-19, ICML 2018
- [Uncertainty Estimation] [Reinforcement Learning]
- Noisy Natural Gradient as Variational Inference [pdf] [video] [code] [pdf with comments] [comments]
- Guodong Zhang, Shengyang Sun, David Duvenaud, Roger Grosse
2017-12-06, ICML2018
- Interpretability Beyond Feature Attribution: Quantitative Testing with Concept Activation Vectors (TCAV) [pdf] [pdf with comments] [summary]
- Been Kim, Martin Wattenberg, Justin Gilmer, Carrie Cai, James Wexler, Fernanda Viegas, Rory Sayres
2018-06-07, ICML2018
- Reliable Uncertainty Estimates in Deep Neural Networks using Noise Contrastive Priors [pdf] [pdf with comments] [summary]
- Danijar Hafner, Dustin Tran, Alex Irpan, Timothy Lillicrap, James Davidson
2018-07-24, ICML2018 Workshop
- Neural Processes [pdf] [pdf with comments] [summary]
- Marta Garnelo, Jonathan Schwarz, Dan Rosenbaum, Fabio Viola, Danilo J. Rezende, S.M. Ali Eslami, Yee Whye Teh
2018-07-04, ICML2018 Workshop
- Conditional Neural Processes [pdf] [pdf with comments] [summary]
- Marta Garnelo, Dan Rosenbaum, Chris J. Maddison, Tiago Ramalho, David Saxton, Murray Shanahan, Yee Whye Teh, Danilo J. Rezende, S. M. Ali Eslami
2018-07-04, ICML2018
- Neural Autoregressive Flows [pdf] [pdf with comments] [summary]
- Chin-Wei Huang, David Krueger, Alexandre Lacoste, Aaron Courville
2018-04-03, ICML2018
- On Calibration of Modern Neural Networks [pdf] [code] [pdf with comments] [summary]
- Chuan Guo, Geoff Pleiss, Yu Sun, Kilian Q. Weinberger
2017-08-03, ICML2017
- Variational Inference with Normalizing Flows [pdf] [pdf with comments] [comments]
- Danilo Jimenez Rezende, Shakir Mohamed
2015-05-21, ICML2015
- Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks [pdf] [pdf with comments] [comments]
- José Miguel Hernández-Lobato, Ryan P. Adams
2015-07-15, ICML2015
- Weight Uncertainty in Neural Networks [pdf] [pdf with comments] [comments]
- Charles Blundell, Julien Cornebise, Koray Kavukcuoglu, Daan Wierstra
2015-05-20, ICML2015
- Stochastic Gradient Hamiltonian Monte Carlo [pdf] [pdf with comments] [summary (TODO!)]
- Tianqi Chen, Emily B. Fox, Carlos Guestrin
2014-05-12, ICML2014
- Bayesian Learning via Stochastic Gradient Langevin Dynamics [pdf] [pdf with comments] [summary (TODO!)]
- Max Welling, Yee Whye Teh
ICML2011
- Transformers Can Do Bayesian Inference [pdf] [code] [annotated pdf]
- Samuel Müller, Noah Hollmann, Sebastian Pineda Arango, Josif Grabocka, Frank Hutter
2021-12-20, ICLR 2022
- [Transformers]
Quite interesting and well-written paper. I did however find it difficult to properly understand everything, it feels like a lot of details are omitted (I wouldn't really know how to actually implement this in practice). It's difficult for me to judge how impressive the results are or how practically useful this approach actually might be, what limitations are there? Overall though, it does indeed seem quite interesting.
- Pessimistic Bootstrapping for Uncertainty-Driven Offline Reinforcement Learning [pdf] [annotated pdf]
- Chenjia Bai, Lingxiao Wang, Zhuoran Yang, Zhi-Hong Deng, Animesh Garg, Peng Liu, Zhaoran Wang
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Reinforcement Learning]
Well-written and somewhat interesting paper. I'm not overly familiar with RL, which makes it a bit difficult for me to properly evaluate the paper's contributions. They use standard ensembles for uncertainty estimation combined with an OOD sampling regularization. I thought that the OOD sampling could be interesting, but it seems very specific to RL. I'm sure this paper is quite interesting for people doing RL, but I don't think it's overly useful for me.
- On the Pitfalls of Heteroscedastic Uncertainty Estimation with Probabilistic Neural Networks [pdf] [code] [annotated pdf]
- Maximilian Seitzer, Arash Tavakoli, Dimitrije Antic, Georg Martius
2021-09-29, ICLR 2022
- [Uncertainty Estimation]
Quite interesting and very well-written paper, I enjoyed reading it. Their analysis of fitting Gaussian regression models via the NLL is quite interesting, I didn't really expect to learn something new about this. I've seen Gaussian models outperform standard regression (L2 loss) w.r.t. accuracy in some applications/datasets, and the other way around in others. In the first case, I've then attributed the success of the Gaussian model to the "learned loss attenuation". The analysis in this paper could perhaps explain why you get this performance boost only in certain applications. Their beta-NLL loss could probably be quite useful; it seems like a convenient tool to have.
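Since I expect to use it, a minimal PyTorch sketch of the Gaussian NLL and the beta-NLL variant as I understand it (the per-sample NLL is weighted by a detached copy of the predicted variance raised to beta); this is my reading, not the authors' implementation.
```python
import torch

def gaussian_nll(mean, var, y):
    # Per-sample negative log-likelihood of a Gaussian with predicted mean and variance.
    return 0.5 * (torch.log(var) + (y - mean) ** 2 / var)

def beta_nll(mean, var, y, beta=0.5):
    # beta-NLL: weight each sample's NLL by var^beta, with the weight detached so it
    # only rescales the gradients (beta = 0 recovers the plain NLL).
    return (var.detach() ** beta * gaussian_nll(mean, var, y)).mean()

# Hypothetical usage with a network that predicts (mean, log_var):
# mean, log_var = net(x)
# loss = beta_nll(mean, log_var.exp(), y, beta=0.5)
```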
- Sample Efficient Deep Reinforcement Learning via Uncertainty Estimation [pdf] [annotated pdf]
- Vincent Mai, Kaustubh Mani, Liam Paull
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Reinforcement Learning]
Well-written and somewhat interesting paper. I'm not overly familiar with reinforcement learning, which makes it a bit difficult for me to properly evaluate the paper's contributions, but to me it seems like fairly straightforward method modifications? To use ensembles of Gaussian models (instead of ensembles of models trained using the L2 loss) makes sense. The BIV method I didn't quite get, it seems rather ad hoc? I also don't quite get exactly how it's used in equation (10), is the ensemble of Gaussian models trained _jointly_ using this loss? I don't really know if this could be useful outside of RL.
- Natural Posterior Network: Deep Bayesian Predictive Uncertainty for Exponential Family Distributions [pdf] [annotated pdf]
- Bertrand Charpentier, Oliver Borchert, Daniel Zügner, Simon Geisler, Stephan Günnemann
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Interesting and well-written paper. I didn't quite understand all the details, I'll have to read a couple of related/background papers to be able to properly appreciate and evaluate the proposed method. I definitely feel like I would like to read up on this family of methods. Extensive experimental evaluation, and the results seem promising overall.
- VOS: Learning What You Don't Know by Virtual Outlier Synthesis [pdf] [code] [annotated pdf]
- Xuefeng Du, Zhaoning Wang, Mu Cai, Yixuan Li
2022-02-02, ICLR 2022
- [Out-of-Distribution Detection]
Interesting and quite well-written paper. I did find it somewhat difficult to understand certain parts though, they could perhaps be explained more clearly. The results seem quite impressive (they do consistently outperform all baselines), but I find it interesting that the "Gaussian noise" baseline in Table 2 performs that well? I should probably have read "Energy-based Out-of-distribution Detection" before reading this paper.
- Efficiently Modeling Long Sequences with Structured State Spaces [pdf] [code] [annotated pdf]
- Albert Gu, Karan Goel, Christopher Ré
2021-10-31, ICLR 2022
- [Sequence Modeling]
Very interesting and quite well-written paper. Kind of neat/fun to see state-space models being used. The experimental results seem very impressive!? I didn't fully understand everything in Section 3. I had to read Section 3.4 a couple of times to understand how the parameterization actually works in practice (you have H state-space models, one for each feature dimension, so that you can map a sequence of feature vectors to another sequence of feature vectors) (and you can then also have multiple such layers of state-space models, mapping sequence --> sequence --> sequence --> ....).
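Since the per-channel structure took me a while to parse, here is a toy PyTorch sketch of just that layer structure (H independent linear state-space models, naively unrolled); this is not the actual S4 parameterization (no HiPPO initialization, no convolution-kernel computation), just an illustration of how a sequence of feature vectors maps to another sequence.
```python
import torch

class ToySSMLayer(torch.nn.Module):
    # H independent linear state-space models (one per feature dimension), naively unrolled.
    # Only illustrates the sequence-to-sequence layer structure; the real S4 parameterization
    # (HiPPO initialization, convolution-kernel computation) is not reproduced here.
    def __init__(self, H, N=16):
        super().__init__()
        self.A = torch.nn.Parameter(0.01 * torch.randn(H, N, N))
        self.B = torch.nn.Parameter(torch.randn(H, N))
        self.C = torch.nn.Parameter(torch.randn(H, N))

    def forward(self, u):                                  # u: (batch, length, H)
        batch, length, H = u.shape
        x = u.new_zeros(batch, H, self.B.shape[1])         # hidden state per channel
        ys = []
        for t in range(length):
            # x_t = A x_{t-1} + B u_t and y_t = C x_t, independently for each channel h
            x = torch.einsum('hnm,bhm->bhn', self.A, x) + self.B * u[:, t, :, None]
            ys.append(torch.einsum('hn,bhn->bh', self.C, x))
        return torch.stack(ys, dim=1)                      # (batch, length, H)

# Such layers can then be stacked: sequence -> sequence -> sequence -> ...
```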
- Learning Mesh-Based Simulation with Graph Networks [pdf] [code] [annotated pdf]
- Tobias Pfaff, Meire Fortunato, Alvaro Sanchez-Gonzalez, Peter W. Battaglia
2020-10-07, ICLR 2021
- Gradient Descent on Neural Networks Typically Occurs at the Edge of Stability [pdf] [pdf with comments]
- Jeremy M. Cohen, Simran Kaur, Yuanzhi Li, J. Zico Kolter, Ameet Talwalkar
2021-02-26, ICLR 2021
- [Theoretical Properties of Deep Learning]
- On the Origin of Implicit Regularization in Stochastic Gradient Descent [pdf] [pdf with comments]
- Samuel L. Smith, Benoit Dherin, David G. T. Barrett, Soham De
2021-01-28, ICLR 2021
- [Theoretical Properties of Deep Learning]
- No MCMC for Me: Amortized Sampling for Fast and Stable Training of Energy-Based Models [pdf] [code] [pdf with comments]
- Will Grathwohl, Jacob Kelly, Milad Hashemi, Mohammad Norouzi, Kevin Swersky, David Duvenaud
2020-10-08, ICLR 2021
- [Energy-Based Models]
- Getting a CLUE: A Method for Explaining Uncertainty Estimates [pdf] [pdf with comments]
- Javier Antorán, Umang Bhatt, Tameem Adel, Adrian Weller, José Miguel Hernández-Lobato
2020-06-11, ICLR 2021
- [Uncertainty Estimation]
- Score-Based Generative Modeling through Stochastic Differential Equations [pdf] [code] [pdf with comments]
- Yang Song, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole
2020-11-26, ICLR 2021
- [Neural ODEs]
- Rethinking Attention with Performers [pdf] [pdf with comments]
- Krzysztof Choromanski, Valerii Likhosherstov, David Dohan, Xingyou Song, Andreea Gane, Tamas Sarlos, Peter Hawkins, Jared Davis, Afroz Mohiuddin, Lukasz Kaiser, David Belanger, Lucy Colwell, Adrian Weller
2020-10-30, ICLR 2021
- [Transformers]
- Very Deep VAEs Generalize Autoregressive Models and Can Outperform Them on Images [pdf] [code] [pdf with comments]
- Rewon Child
2020-11-20, ICLR 2021
- [VAEs]
- VAEBM: A Symbiosis between Variational Autoencoders and Energy-based Models [pdf] [pdf with comments]
- Zhisheng Xiao, Karsten Kreis, Jan Kautz, Arash Vahdat
2020-10-01, ICLR 2021
- [Energy-Based Models] [VAEs]
- BatchEnsemble: An Alternative Approach to Efficient Ensemble and Lifelong Learning [pdf] [code] [video] [pdf with comments] [comments]
- Yeming Wen, Dustin Tran, Jimmy Ba
2020-02-17, ICLR 2020
- [Uncertainty Estimation] [Ensembling]
- Pitfalls of In-Domain Uncertainty Estimation and Ensembling in Deep Learning [pdf] [code] [pdf with comments] [comments]
- Arsenii Ashukha, Alexander Lyzhov, Dmitry Molchanov, Dmitry Vetrov
2020-02-15, ICLR 2020
- [Uncertainty Estimation] [Ensembling] [Stochastic Gradient MCMC]
- Conservative Uncertainty Estimation By Fitting Prior Networks [pdf] [pdf with comments] [comments]
- Kamil Ciosek, Vincent Fortuin, Ryota Tomioka, Katja Hofmann, Richard Turner
2019-10-25, ICLR 2020
- [Uncertainty Estimation]
- Convolutional Conditional Neural Processes [pdf] [code] [pdf with comments] [comments]
- Jonathan Gordon, Wessel P. Bruinsma, Andrew Y. K. Foong, James Requeima, Yann Dubois, Richard E. Turner
2019-10-29, ICLR 2020
- [Neural Processes]
- Multiplicative Interactions and Where to Find Them [pdf] [pdf with comments] [comments]
- Siddhant M. Jayakumar, Jacob Menick, Wojciech M. Czarnecki, Jonathan Schwarz, Jack Rae, Simon Osindero, Yee Whye Teh, Tim Harley, Razvan Pascanu
2019-09-25, ICLR 2020
- [Theoretical Properties of Deep Learning] [Sequence Modeling]
- Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One [pdf] [pdf with comments] [comments]
- Will Grathwohl, Kuan-Chieh Wang, Jörn-Henrik Jacobsen, David Duvenaud, Mohammad Norouzi, Kevin Swersky
2019-12-06, ICLR 2020
- [Energy-Based Models]
- Trellis Networks for Sequence Modeling [pdf] [code] [pdf with comments] [comments]
- Shaojie Bai, J. Zico Kolter, Vladlen Koltun
2018-10-15, ICLR2019
- Generating High Fidelity Images with Subscale Pixel Networks and Multidimensional Upscaling [pdf] [pdf with comments] [comments]
- Jacob Menick, Nal Kalchbrenner
2018-12-04, ICLR2019
- Meta-Learning For Stochastic Gradient MCMC [pdf] [code] [slides] [pdf with comments] [summary (TODO!)]
- Wenbo Gong, Yingzhen Li, José Miguel Hernández-Lobato
2018-10-28, ICLR2019
- When Recurrent Models Don't Need To Be Recurrent (a.k.a. Stable Recurrent Models) [pdf] [pdf with comments] [summary]
- John Miller, Moritz Hardt
2018-05-29, ICLR2019
- The Lottery Ticket Hypothesis: Finding Small, Trainable Neural Networks [pdf] [pdf with comments] [summary]
- Jonathan Frankle, Michael Carbin
2018-03-09, ICLR2019
- Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes [pdf] [pdf with comments] [summary]
- Roman Novak, Lechao Xiao, Jaehoon Lee, Yasaman Bahri, Daniel A. Abolafia, Jeffrey Pennington, Jascha Sohl-Dickstein
2018-10-11, ICLR2019
- Enhancing The Reliability of Out-of-distribution Image Detection in Neural Networks [pdf] [code] [annotated pdf]
- Shiyu Liang, Yixuan Li, R. Srikant
2017-06-08, ICLR 2018
- [Out-of-Distribution Detection]
Quite interesting and well-written paper. Two simple modifications of the "maximum softmax score" baseline, and the performance is consistently improved. The input perturbation method is quite interesting. Intuitively, it's not entirely clear to me why it actually works.
- Gaussian Process Behaviour in Wide Deep Neural Networks [pdf] [pdf with comments] [summary]
- Alexander G. de G. Matthews, Mark Rowland, Jiri Hron, Richard E. Turner, Zoubin Ghahramani
2018-08-16, ICLR2018
- A recurrent neural network without chaos [pdf] [pdf with comments] [comments]
- Thomas Laurent, James von Brecht
2016-12-19, ICLR2017
- Auto-Encoding Variational Bayes [pdf] [pdf with comments] [comments]
- Diederik P Kingma, Max Welling
2014-05-01, ICLR2014
- Probabilistic 3D Human Shape and Pose Estimation from Multiple Unconstrained Images in the Wild [pdf] [annotated pdf]
- Akash Sengupta, Ignas Budvytis, Roberto Cipolla
2021-03-19, CVPR 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. I read it mainly as background for "Hierarchical Kinematic Probability Distributions for 3D Human Shape and Pose Estimation from Images in the Wild" which is written by exactly the same authors. In this paper, they predict a single Gaussian distribution for the pose (instead of hierarchical matrix-Fisher distributions). Also, they mainly focus on the body shape. They also use silhouettes + 2D keypoint heatmaps as input (instead of edge-filters + 2D keypoint heatmaps).
- SMD-Nets: Stereo Mixture Density Networks [pdf] [code] [annotated pdf]
- Fabio Tosi, Yiyi Liao, Carolin Schmitt, Andreas Geiger
2021-04-08, CVPR 2021
- [Uncertainty Estimation]
Well-written and interesting paper. Quite easy to read and follow, the method is clearly explained and makes intuitive sense.
- We are More than Our Joints: Predicting how 3D Bodies Move [pdf] [code] [annotated pdf]
- Yan Zhang, Michael J. Black, Siyu Tang
2020-12-01, CVPR 2021
- [3D Human Pose Estimation]
Well-written and fairly interesting paper. The marker-based representation, instead of using skeleton joints, makes sense. The recursive projection scheme also makes sense, but seems very slow (2.27 sec/frame)? I didn't quite get all the details for their DCT representation of the latent space.
- DI-Fusion: Online Implicit 3D Reconstruction with Deep Priors [pdf] [code] [annotated pdf]
- Jiahui Huang, Shi-Sheng Huang, Haoxuan Song, Shi-Min Hu
2020-12-10, CVPR 2021
- [Implicit Neural Representations]
Well-written and interesting paper, I enjoyed reading it. Neat application of implicit representations. The paper also gives a quite good overview of online 3D reconstruction in general.
- Beyond Static Features for Temporally Consistent 3D Human Pose and Shape from a Video [pdf] [code] [annotated pdf]
- Hongsuk Choi, Gyeongsik Moon, Ju Yong Chang, Kyoung Mu Lee
2020-11-17, CVPR 2021
- [3D Human Pose Estimation]
- Meta Pseudo Labels [pdf] [code] [pdf with comments]
- Hieu Pham, Zihang Dai, Qizhe Xie, Minh-Thang Luong, Quoc V. Le
2020-03-23, CVPR 2021
- Local Implicit Grid Representations for 3D Scenes [pdf] [code] [annotated pdf]
- Chiyu Max Jiang, Avneesh Sud, Ameesh Makadia, Jingwei Huang, Matthias Nießner, Thomas Funkhouser
2020-03-19, CVPR 2020
- [Implicit Neural Representations]
Well-written and quite interesting paper. Interesting application, being able to reconstruct full 3D scenes from sparse point clouds. I didn't fully understand everything, as I don't have a particularly strong graphics background.
- Joint Training of Variational Auto-Encoder and Latent Energy-Based Model [pdf] [code] [pdf with comments] [comments]
- Tian Han, Erik Nijkamp, Linqi Zhou, Bo Pang, Song-Chun Zhu, Ying Nian Wu
2020-06-10, CVPR 2020
- [VAEs] [Energy-Based Models]
- Flow Contrastive Estimation of Energy-Based Models [pdf] [pdf with comments] [comments]
- Ruiqi Gao, Erik Nijkamp, Diederik P. Kingma, Zhen Xu, Andrew M. Dai, Ying Nian Wu
2019-12-02, CVPR 2020
- [Energy-Based Models] [Normalizing Flows]
- DeepSDF: Learning Continuous Signed Distance Functions for Shape Representation [pdf] [code] [annotated pdf]
- Jeong Joon Park, Peter Florence, Julian Straub, Richard Newcombe, Steven Lovegrove
2019-01-16, CVPR 2019
- [Implicit Neural Representations]
- Generating Multiple Hypotheses for 3D Human Pose Estimation with Mixture Density Network [pdf] [code] [annotated pdf]
- Chen Li, Gim Hee Lee
2019-04-11, CVPR 2019
- [3D Human Pose Estimation]
- Expressive Body Capture: 3D Hands, Face, and Body from a Single Image [pdf] [code] [annotated pdf]
- Georgios Pavlakos, Vasileios Choutas, Nima Ghorbani, Timo Bolkart, Ahmed A. A. Osman, Dimitrios Tzionas, Michael J. Black
2019-04-11, CVPR 2019
- [3D Human Pose Estimation]
Very well-written and quite interesting paper. Gives a good understanding of the SMPL model and the SMPLify method.
- PointRCNN: 3D Object Proposal Generation and Detection from Point Cloud [pdf] [code] [pdf with comments] [comments]
- Shaoshuai Shi, Xiaogang Wang, Hongsheng Li
2018-12-11, CVPR2019
- ATOM: Accurate Tracking by Overlap Maximization [pdf] [code] [pdf with comments] [comments]
- Martin Danelljan, Goutam Bhat, Fahad Shahbaz Khan, Michael Felsberg
2018-11-19, CVPR2019
- LaserNet: An Efficient Probabilistic 3D Object Detector for Autonomous Driving [pdf] [pdf with comments] [comments]
- Gregory P. Meyer, Ankit Laddha, Eric Kee, Carlos Vallespi-Gonzalez, Carl K. Wellington
2019-03-20, CVPR2019
- End-to-end Recovery of Human Shape and Pose [pdf] [code] [annotated pdf]
- Angjoo Kanazawa, Michael J. Black, David W. Jacobs, Jitendra Malik
2017-12-18, CVPR 2018
- [3D Human Pose Estimation]
- VoxelNet: End-to-End Learning for Point Cloud Based 3D Object Detection [pdf] [pdf with comments] [summary]
- Yin Zhou, Oncel Tuzel
2017-11-17, CVPR2018
- PIXOR: Real-time 3D Object Detection from Point Clouds [pdf] [pdf with comments] [summary]
- Bin Yang, Wenjie Luo, Raquel Urtasun
CVPR2018
- Lightweight Probabilistic Deep Networks [pdf] [pdf with comments] [summary]
- Jochen Gast, Stefan Roth
2018-05-29, CVPR2018
- Learning Weight Uncertainty with Stochastic Gradient MCMC for Shape Classification [pdf] [poster] [pdf with comments] [comments]
- Chunyuan Li, Andrew Stevens, Changyou Chen, Yunchen Pu, Zhe Gan, Lawrence Carin
CVPR2016
- NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis [pdf] [code] [annotated pdf]
- Ben Mildenhall, Pratul P. Srinivasan, Matthew Tancik, Jonathan T. Barron, Ravi Ramamoorthi, Ren Ng
2020-03-19, ECCV 2020
- [Implicit Neural Representations]
Extremely well-written and interesting paper. I really enjoyed reading it, and I would recommend anyone interested in computer vision to read it as well.
All parts of the proposed method are clearly explained and relatively easy to understand, including the volume rendering techniques which I was unfamiliar with.
- End-to-End Object Detection with Transformers [pdf] [code] [pdf with comments] [comments]
- Nicolas Carion, Francisco Massa, Gabriel Synnaeve, Nicolas Usunier, Alexander Kirillov, Sergey Zagoruyko
2020-05-26, ECCV 2020
- [Object Detection]
- Acquisition of Localization Confidence for Accurate Object Detection [pdf] [code] [oral presentation] [pdf with comments] [comments]
- Borui Jiang, Ruixuan Luo, Jiayuan Mao, Tete Xiao, Yuning Jiang
2018-07-30, ECCV2018
- Uncertainty Estimates and Multi-Hypotheses Networks for Optical Flow [pdf] [pdf with comments] [summary]
- Eddy Ilg, Özgün Çiçek, Silvio Galesso, Aaron Klein, Osama Makansi, Frank Hutter, Thomas Brox
2018-08-06, ECCV2018
- Keep it SMPL: Automatic Estimation of 3D Human Pose and Shape from a Single Image [pdf] [annotated pdf]
- Federica Bogo, Angjoo Kanazawa, Christoph Lassner, Peter Gehler, Javier Romero, Michael J. Black
2016-07-27, ECCV 2016
- [3D Human Pose Estimation]
- Learning Motion Priors for 4D Human Body Capture in 3D Scenes [pdf] [code] [annotated pdf]
- Siwei Zhang, Yan Zhang, Federica Bogo, Marc Pollefeys, Siyu Tang
2021-08-23, ICCV 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. I didn't fully understand everything though, and it feels like I probably don't know this specific setting/problem well enough to fully appreciate the paper.
- Hierarchical Kinematic Probability Distributions for 3D Human Shape and Pose Estimation from Images in the Wild [pdf] [code] [annotated pdf]
- Akash Sengupta, Ignas Budvytis, Roberto Cipolla
2021-10-03, ICCV 2021
- [3D Human Pose Estimation]
Well-written and very interesting paper, I enjoyed reading it. The hierarchical distribution prediction approach makes sense and consistently outperforms the independent baseline. Using matrix-Fisher distributions makes sense. The synthetic training framework and the input representation of edge-filters + 2D keypoint heatmaps are both interesting.
- imGHUM: Implicit Generative Models of 3D Human Shape and Articulated Pose [pdf] [code] [annotated pdf]
- Thiemo Alldieck, Hongyi Xu, Cristian Sminchisescu
2021-08-24, ICCV 2021
- [3D Human Pose Estimation] [Implicit Neural Representations]
Interesting and very well-written paper, I really enjoyed reading it. Interesting combination of implicit representations and 3D human modelling. The "inclusive human modelling" application is neat and important.
- Contextually Plausible and Diverse 3D Human Motion Prediction [pdf] [annotated pdf]
- Sadegh Aliakbarian, Fatemeh Sadat Saleh, Lars Petersson, Stephen Gould, Mathieu Salzmann
2019-12-18, ICCV 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. The main idea, using a learned conditional prior p(z|c) instead of just p(z), makes sense and was shown beneficial also in "HuMoR: 3D Human Motion Model for Robust Pose Estimation". I'm however somewhat confused by their specific implementation in Section 4, doesn't seem like a standard cVAE implementation?
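For my own reference, a minimal sketch of what I mean by the learned conditional prior: in a standard cVAE the KL term uses a fixed N(0, I) prior, while here a small network (the hypothetical prior_net below) outputs the parameters of p(z|c).
```python
import torch

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    # KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal Gaussians, summed over latent dims.
    return 0.5 * (logvar_p - logvar_q
                  + (logvar_q.exp() + (mu_q - mu_p) ** 2) / logvar_p.exp()
                  - 1.0).sum(dim=-1)

# Standard cVAE: fixed prior N(0, I), i.e. mu_p = 0, logvar_p = 0.
# Learned conditional prior: a (hypothetical) prior_net maps the condition c to the prior parameters:
#   mu_p, logvar_p = prior_net(c)
#   kl = kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p)
```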
- Encoder-decoder with Multi-level Attention for 3D Human Shape and Pose Estimation [pdf] [code] [annotated pdf]
- Ziniu Wan, Zhengjia Li, Maoqing Tian, Jianbo Liu, Shuai Yi, Hongsheng Li
2021-09-06, ICCV 2021
- [3D Human Pose Estimation]
Well-written and fairly interesting paper. Quite a lot of details on the attention architecture, which I personally don't find overly interesting. The experimental results are quite impressive, but I would like to see a comparison in terms of computational cost at test-time. It sounds like their method is rather slow.
- Physics-based Human Motion Estimation and Synthesis from Videos [pdf] [annotated pdf]
- Kevin Xie, Tingwu Wang, Umar Iqbal, Yunrong Guo, Sanja Fidler, Florian Shkurti
2021-09-21, ICCV 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. The general idea, refining frame-by-frame pose estimates via physical constraints, intuitively makes a lot of sense. I did however find it quite difficult to understand all the details in Section 3.
- Human Pose Regression with Residual Log-likelihood Estimation [pdf] [code] [annotated pdf]
- Jiefeng Li, Siyuan Bian, Ailing Zeng, Can Wang, Bo Pang, Wentao Liu, Cewu Lu
2021-07-23, ICCV 2021
- [3D Human Pose Estimation]
Quite interesting paper, but also quite strange/confusing. I don't think the proposed method is explained particularly well, at least I found it quite difficult to properly understand what they actually are doing.
In the end it seems like they are learning a global loss function that is very similar to doing probabilistic regression with a Gauss/Laplace model of p(y|x) (with learned mean and variance)? See Figure 4 in the Appendix.
And while it's true that their performance is much better than for direct regression with an L2/L1 loss (see e.g. Table 1), they only compare with Gauss/Laplace probabilistic regression once (Table 7) and in that case the Laplace model is actually quite competitive?
- Estimating Egocentric 3D Human Pose in Global Space [pdf] [annotated pdf]
- Jian Wang, Lingjie Liu, Weipeng Xu, Kripasindhu Sarkar, Christian Theobalt
2021-04-27, ICCV 2021
- [3D Human Pose Estimation]
- HuMoR: 3D Human Motion Model for Robust Pose Estimation [pdf] [code] [annotated pdf]
- Davis Rempe, Tolga Birdal, Aaron Hertzmann, Jimei Yang, Srinath Sridhar, Leonidas J. Guibas
2021-05-10, ICCV 2021
- [3D Human Pose Estimation]
- Learning to Reconstruct 3D Human Pose and Shape via Model-fitting in the Loop [pdf] [code] [annotated pdf]
- Nikos Kolotouros, Georgios Pavlakos, Michael J. Black, Kostas Daniilidis
2019-09-27, ICCV 2019
- [3D Human Pose Estimation]
- A simple yet effective baseline for 3d human pose estimation [pdf] [code] [annotated pdf]
- Julieta Martinez, Rayat Hossain, Javier Romero, James J. Little
2017-05-08, ICCV 2017
- [3D Human Pose Estimation]
- Synthetic Training for Accurate 3D Human Pose and Shape Estimation in the Wild [pdf] [code] [annotated pdf]
- Akash Sengupta, Ignas Budvytis, Roberto Cipolla
2020-09-21, BMVC 2020
- [3D Human Pose Estimation]
Well-written and fairly interesting paper. I read it mainly as background for "Hierarchical Kinematic Probability Distributions for 3D Human Shape and Pose Estimation from Images in the Wild" which is written by exactly the same authors. In this paper, they just use direct regression. They also use silhouettes + 2D keypoint heatmaps as input (instead of edge-filters + 2D keypoint heatmaps).
- Being a Bit Frequentist Improves Bayesian Neural Networks [pdf] [code] [annotated pdf]
- Agustinus Kristiadi, Matthias Hein, Philipp Hennig
2021-06-18, AISTATS 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Interesting and well-written paper. The proposed method makes intuitive sense, trying to incorporate the "OOD training" method (i.e., to use some kind of OOD data during training, similar to e.g. the "Deep Anomaly Detection with Outlier Exposure" paper) into the Bayesian deep learning approach. The experimental results do seem quite promising.
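A small PyTorch sketch of the plain "OOD training" / Outlier Exposure-style loss that the comment refers to (not the Bayesian treatment proposed in this paper): standard cross-entropy on in-distribution data plus a term pushing the predictions on an OOD batch towards uniform. The weight lam is a hypothetical hyperparameter.
```python
import torch
import torch.nn.functional as F

def ood_training_loss(logits_in, labels_in, logits_out, lam=0.5):
    # Standard classification loss on the in-distribution batch plus an auxiliary term
    # that pushes the predictive distribution on the OOD batch towards uniform
    # (cross-entropy against the uniform distribution).
    ce_in = F.cross_entropy(logits_in, labels_in)
    ce_out_to_uniform = -F.log_softmax(logits_out, dim=-1).mean(dim=-1).mean()
    return ce_in + lam * ce_out_to_uniform
```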
- Evaluating model calibration in classification [pdf] [code] [pdf with comments] [comments]
- Juozas Vaicenavicius, David Widmann, Carl Andersson, Fredrik Lindsten, Jacob Roll, Thomas B. Schön
2019-02-19, AISTATS 2019
- Noise-contrastive estimation: A new estimation principle for unnormalized statistical models [pdf] [pdf with comments] [comments]
- Michael Gutmann, Aapo Hyvärinen
2009, AISTATS 2010
- [Energy-Based Models]
- Out of Distribution Data Detection Using Dropout Bayesian Neural Networks [pdf] [annotated pdf]
- Andre T. Nguyen, Fred Lu, Gary Lopez Munoz, Edward Raff, Charles Nicholas, James Holt
2022-02-18, AAAI 2022
- [Out-of-Distribution Detection]
Quite interesting and well-written paper. It seemed quite niche at first, but I think their analysis could potentially be useful.
- On the Anatomy of MCMC-Based Maximum Likelihood Learning of Energy-Based Models [pdf] [code] [pdf with comments] [comments]
- Erik Nijkamp, Mitch Hill, Tian Han, Song-Chun Zhu, Ying Nian Wu
2019-04-29, AAAI 2020
- [Energy-Based Models]
- Can You Trust Predictive Uncertainty Under Real Dataset Shifts in Digital Pathology? [pdf] [annotated pdf]
- Jeppe Thagaard, Søren Hauberg, Bert van der Vegt, Thomas Ebstrup, Johan D. Hansen, Anders B. Dahl
2020-09, MICCAI 2020
- [Uncertainty Estimation] [Out-of-Distribution Detection] [Medical ML]
Quite interesting and well-written paper. They compare MC-dropout, ensembling and mixup (with a standard softmax classifier as the baseline). Nothing groundbreaking, but the studied application (classification of pathology slides for cancer) is very interesting. The FPR95 metrics for OOD detection in Table 4 are terrible for ensembling, but the classification accuracy (89.7) is also pretty much the same as for D_test_int in Table 3 (90.1)? So, it doesn't really matter that the model isn't capable of distinguishing this "OOD" data from in-distribution?
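For reference, a sketch of how I would compute the FPR95 numbers discussed above (false-positive rate on OOD data at the threshold where 95% of the in-distribution data is kept); papers differ in the direction of the score, so this assumes higher = more in-distribution.
```python
import numpy as np

def fpr_at_95_tpr(scores_in, scores_out):
    # Threshold chosen so that 95% of the in-distribution scores lie above it (TPR = 0.95);
    # the returned value is the fraction of OOD scores that also end up above it (FPR).
    threshold = np.quantile(scores_in, 0.05)
    return float((scores_out >= threshold).mean())
```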
- Learning nonlinear state-space models using deep autoencoders [pdf] [pdf with comments] [comments]
- Daniele Masti, Alberto Bemporad
2018, CDC2018
- Estimation of Non-Normalized Statistical Models by Score Matching [pdf] [pdf with comments] [comments]
- Aapo Hyvärinen
2004-11, JMLR 6
- [Energy-Based Models]
- A Deep Bayesian Neural Network for Cardiac Arrhythmia Classification with Rejection from ECG Recordings [pdf] [code] [annotated pdf]
- Wenrui Zhang, Xinxin Di, Guodong Wei, Shijia Geng, Zhaoji Fu, Shenda Hong
2022-02-26
- [Uncertainty Estimation] [Medical ML]
Somewhat interesting paper. They use a softmax model with MC-dropout to compute uncertainty estimates. The evaluation is not very extensive; they mostly just check that the classification accuracy improves as they reject more and more samples based on an uncertainty threshold.
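A minimal numpy sketch of the kind of rejection evaluation described above: rank test samples by uncertainty, reject an increasing fraction of the most uncertain ones, and check that accuracy on the retained samples improves. Array names are placeholders.
```python
import numpy as np

def accuracy_vs_rejection(uncertainty, correct, reject_fractions=(0.0, 0.1, 0.2, 0.3, 0.5)):
    # uncertainty: per-sample uncertainty scores, correct: boolean array of correct predictions.
    order = np.argsort(uncertainty)  # most certain samples first
    curve = []
    for frac in reject_fractions:
        keep = order[: int(len(order) * (1.0 - frac))]
        curve.append((frac, correct[keep].mean()))  # accuracy on the retained samples
    return curve
```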
- Out of Distribution Data Detection Using Dropout Bayesian Neural Networks [pdf] [annotated pdf]
- Andre T. Nguyen, Fred Lu, Gary Lopez Munoz, Edward Raff, Charles Nicholas, James Holt
2022-02-18, AAAI 2022
- [Out-of-Distribution Detection]
Quite interesting and well-written paper. It seemed quite niche at first, but I think their analysis could potentially be useful.
- UncertaINR: Uncertainty Quantification of End-to-End Implicit Neural Representations for Computed Tomography [pdf] [annotated pdf]
- Francisca Vasconcelos, Bobby He, Nalini Singh, Yee Whye Teh
2022-02-22
- [Implicit Neural Representations] [Uncertainty Estimation] [Medical ML]
Interesting and well-written paper. I wasn't very familiar with CT image reconstruction, but they do a good job explaining everything. Interesting that MC-dropout seems important for getting well-calibrated predictions.
- Robust Uncertainty Estimates with Out-of-Distribution Pseudo-Inputs Training [pdf] [annotated pdf]
- Pierre Segonne, Yevgen Zainchkovskyy, Søren Hauberg
2022-01-15
- [Uncertainty Estimation]
Somewhat interesting paper. I didn't quite understand everything, so it could be more interesting than I think. The fact that their pseudo-input generation process "relies on the availability of a differentiable density estimate of the data" seems like a big limitation? For regression, they only applied their method to very low-dimensional input data (1D toy regression and UCI benchmarks), but would this work for image-based tasks?
- VOS: Learning What You Don't Know by Virtual Outlier Synthesis [pdf] [code] [annotated pdf]
- Xuefeng Du, Zhaoning Wang, Mu Cai, Yixuan Li
2022-02-02, ICLR 2022
- [Out-of-Distribution Detection]
Interesting and quite well-written paper. I did find it somewhat difficult to understand certain parts though, they could perhaps be explained more clearly. The results seem quite impressive (they do consistently outperform all baselines), but I find it interesting that the "Gaussian noise" baseline in Table 2 performs that well? I should probably have read "Energy-based Out-of-distribution Detection" before reading this paper.
- Transformers Can Do Bayesian Inference [pdf] [code] [annotated pdf]
- Samuel Müller, Noah Hollmann, Sebastian Pineda Arango, Josif Grabocka, Frank Hutter
2021-12-20, ICLR 2022
- [Transformers]
Quite interesting and well-written paper. I did however find it difficult to properly understand everything, it feels like a lot of details are omitted (I wouldn't really know how to actually implement this in practice). It's difficult for me to judge how impressive the results are or how practically useful this approach actually might be, what limitations are there? Overall though, it does indeed seem quite interesting.
- Confidence-based Out-of-Distribution Detection: A Comparative Study and Analysis [pdf] [code] [annotated pdf]
- Christoph Berger, Magdalini Paschali, Ben Glocker, Konstantinos Kamnitsas
2021-07-06, MICCAI Workshops 2021
- [Out-of-Distribution Detection] [Medical ML]
Interesting and well-written paper. Interesting that Mahalanobis works very well on the CIFAR10 vs SVHN but not on the medical imaging dataset. I don't quite get how/why the ODIN method works, I'll probably have to read that paper.
- Deep Learning Through the Lens of Example Difficulty [pdf] [annotated pdf]
- Robert John Nicholas Baldock, Hartmut Maennel, Behnam Neyshabur
2021-05-21, NeurIPS 2021
- [Theoretical Properties of Deep Learning]
Quite interesting and well-written paper. The definition of "prediction depth" in Section 2.1 makes sense, and it definitely seems reasonable that this could correlate with example difficulty / prediction confidence in some way. Section 3 and 4, and all the figures, contain a lot of info it seems, I'd probably need to read the paper again to properly understand/appreciate everything.
- Does Your Dermatology Classifier Know What It Doesn't Know? Detecting the Long-Tail of Unseen Conditions [pdf] [annotated pdf]
- Abhijit Guha Roy, Jie Ren, Shekoofeh Azizi, Aaron Loh, Vivek Natarajan, Basil Mustafa, Nick Pawlowski, Jan Freyberg, Yuan Liu, Zach Beaver, Nam Vo, Peggy Bui, Samantha Winter, Patricia MacWilliams, Greg S. Corrado, Umesh Telang, Yun Liu, Taylan Cemgil, Alan Karthikesalingam, Balaji Lakshminarayanan, Jim Winkens
2021-04-08, Medical Image Analysis (January 2022)
- [Out-of-Distribution Detection] [Medical ML]
Well-written and interesting paper. Quite long, so it took a bit longer than usual to read it. Sections 1 and 2 give a great overview of OOD detection in general, and how it can be used specifically in this dermatology setting. I can definitely recommend reading Section 2 (Related work). They assume access to some outlier data during training, so their approach is similar to the "Outlier exposure" method (specifically in this dermatology setting, they say that this is a fair assumption). Their method is an improvement of the "reject bucket" (add an extra class which you assign to all outlier training data points); in their proposed method they also use fine-grained classification of the outlier skin conditions. Then they also use an ensemble of 5 models, and also a more diverse ensemble (in which they combine models trained with different representation learning techniques). This diverse ensemble obtains the best performance.
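A minimal sketch of the "reject bucket" baseline as I understand it (a (K+1)-way classifier where all outlier training points get the extra class, and the probability of that class is the OOD score at test time); the fine-grained outlier classes and the ensembling from the paper are not included, and the class count below is hypothetical.
```python
import torch
import torch.nn.functional as F

NUM_INLIER_CLASSES = 10             # hypothetical number of known (inlier) classes
REJECT_CLASS = NUM_INLIER_CLASSES   # index of the extra "reject bucket" class

def reject_bucket_targets(labels, is_outlier):
    # All outlier training points are assigned the extra class as their label.
    return torch.where(is_outlier, torch.full_like(labels, REJECT_CLASS), labels)

def ood_score(logits):
    # At test time, the probability of the reject bucket is used as the OOD score.
    return F.softmax(logits, dim=-1)[:, REJECT_CLASS]
```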
- Being a Bit Frequentist Improves Bayesian Neural Networks [pdf] [code] [annotated pdf]
- Agustinus Kristiadi, Matthias Hein, Philipp Hennig
2021-06-18, AISTATS 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Interesting and well-written paper. The proposed method makes intuitive sense, trying to incorporate the "OOD training" method (i.e., to use some kind of OOD data during training, similar to e.g. the "Deep Anomaly Detection with Outlier Exposure" paper) into the Bayesian deep learning approach. The experimental results do seem quite promising.
- Mixtures of Laplace Approximations for Improved Post-Hoc Uncertainty in Deep Learning [pdf] [code] [annotated pdf]
- Runa Eschenhagen, Erik Daxberger, Philipp Hennig, Agustinus Kristiadi
2021-11-05, NeurIPS Workshops 2021
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Well-written and interesting paper. Short paper of just 3 pages, but with an extensive appendix which I definitely recommend going through. The method, training an ensemble and then applying the Laplace approximation to each network, is very simple and intuitively makes a lot of sense. I didn't realize that this would have basically the same test-time speed as ensembling (since they utilize that probit approximation), that's very neat. It also seems to consistently outperform ensembling a bit across almost all tasks and metrics.
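A small sketch of the probit approximation as I understand it, which is what keeps the test-time cost close to a plain forward pass: each logit is scaled by 1/sqrt(1 + pi/8 * variance) before the softmax, where the variances come from the (Laplace) posterior over the logits. Treat this as my reading, not the authors' code.
```python
import math
import torch

def probit_predictive(logit_mean, logit_var):
    # Mean-field probit approximation to the Gaussian-integrated softmax:
    # scale each logit by 1 / sqrt(1 + pi/8 * variance) before applying the softmax.
    # logit_mean / logit_var: mean and marginal variance of the logits under the posterior.
    kappa = 1.0 / torch.sqrt(1.0 + (math.pi / 8.0) * logit_var)
    return torch.softmax(kappa * logit_mean, dim=-1)
```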
- Pessimistic Bootstrapping for Uncertainty-Driven Offline Reinforcement Learning [pdf] [annotated pdf]
- Chenjia Bai, Lingxiao Wang, Zhuoran Yang, Zhi-Hong Deng, Animesh Garg, Peng Liu, Zhaoran Wang
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Reinforcement Learning]
Well-written and somewhat interesting paper. I'm not overly familiar with RL, which makes it a bit difficult for me to properly evaluate the paper's contributions. They use standard ensembles for uncertainty estimation combined with an OOD sampling regularization. I thought that the OOD sampling could be interesting, but it seems very specific to RL. I'm sure this paper is quite interesting for people doing RL, but I don't think it's overly useful for me.
- On the Pitfalls of Heteroscedastic Uncertainty Estimation with Probabilistic Neural Networks [pdf] [code] [annotated pdf]
- Maximilian Seitzer, Arash Tavakoli, Dimitrije Antic, Georg Martius
2021-09-29, ICLR 2022
- [Uncertainty Estimation]
Quite interesting and very well-written paper, I enjoyed reading it. Their analysis of fitting Gaussian regression models via the NLL is quite interesting, I didn't really expect to learn something new about this. I've seen Gaussian models outperform standard regression (L2 loss) w.r.t. accuracy in some applications/datasets, and the other way around in others. In the first case, I've then attributed the success of the Gaussian model to the "learned loss attenuation". The analysis in this paper could perhaps explain why you get this performance boost only in certain applications. Their beta-NLL loss could probably be quite useful; it seems like a convenient tool to have.
- Sample Efficient Deep Reinforcement Learning via Uncertainty Estimation [pdf] [annotated pdf]
- Vincent Mai, Kaustubh Mani, Liam Paull
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Reinforcement Learning]
Well-written and somewhat interesting paper. I'm not overly familiar with reinforcement learning, which makes it a bit difficult for me to properly evaluate the paper's contributions, but to me it seems like fairly straightforward method modifications? To use ensembles of Gaussian models (instead of ensembles of models trained using the L2 loss) makes sense. The BIV method I didn't quite get, it seems rather ad hoc? I also don't quite get exactly how it's used in equation (10), is the ensemble of Gaussian models trained _jointly_ using this loss? I don't really know if this could be useful outside of RL.
- Laplace Redux -- Effortless Bayesian Deep Learning [pdf] [code] [annotated pdf]
- Erik Daxberger, Agustinus Kristiadi, Alexander Immer, Runa Eschenhagen, Matthias Bauer, Philipp Hennig
2021-06-28, NeurIPS 2021
- [Uncertainty Estimation]
Interesting and very well-written paper, I enjoyed reading it. I still think that ensembling probably is quite difficult to beat purely in terms of uncertainty estimation quality, but this definitely seems like a useful tool in many situations. It's not clear to me if the analytical expression for regression in "4. Approximate Predictive Distribution" is also applicable when the variance is input-dependent?
- Benchmarking Uncertainty Quantification on Biosignal Classification Tasks under Dataset Shift [pdf] [annotated pdf]
- Tong Xia, Jing Han, Cecilia Mascolo
2021-12-16, AAAI Workshops 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection] [Medical ML]
Well-written and interesting paper. They synthetically create dataset shifts (e.g. by adding Gaussian noise to the data) of increasing intensity and study whether or not the uncertainty increases as the accuracy degrades. They compare regular softmax, temperature scaling, MC-dropout, ensembling and a simple variational inference method. Their conclusion is basically that ensembling slightly outperforms the other methods, but that no method performs overly well. I think these types of studies are really useful.
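A small PyTorch sketch of the kind of study described above: corrupt the test set with Gaussian noise of increasing intensity and record accuracy together with a simple uncertainty measure (predictive entropy here). model, x_test and y_test are placeholders.
```python
import torch

@torch.no_grad()
def shift_study(model, x_test, y_test, noise_stds=(0.0, 0.1, 0.2, 0.5, 1.0)):
    results = []
    for std in noise_stds:
        probs = torch.softmax(model(x_test + std * torch.randn_like(x_test)), dim=-1)
        acc = (probs.argmax(dim=-1) == y_test).float().mean().item()
        entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1).mean().item()
        results.append((std, acc, entropy))  # ideally entropy grows as accuracy drops
    return results
```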
- On Out-of-distribution Detection with Energy-based Models [pdf] [code] [annotated pdf]
- Sven Elflein, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann
2021-07-03, ICML Workshops 2021
- [Out-of-Distribution Detection] [Energy-Based Models]
Well-written and quite interesting paper. A short paper, just 4 pages. They don't study the method from the "Energy-based Out-of-distribution Detection" paper as I had expected, but it was still a quite interesting read. The results in Section 4.2 seem interesting, especially for experiment 3, but I'm not sure that I properly understand everything.
- Natural Posterior Network: Deep Bayesian Predictive Uncertainty for Exponential Family Distributions [pdf] [annotated pdf]
- Bertrand Charpentier, Oliver Borchert, Daniel Zügner, Simon Geisler, Stephan Günnemann
2021-09-29, ICLR 2022
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Interesting and well-written paper. I didn't quite understand all the details, I'll have to read a couple of related/background papers to be able to properly appreciate and evaluate the proposed method. I definitely feel like I would like to read up on this family of methods. Extensive experimental evaluation, and the results seem promising overall.
- Efficiently Modeling Long Sequences with Structured State Spaces [pdf] [code] [annotated pdf]
- Albert Gu, Karan Goel, Christopher Ré
2021-10-31, ICLR 2022
- [Sequence Modeling]
Very interesting and quite well-written paper. Kind of neat/fun to see state-space models being used. The experimental results seem very impressive!? I didn't fully understand everything in Section 3. I had to read Section 3.4 a couple of times to understand how the parameterization actually works in practice (you have H state-space models, one for each feature dimension, so that you can map a sequence of feature vectors to another sequence of feature vectors) (and you can then also have multiple such layers of state-space models, mapping sequence --> sequence --> sequence --> ....).
- Periodic Activation Functions Induce Stationarity [pdf] [code] [annotated pdf]
- Lassi Meronen, Martin Trapp, Arno Solin
2021-10-26, NeurIPS 2021
- [Uncertainty Estimation] [Out-of-Distribution Detection]
Quite interesting and well-written paper. Quite a heavy read, probably need to be rather familiar with GPs to properly understand/appreciate everything. Definitely check Appendix D, it gives a better understanding of how the proposed method is applied in practice. I'm not quite sure how strong/impressive the experimental results actually are. Also seems like the method could be a bit inconvenient to implement/use?
- Reliable and Trustworthy Machine Learning for Health Using Dataset Shift Detection [pdf] [annotated pdf]
- Chunjong Park, Anas Awadalla, Tadayoshi Kohno, Shwetak Patel
2021-10-26, NeurIPS 2021
- [Out-of-Distribution Detection]
Interesting and very well-written paper. Gives a good overview of the field and contains a lot of seemingly useful references. The evaluation is very comprehensive. The user study is quite neat.
- An Information-theoretic Approach to Distribution Shifts [pdf] [code] [annotated pdf]
- Marco Federici, Ryota Tomioka, Patrick Forré
2021-06-07, NeurIPS 2021
- [Theoretical Properties of Deep Learning]
Quite well-written paper overall that seemed interesting, but I found it very difficult to properly understand everything. Thus, I can't really tell how interesting/significant their analysis actually is.
- On the Importance of Gradients for Detecting Distributional Shifts in the Wild [pdf] [code] [annotated pdf]
- Rui Huang, Andrew Geng, Yixuan Li
2021-10-01, NeurIPS 2021
- [Out-of-Distribution Detection]
Quite interesting and well-written paper. The experimental results do seem promising. However, I don't quite get why the proposed method intuitively makes sense, why is it better to only use the parameters of the final network layer?
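My rough sketch of the gradient-based score as I understand it (backprop a KL-to-uniform loss only into the final layer and use the gradient norm as the in-distribution score); details like the norm type may differ from the paper, and model / last_layer are placeholders.
```python
import torch
import torch.nn.functional as F

def gradient_norm_score(model, last_layer, x):
    # Backprop the cross-entropy between the softmax output and a uniform distribution
    # (equal to the KL up to a constant), but only into the final layer's parameters,
    # and use the L1 norm of that gradient as the score (larger = more in-distribution).
    logits = model(x.unsqueeze(0))
    loss = -F.log_softmax(logits, dim=-1).mean()
    grads = torch.autograd.grad(loss, tuple(last_layer.parameters()))
    return sum(g.abs().sum().item() for g in grads)
```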
- Masked Autoencoders Are Scalable Vision Learners [pdf] [annotated pdf]
- Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick
2021-11-11
Interesting and well-written paper. The proposed method is simple and makes a lot of intuitive sense, which is rather satisfying. After page 4, there's mostly just detailed ablations and results.
- Deep Classifiers with Label Noise Modeling and Distance Awareness [pdf] [annotated pdf]
- Vincent Fortuin, Mark Collier, Florian Wenzel, James Allingham, Jeremiah Liu, Dustin Tran, Balaji Lakshminarayanan, Jesse Berent, Rodolphe Jenatton, Effrosyni Kokiopoulou
2021-10-06
- [Uncertainty Estimation]
Quite interesting and well-written paper. I find the distance-awareness property more interesting than modelling of input/class-dependent label noise, so the proposed method (HetSNGP) is perhaps not overly interesting compared to the SNGP baseline.
- Grokking: Generalization Beyond Overfitting on Small Algorithmic Datasets [pdf] [code] [annotated pdf]
- Alethea Power, Yuri Burda, Harri Edwards, Igor Babuschkin, Vedant Misra
2021-05, ICLR Workshops 2021
- [Theoretical Properties of Deep Learning]
Somewhat interesting paper. The phenomenon observed in Figure 1, that validation accuracy suddenly increases long after the training data has been almost perfectly fit, is quite interesting. I didn't quite understand the datasets they use (binary operation tables).
- Probabilistic 3D Human Shape and Pose Estimation from Multiple Unconstrained Images in the Wild [pdf] [annotated pdf]
- Akash Sengupta, Ignas Budvytis, Roberto Cipolla
2021-03-19, CVPR 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. I read it mainly as background for "Hierarchical Kinematic Probability Distributions for 3D Human Shape and Pose Estimation from Images in the Wild" which is written by exactly the same authors. In this paper, they predict a single Gaussian distribution for the pose (instead of hierarchical matrix-Fisher distributions). Also, they mainly focus on the body shape. They also use silhouettes + 2D keypoint heatmaps as input (instead of edge-filters + 2D keypoint heatmaps).
- Learning Motion Priors for 4D Human Body Capture in 3D Scenes [pdf] [code] [annotated pdf]
- Siwei Zhang, Yan Zhang, Federica Bogo, Marc Pollefeys, Siyu Tang
2021-08-23, ICCV 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. I didn't fully understand everything though, and it feels like I probably don't know this specific setting/problem well enough to fully appreciate the paper.
- Hierarchical Kinematic Probability Distributions for 3D Human Shape and Pose Estimation from Images in the Wild [pdf] [code] [annotated pdf]
- Akash Sengupta, Ignas Budvytis, Roberto Cipolla
2021-10-03, ICCV 2021
- [3D Human Pose Estimation]
Well-written and very interesting paper, I enjoyed reading it. The hierarchical distribution prediction approach makes sense and consistently outperforms the independent baseline. Using matrix-Fisher distributions makes sense. The synthetic training framework and the input representation of edge-filters + 2D keypoint heatmaps are both interesting.
- SMD-Nets: Stereo Mixture Density Networks [pdf] [code] [annotated pdf]
- Fabio Tosi, Yiyi Liao, Carolin Schmitt, Andreas Geiger
2021-04-08, CVPR 2021
- [Uncertainty Estimation]
Well-written and interesting paper. Quite easy to read and follow, the method is clearly explained and makes intuitive sense.
- imGHUM: Implicit Generative Models of 3D Human Shape and Articulated Pose [pdf] [code] [annotated pdf]
- Thiemo Alldieck, Hongyi Xu, Cristian Sminchisescu
2021-08-24, ICCV 2021
- [3D Human Pose Estimation] [Implicit Neural Representations]
Interesting and very well-written paper, I really enjoyed reading it. Interesting combination of implicit representations and 3D human modelling. The "inclusive human modelling" application is neat and important.
- Physics-based Human Motion Estimation and Synthesis from Videos [pdf] [annotated pdf]
- Kevin Xie, Tingwu Wang, Umar Iqbal, Yunrong Guo, Sanja Fidler, Florian Shkurti
2021-09-21, ICCV 2021
- [3D Human Pose Estimation]
Well-written and quite interesting paper. The general idea, refining frame-by-frame pose estimates via physical constraints, intuitively makes a lot of sense. I did however find it quite difficult to understand all the details in Section 3.
- Hierarchical VAEs Know What They Don't Know [pdf] [code] [annotated pdf]
- Jakob D. Havtorn, Jes Frellsen, Søren Hauberg, Lars Maaløe
2021-02-16, ICML 2021
- [Uncertainty Estimation] [VAEs]
Very well-written and quite interesting paper, I enjoyed reading it. Everything is quite well-explained, it's relatively easy to follow. The paper provides a good overview of the out-of-distribution detection problem and current methods.
- Human Pose Regression with Residual Log-likelihood Estimation [pdf] [code] [annotated pdf]
- Jiefeng Li, Siyuan Bian, Ailing Zeng, Can Wang, Bo Pang, Wentao Liu, Cewu Lu
2021-07-23, ICCV 2021
- [3D Human Pose Estimation]
Quite interesting paper, but also quite strange/confusing. I don't think the proposed method is explained particularly well, at least I found it quite difficult to properly understand what they actually are doing.
In the end it seems like they are learning a global loss function that is very similar to doing probabilistic regression with a Gauss/Laplace model of p(y|x) (with learned mean and variance)? See Figure 4 in the Appendix.
And while it's true that their performance is much better than for direct regression with an L2/L1 loss (see e.g. Table 1), they only compare with Gauss/Laplace probabilistic regression once (Table 7) and in that case the Laplace model is actually quite competitive?
- Revisiting the Calibration of Modern Neural Networks [pdf] [code] [annotated pdf]
- Matthias Minderer, Josip Djolonga, Rob Romijnders, Frances Hubis, Xiaohua Zhai, Neil Houlsby, Dustin Tran, Mario Lucic
2021-06-15, NeurIPS 2021
- [Uncertainty Estimation]
Well-written paper. Everything is quite clearly explained and easy to understand. Quite enjoyable to read overall.
Thorough experimental evaluation. Quite interesting findings.
- Differentiable Particle Filtering via Entropy-Regularized Optimal Transport [pdf] [code] [annotated pdf]
- Adrien Corenflos, James Thornton, George Deligiannidis, Arnaud Doucet
2021-02-15, ICML 2021
- Character Controllers Using Motion VAEs [pdf] [code] [annotated pdf]
- Hung Yu Ling, Fabio Zinno, George Cheng, Michiel van de Panne
2021-03-26, SIGGRAPH 2020
- [3D Human Pose Estimation]
- Estimating Egocentric 3D Human Pose in Global Space [pdf] [annotated pdf]
- Jian Wang, Lingjie Liu, Weipeng Xu, Kripasindhu Sarkar, Christian Theobalt
2021-04-27, ICCV 2021
- [3D Human Pose Estimation]
- HuMoR: 3D Human Motion Model for Robust Pose Estimation [pdf] [code] [annotated pdf]
- Davis Rempe, Tolga Birdal, Aaron Hertzmann, Jimei Yang, Srinath Sridhar, Leonidas J. Guibas
2021-05-10, ICCV 2021
- [3D Human Pose Estimation]
- PixelTransformer: Sample Conditioned Signal Generation [pdf] [code] [annotated pdf]
- Shubham Tulsiani, Abhinav Gupta
2021-03-29, ICML 2021
- [Neural Processes] [Transformers]
- Stiff Neural Ordinary Differential Equations [pdf] [annotated pdf]
- Suyong Kim, Weiqi Ji, Sili Deng, Yingbo Ma, Christopher Rackauckas
2021-03-29
- [Neural ODEs]
- Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling [pdf] [code] [annotated pdf]
- Gregory W. Benton, Wesley J. Maddox, Sanae Lotfi, Andrew Gordon Wilson
2021-02-25, ICML 2021
- [Uncertainty Estimation] [Ensembling]
- Gradient Descent on Neural Networks Typically Occurs at the Edge of Stability [pdf] [pdf with comments]
- Jeremy M. Cohen, Simran Kaur, Yuanzhi Li, J. Zico Kolter, Ameet Talwalkar
2021-02-26, ICLR 2021
- [Theoretical Properties of Deep Learning]
- Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations [pdf] [pdf with comments]
- Winnie Xu, Ricky T.Q. Chen, Xuechen Li, David Duvenaud
2021-02-12
- [Neural ODEs] [Uncertainty Estimation]
- Scaling Up Visual and Vision-Language Representation Learning With Noisy Text Supervision [pdf] [pdf with comments]
- Chao Jia, Yinfei Yang, Ye Xia, Yi-Ting Chen, Zarana Parekh, Hieu Pham, Quoc V. Le, Yunhsuan Sung, Zhen Li, Tom Duerig
2021-02-11, ICML 2021
- On the Origin of Implicit Regularization in Stochastic Gradient Descent [pdf] [pdf with comments]
- Samuel L. Smith, Benoit Dherin, David G. T. Barrett, Soham De
2021-01-28, ICLR 2021
- [Theoretical Properties of Deep Learning]
- Can You Trust Predictive Uncertainty Under Real Dataset Shifts in Digital Pathology? [pdf] [annotated pdf]
- Jeppe Thagaard, Søren Hauberg, Bert van der Vegt, Thomas Ebstrup, Johan D. Hansen, Anders B. Dahl
2020-09, MICCAI 2020
- [Uncertainty Estimation] [Out-of-Distribution Detection] [Medical ML]
Quite interesting and well-written paper. They compare MC-dropout, ensembling and mixup (with a standard softmax classifier as the baseline). Nothing groundbreaking, but the studied application (classification of pathology slides for cancer) is very interesting. The FPR95 metrics for OOD detection in Table 4 are terrible for ensembling, but the classification accuracy (89.7) is also pretty much the same as for D_test_int in Table 3 (90.1)? So, it doesn't really matter that the model isn't capable of distinguishing this "OOD" data from in-distribution?
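As a reference for myself, a hypothetical helper (not from the paper) illustrating what the FPR95 metric measures: the false positive rate on OOD data at the score threshold where 95% of in-distribution data is accepted.
```python
# Hypothetical illustration (not from the paper) of FPR95: the fraction of OOD
# samples still accepted when the threshold keeps 95% of in-distribution samples.
import numpy as np

def fpr_at_95_tpr(scores_in, scores_ood):
    # Convention assumed here: higher score = "more in-distribution".
    threshold = np.percentile(scores_in, 5)           # ~95% of in-dist scores lie above this
    return float(np.mean(scores_ood >= threshold))    # OOD samples wrongly accepted
```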
- Contrastive Training for Improved Out-of-Distribution Detection [pdf] [annotated pdf]
- Jim Winkens, Rudy Bunel, Abhijit Guha Roy, Robert Stanforth, Vivek Natarajan, Joseph R. Ledsam, Patricia MacWilliams, Pushmeet Kohli, Alan Karthikesalingam, Simon Kohl, Taylan Cemgil, S. M. Ali Eslami, Olaf Ronneberger
2020-07-10
- [Out-of-Distribution Detection]
Quite interesting and very well-written paper. They take the method from the Mahalanobis paper ("A Simple Unified Framework for Detecting Out-of-Distribution Samples and Adversarial Attacks") (however, they fit Gaussians only to the features at the second-to-last network layer, and they don't use the input pre-processing either) and consistently improve OOD detection performance by incorporating contrastive training. Specifically, they first train the network using just the SimCLR loss for a large number of epochs, and then also add the standard classification loss. I didn't quite get why the label smoothing is necessary, but according to Table 2 it's responsible for a large portion of the performance gain.
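A rough sketch (my own understanding, covering only the scoring part and not the contrastive training) of the Mahalanobis-style OOD score described above: fit class-conditional Gaussians with a shared covariance to the features at the second-to-last layer, and score a test input by its minimum Mahalanobis distance over classes.
```python
# Rough sketch (my understanding) of the Mahalanobis OOD score: class-conditional
# Gaussians with a tied covariance fitted to penultimate-layer features.
import numpy as np

def fit_class_gaussians(features, labels, num_classes):
    means = np.stack([features[labels == c].mean(axis=0) for c in range(num_classes)])
    centered = features - means[labels]                # subtract each sample's class mean
    cov = centered.T @ centered / len(features)        # shared (tied) covariance
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))
    return means, cov_inv

def mahalanobis_score(x_feat, means, cov_inv):
    diffs = means - x_feat                             # (num_classes, feat_dim)
    dists = np.einsum('cd,de,ce->c', diffs, cov_inv, diffs)
    return dists.min()                                 # low = looks in-distribution
```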
- Energy-based Out-of-distribution Detection [pdf] [code] [annotated pdf]
- Weitang Liu, Xiaoyun Wang, John D. Owens, Yixuan Li
2020-10-08, NeurIPS 2020
- [Out-of-Distribution Detection]
Interesting and well-written paper. The proposed method is quite clearly explained and makes intuitive sense (at least if you're familiar with EBMs). Compared to using the softmax score, the performance does seem to improve consistently. Seems like fine-tuning on an "auxiliary outlier dataset" is required to get really good performance though, which you can't really assume to have access to in real-world problems, I suppose?
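For my own reference, the energy score used for detection is (as I understand it) just the negative temperature-scaled log-sum-exp of the classifier logits, used in place of the maximum softmax probability:
```python
# Energy score for OOD detection (as I understand the paper): lower energy
# should correspond to in-distribution inputs.
import torch

def energy_score(logits, temperature=1.0):
    # E(x) = -T * logsumexp(f(x) / T)
    return -temperature * torch.logsumexp(logits / temperature, dim=-1)
```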
- Transferring Inductive Biases through Knowledge Distillation [pdf] [code] [annotated pdf]
- Samira Abnar, Mostafa Dehghani, Willem Zuidema
2020-05-31
- [Theoretical Properties of Deep Learning]
Quite well-written and somewhat interesting paper. I'm not very familiar with this area. I didn't spend too much time trying to properly evaluate the significance of the findings.
- Learning to Simulate Complex Physics with Graph Networks [pdf] [code] [annotated pdf]
- Alvaro Sanchez-Gonzalez, Jonathan Godwin, Tobias Pfaff, Rex Ying, Jure Leskovec, Peter W. Battaglia
2020-02-21, ICML 2020
Quite well-written and somewhat interesting paper. Cool application and a bunch of neat videos. This is not really my area, so I didn't spend too much time/energy trying to fully understand everything.
- Neural Unsigned Distance Fields for Implicit Function Learning [pdf] [code] [annotated pdf]
- Julian Chibane, Aymen Mir, Gerard Pons-Moll
2020-10-26, NeurIPS 2020
- [Implicit Neural Representations]
Interesting and very well-written paper, I really enjoyed reading it! The paper also gives a good understanding of neural implicit representations in general.
- Synthetic Training for Accurate 3D Human Pose and Shape Estimation in the Wild [pdf] [code] [annotated pdf]
- Akash Sengupta, Ignas Budvytis, Roberto Cipolla
2020-09-21, BMVC 2020
- [3D Human Pose Estimation]
Well-written and fairly interesting paper. I read it mainly as background for "Hierarchical Kinematic Probability Distributions for 3D Human Shape and Pose Estimation from Images in the Wild", which is written by exactly the same authors. In this paper, they just use direct regression. They also use silhouettes + 2D keypoint heatmaps as input (instead of edge-filters + 2D keypoint heatmaps).
- We are More than Our Joints: Predicting how 3D Bodies Move [pdf] [code] [annotated pdf]
- Yan Zhang, Michael J. Black, Siyu Tang
2020-12-01, CVPR 2021
- [3D Human Pose Estimation]
Well-written and fairly interesting paper. The marker-based representation, instead of using skeleton joints, makes sense. The recursive projection scheme also makes sense, but seems very slow (2.27 sec/frame)? I didn't quite get all the details for their DCT representation of the latent space.
- DI-Fusion: Online Implicit 3D Reconstruction with Deep Priors [pdf] [code] [annotated pdf]
- Jiahui Huang, Shi-Sheng Huang, Haoxuan Song, Shi-Min Hu
2020-12-10, CVPR 2021
- [Implicit Neural Representations]
Well-written and interesting paper, I enjoyed reading it. Neat application of implicit representations. The paper also gives a quite good overview of online 3D reconstruction in general.
- Local Implicit Grid Representations for 3D Scenes [pdf] [code] [annotated pdf]
- Chiyu Max Jiang, Avneesh Sud, Ameesh Makadia, Jingwei Huang, Matthias Nießner, Thomas Funkhouser
2020-03-19, CVPR 2020
- [Implicit Neural Representations]
Well-written and quite interesting paper. Interesting application, being able to reconstruct full 3D scenes from sparse point clouds. I didn't fully understand everything, as I don't have a particularly strong graphics background.
- NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis [pdf] [code] [annotated pdf]
- Ben Mildenhall, Pratul P. Srinivasan, Matthew Tancik, Jonathan T. Barron, Ravi Ramamoorthi, Ren Ng
2020-03-19, ECCV 2020
- [Implicit Neural Representations]
Extremely well-written and interesting paper. I really enjoyed reading it, and I would recommend it to anyone interested in computer vision.
All parts of the proposed method are clearly explained and relatively easy to understand, including the volume rendering techniques which I was unfamiliar with.
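A rough sketch (my own simplified notes, not the authors' code) of the volume rendering quadrature as I understood it: per-ray samples with predicted densities and colors are alpha-composited, weighting each sample by the transmittance accumulated in front of it.
```python
# Simplified sketch of NeRF-style volume rendering along a single ray.
import torch

def render_ray(densities, colors, deltas):
    # densities: (N,), colors: (N, 3), deltas: (N,) distances between consecutive samples.
    alphas = 1.0 - torch.exp(-densities * deltas)           # opacity of each segment
    trans = torch.cumprod(1.0 - alphas + 1e-10, dim=0)      # transmittance through segments
    trans = torch.cat([torch.ones(1), trans[:-1]])          # transmittance *before* segment i
    weights = trans * alphas
    return (weights.unsqueeze(-1) * colors).sum(dim=0)      # expected color of the ray
```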
- Beyond Static Features for Temporally Consistent 3D Human Pose and Shape from a Video [pdf] [code] [annotated pdf]
- Hongsuk Choi, Gyeongsik Moon, Ju Yong Chang, Kyoung Mu Lee
2020-11-17, CVPR 2021
- [3D Human Pose Estimation]
- Exemplar Fine-Tuning for 3D Human Model Fitting Towards In-the-Wild 3D Human Pose Estimation [pdf] [code] [annotated pdf]
- Hanbyul Joo, Natalia Neverova, Andrea Vedaldi
2020-04-07
- [3D Human Pose Estimation]
- 3D Multi-bodies: Fitting Sets of Plausible 3D Human Models to Ambiguous Image Data [pdf] [annotated pdf]
- Benjamin Biggs, Sébastien Ehrhadt, Hanbyul Joo, Benjamin Graham, Andrea Vedaldi, David Novotny
2020-11-02, NeurIPS 2020
- [3D Human Pose Estimation]
- Learning Mesh-Based Simulation with Graph Networks [pdf] [code] [annotated pdf]
- Tobias Pfaff, Meire Fortunato, Alvaro Sanchez-Gonzalez, Peter W. Battaglia
2020-10-07, ICLR 2021
- Q-Learning in enormous action spaces via amortized approximate maximization [pdf] [annotated pdf]
- Tom Van de Wiele, David Warde-Farley, Andriy Mnih, Volodymyr Mnih
2020-01-22
- [Reinforcement Learning]
- Your GAN is Secretly an Energy-based Model and You Should use Discriminator Driven Latent Sampling [pdf] [pdf with comments]
- Tong Che, Ruixiang Zhang, Jascha Sohl-Dickstein, Hugo Larochelle, Liam Paull, Yuan Cao, Yoshua Bengio
2020-03-12, NeurIPS 2020
- [Energy-Based Models]
- Unsupervised Learning of Visual Features by Contrasting Cluster Assignments [pdf] [code] [pdf with comments]
- Mathilde Caron, Ishan Misra, Julien Mairal, Priya Goyal, Piotr Bojanowski, Armand Joulin
2020-06-17, NeurIPS 2020
- Meta Pseudo Labels [pdf] [code] [pdf with comments]
- Hieu Pham, Zihang Dai, Qizhe Xie, Minh-Thang Luong, Quoc V. Le
2020-03-23, CVPR 2021
- No MCMC for Me: Amortized Sampling for Fast and Stable Training of Energy-Based Models [pdf] [code] [pdf with comments]
- Will Grathwohl, Jacob Kelly, Milad Hashemi, Mohammad Norouzi, Kevin Swersky, David Duvenaud
2020-10-08, ICLR 2021
- [Energy-Based Models]
- Getting a CLUE: A Method for Explaining Uncertainty Estimates [pdf] [pdf with comments]
- Javier Antorán, Umang Bhatt, Tameem Adel, Adrian Weller, José Miguel Hernández-Lobato
2020-06-11, ICLR 2021
- [Uncertainty Estimation]
- Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention [pdf] [pdf with comments]
- Angelos Katharopoulos, Apoorv Vyas, Nikolaos Pappas, François Fleuret
2020-06-29, ICML 2020
- [Transformers]
- Score-Based Generative Modeling through Stochastic Differential Equations [pdf] [code] [pdf with comments]
- Yang Song, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole
2020-11-26, ICLR 2021
- [Neural ODEs]
- Dissecting Neural ODEs [pdf] [pdf with comments]
- Stefano Massaroli, Michael Poli, Jinkyoo Park, Atsushi Yamashita, Hajime Asama
2020-02-19, NeurIPS 2020
- Rethinking Attention with Performers [pdf] [pdf with comments]
- Krzysztof Choromanski, Valerii Likhosherstov, David Dohan, Xingyou Song, Andreea Gane, Tamas Sarlos, Peter Hawkins, Jared Davis, Afroz Mohiuddin, Lukasz Kaiser, David Belanger, Lucy Colwell, Adrian Weller
2020-10-30, ICLR 2021
- [Transformers]
- Very Deep VAEs Generalize Autoregressive Models and Can Outperform Them on Images [pdf] [code] [pdf with comments]
- Rewon Child
2020-11-20, ICLR 2021
- [VAEs]
- VAEBM: A Symbiosis between Variational Autoencoders and Energy-based Models [pdf] [pdf with comments]
- Zhisheng Xiao, Karsten Kreis, Jan Kautz, Arash Vahdat
2020-10-01, ICLR 2021
- [Energy-Based Models] [VAEs]
- Implicit Gradient Regularization [pdf] [pdf with comments] [comments]
- David G.T. Barrett, Benoit Dherin
2020-09-23
- [Theoretical Properties of Deep Learning]
- Simple and Principled Uncertainty Estimation with Deterministic Deep Learning via Distance Awareness [pdf] [pdf with comments]