# DeepLearn-Theory

Reading group on theoretical and mathematical aspects of deep learning. NYU, Fall 2016. Moderator: J. Bruna.

The purpose of this reading group is to define good open problems that relate deep learning models to aspects of statistics, applied mathematics, and physics. We are particularly interested in connections with statistical physics, optimization, and harmonic analysis. Everyone is welcome.

## Information
Thursdays at 5:30pm, Center for Data Science, NYU. 60 5th Ave: Room 606 (6th floor) / 7th-floor open space.
## Logistics
Each week, a designated presenter (or presenters) will present a selected paper, along with a bit of the mathematical context required to address it.

## Tentative List of Topics:

  • Statistical Physics and Maximum Entropy
  • Unsupervised Learning for Images and Time Series
  • Stochastic Optimization and Stability
  • Gradient Descent, Basins of Attraction, and Tensor Analysis
  • Graph Theory, Invariance Groups, and Convolutions
  • Bandits

## Tentative Agenda:

## Pool of Papers/Books [please fill]

  • Les Houches Ellis, Statistical Physics
  • Gibbs Models and Sampling
  • Renormalization Group (RG)
  • Learn Faster, Generalize Better
  • Draft: Microcanonical Mixtures (JB)
  • Basins of Attraction (Shamir)
  • Randomized PCA (Tygert et al.)