Pinned Repositories
AdaptiveNormsNewton
Source code for Kohler, Jonas et al. "Adaptive Norms for Deep Learning with Regularized Newton Methods", NeurIPS 2019 Workshop "Beyond 1st-order methods in Machine Learning"
Batchnorm_prevents_rank_collpase
Source code for Daneshmand, Hadi et al. "Batch normalization provably prevents rank collapse in randomly initialized deep networks", NeurIPS 2020
Efficient-subsampling-in-python
escaping_saddles
escaping_saddles_with_stochastic_gradients
Source code for Daneshmand, H., Kohler, J., Lucchi, A., & Hofmann, T. (2018). Escaping saddles with stochastic gradients. ICML 2018
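The idea behind the paper can be illustrated with a small toy sketch (not the repository's code): at a strict saddle point the gradient vanishes, so deterministic gradient descent stalls, while gradient noise pushes the iterate into the negative-curvature direction. Here explicit isotropic noise stands in for SGD's intrinsic stochasticity, which is a simplification of the paper's actual correlated-negative-curvature analysis; the function, step sizes, and noise scale are illustrative choices.

```python
import numpy as np

# f(x, y) = x^2 - y^2 has a strict saddle at the origin: zero gradient,
# positive curvature along x, negative curvature along y.
def f(x, y):
    return x**2 - y**2

def grad(x, y):
    return 2 * x, -2 * y

def escape(steps=200, lr=0.05, noise_std=1e-3, seed=0):
    """Perturbed gradient descent started exactly at the saddle point."""
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0
    for _ in range(steps):
        gx, gy = grad(x, y)
        # Small isotropic noise mimics the stochasticity of SGD.
        x -= lr * gx + noise_std * rng.normal()
        y -= lr * gy + noise_std * rng.normal()
    return x, y

x, y = escape()
print(f(x, y) < f(0.0, 0.0))  # the iterate has left the saddle along y
```

Without the noise term, the iterate would remain at the origin forever; with it, the unstable y-direction is amplified geometrically at every step.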
flatland-training
Experiments with flatland.aicrowd.com
jonaskohler.github.io
simple_GAN_optimization_w_extra_gradients
2D toy examples of min-max problems solved with the extra-gradient method.
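A minimal sketch of the extra-gradient idea on a 2D bilinear game (an illustrative example, not the repository's code): plain simultaneous gradient descent-ascent spirals away from the equilibrium of f(x, y) = x * y, whereas the extra-gradient method first takes a lookahead step and then updates using the gradients evaluated at the lookahead point, which makes the iterates converge. The step size and iteration count are arbitrary choices.

```python
import numpy as np

def grad(x, y):
    # (df/dx, df/dy) for the bilinear game f(x, y) = x * y,
    # where x is the minimizing player and y the maximizing player.
    return y, x

def extragradient(x, y, lr=0.1, steps=1000):
    for _ in range(steps):
        gx, gy = grad(x, y)
        # Extrapolation (lookahead) step: descend in x, ascend in y.
        x_half, y_half = x - lr * gx, y + lr * gy
        # Actual update uses the gradients at the lookahead point.
        gx2, gy2 = grad(x_half, y_half)
        x, y = x - lr * gx2, y + lr * gy2
    return x, y

x, y = extragradient(1.0, 1.0)
print(x, y)  # both close to the equilibrium (0, 0)
```

For this bilinear game one can show the squared distance to the equilibrium shrinks by a factor of (1 - lr**2 + lr**4) per step, which is exactly the stabilizing effect the lookahead buys.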
stereoEEG2speech
Code for a seq2seq architecture with Bahdanau attention designed to map stereotactic EEG data from human brains to spectrograms, implemented with PyTorch Lightning.
subsampled_cubic_regularization
Source code for Kohler & Lucchi "Sub-sampled Cubic Regularization for Non-convex Optimization" ICML 2017
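The core mechanism can be sketched on a least-squares problem (a hedged toy illustration, not the repository's implementation): at each outer step, build the cubic model m(s) = g·s + ½ sᵀH_S s + (σ/3)‖s‖³ using the exact gradient g but a Hessian H_S estimated from a random subsample of the data, then approximately minimize the model. Here the subproblem is solved by plain gradient descent for simplicity, and the problem sizes, step sizes, and σ are illustrative assumptions.

```python
import numpy as np

def cubic_regularization(X, y, sigma=1.0, outer=20, inner=200,
                         inner_lr=0.05, batch=50, seed=0):
    """Sub-sampled cubic regularization for least squares (toy sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(outer):
        r = X @ w - y
        g = 2.0 / n * X.T @ r               # exact full gradient
        S = rng.choice(n, size=batch, replace=False)
        H = 2.0 / batch * X[S].T @ X[S]     # sub-sampled Hessian
        # Approximately minimize the cubic model
        #   m(s) = g^T s + 0.5 s^T H s + (sigma / 3) ||s||^3
        # by gradient descent; grad m(s) = g + H s + sigma * ||s|| * s.
        s = np.zeros(d)
        for _ in range(inner):
            s -= inner_lr * (g + H @ s + sigma * np.linalg.norm(s) * s)
        w = w + s
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true
w = cubic_regularization(X, y)
print(np.mean((X @ w - y) ** 2))  # small training loss
```

The cubic term σ/3·‖s‖³ acts as an adaptive trust region: it automatically damps steps where the sub-sampled quadratic model cannot be trusted, without requiring an explicit radius.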
jonaskohler's Repositories
jonaskohler/stereoEEG2speech
Code for a seq2seq architecture with Bahdanau attention designed to map stereotactic EEG data from human brains to spectrograms, implemented with PyTorch Lightning.
jonaskohler/Batchnorm_prevents_rank_collpase
Source code for Daneshmand, Hadi et al. "Batch normalization provably prevents rank collapse in randomly initialized deep networks", NeurIPS 2020
jonaskohler/AdaptiveNormsNewton
Source code for Kohler, Jonas et al. "Adaptive Norms for Deep Learning with Regularized Newton Methods", NeurIPS 2019 Workshop "Beyond 1st-order methods in Machine Learning"
jonaskohler/Efficient-subsampling-in-python
jonaskohler/escaping_saddles
jonaskohler/escaping_saddles_with_stochastic_gradients
Source code for Daneshmand, H., Kohler, J., Lucchi, A., & Hofmann, T. (2018). Escaping saddles with stochastic gradients. ICML 2018
jonaskohler/flatland-training
Experiments with flatland.aicrowd.com
jonaskohler/jonaskohler.github.io
jonaskohler/simple_GAN_optimization_w_extra_gradients
2D toy examples of min-max problems solved with the extra-gradient method.
jonaskohler/subsampled_cubic_regularization
Source code for Kohler & Lucchi "Sub-sampled Cubic Regularization for Non-convex Optimization" ICML 2017
jonaskohler/minimizing_quadratics
jonaskohler/SDP
Semidefinite programming for unconstrained optimization
jonaskohler/SentimentAnalysis
Part of the work we did at a KD seminar at my university