Pinned Repositories
adawhatever
MATLAB implementations of AdaGrad, Adam, Adamax, Adadelta, etc.
bayesian-stat-machine-learning
"Bayesian Statistics & Machine Learning" Reading Group at Northwestern Statistics
BCD-for-DNNs
Training Deep Neural Networks using Block Coordinate Descent Algorithms
BCD-for-DNNs-PyTorch
Code for Global Convergence of Block Coordinate Descent in Deep Learning (ICML 2019)
BCoAPG-plus
MATLAB Code for the paper "Accelerated Block Coordinate Proximal Gradients with Applications in High Dimensional Statistics"
BigDataAlgorithms
bregman_prox_langevin_mc
Code for Bregman Proximal Langevin Monte Carlo via Bregman–Moreau Envelopes (ICML 2022)
d2l-pytorch
This project reproduces the book Dive Into Deep Learning (www.d2l.ai), adapting the code from MXNet into PyTorch.
deep-learning-notes
Experiments with Deep Learning
Deep-Reinforcement-Learning-Algorithms-with-PyTorch
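The adawhatever entry above covers adaptive-gradient optimizers such as Adam. As a point of reference (a minimal Python sketch of the standard Adam update rule, not code from that MATLAB repository; hyperparameter names follow common convention):

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter; returns (theta, m, v)."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction for zero initialization
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Example: minimize f(x) = x^2 (gradient 2x) starting from x = 5.0
x, m, v = 5.0, 0.0, 0.0
for t in range(1, 5001):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.01)
```

The other variants pinned here (AdaGrad, Adamax, Adadelta) differ mainly in how the second-moment accumulator `v` is maintained and normalized.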
timlautk's Repositories
timlautk/BCD-for-DNNs-PyTorch
Code for Global Convergence of Block Coordinate Descent in Deep Learning (ICML 2019)
timlautk/bayesian-stat-machine-learning
"Bayesian Statistics & Machine Learning" Reading Group at Northwestern Statistics
timlautk/BCD-for-DNNs
Training Deep Neural Networks using Block Coordinate Descent Algorithms
timlautk/BCoAPG-plus
MATLAB Code for the paper "Accelerated Block Coordinate Proximal Gradients with Applications in High Dimensional Statistics"
timlautk/bregman_prox_langevin_mc
Code for Bregman Proximal Langevin Monte Carlo via Bregman–Moreau Envelopes (ICML 2022)
timlautk/d2l-pytorch
This project reproduces the book Dive Into Deep Learning (www.d2l.ai), adapting the code from MXNet into PyTorch.
timlautk/deep-learning-notes
Experiments with Deep Learning
timlautk/Deep-Reinforcement-Learning-Algorithms-with-PyTorch
timlautk/DeepRL
Modularized Implementation of Deep RL Algorithms in PyTorch
timlautk/Duke-Tsinghua-MLSS-2017
Duke-Tsinghua Machine Learning Summer School 2017
timlautk/FAR-HO
Gradient based hyperparameter optimization & meta-learning package for TensorFlow
timlautk/higher
higher is a PyTorch library that lets users obtain higher-order gradients over losses spanning entire training loops rather than individual training steps.
timlautk/Keras-GAN
Keras implementations of Generative Adversarial Networks.
timlautk/lmc-atomi
Code for Non-Log-Concave and Nonsmooth Sampling via Langevin Monte Carlo Algorithms
timlautk/MAML-Pytorch
Elegant PyTorch implementation of the paper Model-Agnostic Meta-Learning (MAML)
timlautk/MAML-TensorFlow
Fast and elegant TensorFlow implementation of the paper Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
timlautk/modded-nanogpt
NanoGPT (124M) in 5 minutes
timlautk/models
Models and examples built with TensorFlow
timlautk/normalizing-flows-tutorial
Tutorial on normalizing flows.
timlautk/probability
Probabilistic reasoning and statistical analysis in TensorFlow
timlautk/PyTorch-Tutorial
Build your neural network easily and quickly
timlautk/PyTorchZeroToAll
Simple PyTorch tutorials, from zero to all!
timlautk/Roadmap-of-DL-and-ML
Roadmap of DL and ML, with courses, study notes, and paper summaries
timlautk/SGDLibrary
MATLAB library for stochastic gradient descent algorithms (version 1.0.12)
timlautk/TensorFlow-Object-Detection-API-Tutorial-Train-Multiple-Objects-Windows-10
How to train a TensorFlow Object Detection classifier for multiple-object detection on Windows
timlautk/Tensorflow-Tutorial
TensorFlow tutorial from basic to advanced
timlautk/tensorpack
A Neural Net Training Interface on TensorFlow
timlautk/tutorial-rl
timlautk/tutorials
timlautk/variational-inference-with-normalizing-flows
Reimplementation of Variational Inference with Normalizing Flows (https://arxiv.org/abs/1505.05770)