optax
There are 46 repositories under the optax topic.
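Most of the repositories below build on the same core Optax pattern: create a gradient transformation, initialize its state from the parameters, then compute and apply updates each step. A minimal sketch in plain JAX + Optax, not tied to any particular repository listed here:

```python
# Minimal sketch of the standard Optax update loop (plain JAX + Optax).
import jax
import jax.numpy as jnp
import optax

def loss_fn(params, x, y):
    # Simple linear model with a squared-error loss.
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

params = {"w": jnp.zeros((3, 1)), "b": jnp.zeros((1,))}
optimizer = optax.adam(learning_rate=1e-3)
opt_state = optimizer.init(params)

@jax.jit
def train_step(params, opt_state, x, y):
    grads = jax.grad(loss_fn)(params, x, y)
    updates, opt_state = optimizer.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)
    return params, opt_state

x, y = jnp.ones((8, 3)), jnp.ones((8, 1))
params, opt_state = train_step(params, opt_state, x, y)
```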
phlippe/uvadlc_notebooks
Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2023
gordicaleksa/get-started-with-JAX
The purpose of this repo is to make it easy to get started with JAX, Flax, and Haiku. It contains my "Machine Learning with JAX" series of tutorials (YouTube videos and Jupyter Notebooks) as well as the content I found useful while learning about the JAX ecosystem.
BorealisAI/flora-opt
This is the official repository for the ICML 2024 paper "Flora: Low-Rank Adapters Are Secretly Gradient Compressors".
varun-ml/diffusion-models-tutorial
Experiment with diffusion models that you can run on your local Jupyter instance.
evanatyourservice/psgd_jax
Implementation of PSGD optimizer in JAX
hamishs/JAX-RL
JAX implementations of various deep reinforcement learning algorithms.
hushon/JAX-ResNet-CIFAR10
Simple CIFAR10 ResNet example with JAX.
bsc-quantic/tn4ml
Tensor Networks for Machine Learning
JesseFarebro/flax-mup
Maximal Update Parametrization (μP) with Flax & Optax.
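The flax-mup API itself is not reproduced here. As a hedged sketch of the Optax side of μP-style scaling, `optax.multi_transform` can route parameter groups to optimizers with different learning rates; the group labels and the 1/width factor below are illustrative assumptions, not the package's actual rules:

```python
# Hedged sketch: per-parameter-group learning rates via optax.multi_transform.
# NOT the flax-mup API; only shows how per-layer scaling could be wired in Optax.
import jax.numpy as jnp
import optax

params = {
    "embedding": {"w": jnp.zeros((100, 64))},
    "hidden": {"w": jnp.zeros((64, 64))},
    "readout": {"w": jnp.zeros((64, 10))},
}

# Labels tree with the same structure as params; multi_transform routes by label.
param_labels = {
    "embedding": {"w": "full_lr"},
    "hidden": {"w": "scaled_lr"},
    "readout": {"w": "scaled_lr"},
}

base_lr = 1e-3
width_mult = 4.0  # assumed width multiplier relative to a base model

optimizer = optax.multi_transform(
    {
        "full_lr": optax.adam(base_lr),
        "scaled_lr": optax.adam(base_lr / width_mult),  # illustrative 1/width scaling
    },
    param_labels,
)
opt_state = optimizer.init(params)
```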
yonesuke/jaxfss
JAX/Flax implementation of finite-size scaling
frankroeder/goal_conditioned_rl
Goal-conditioned reinforcement learning like 🔥
Ceyron/trainax
Training methodologies for autoregressive neural operators/emulators in JAX.
hr0nix/optax-adan
An implementation of the Adan optimizer for Optax.
mzguntalan/h-former
H-Former is a VAE for generating in-between fonts (or combining fonts). Its encoder uses a PointNet and a transformer to compute a code vector for each glyph. Its decoder is composed of multiple independent decoders that act on the code vector to reconstruct a point cloud representing a glyph.
qdevpsi3/quantum-orthogonal-nn
JAX implementation of "Classical and Quantum Algorithms for Orthogonal Neural Networks" (Kerenidis et al., 2021).
NITHISHM2410/flax-pilot
A simple trainer for Flax.
salfaris/vgae-jax
Variational Graph Autoencoder implemented using Jax & Jraph
ysngshn/ivon-optax
An Optax-based JAX implementation of the IVON optimizer for large-scale VI training of NNs (ICML'24 spotlight)
Raffaelbdl/hyperbolic-nn-haiku
dm-haiku implementation of hyperbolic neural networks
ethanluoyc/td3_bc_jax
Direct port of TD3_BC to JAX using Haiku and optax.
evanatyourservice/sophia-schedulefree-jax
Sophia optimizer with ScheduleFree
amoudgl/celo
Code for Celo: Training Versatile Learned Optimizers on a Compute Diet
NTT123/wavernn-16bit
The (unofficial) vanilla version of WaveRNN
activatedgeek/optax-swag
Stochastic Weight Averaging (SWA) transforms for Optax with JAX
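The optax-swag API itself is not shown here. The hedged sketch below only illustrates the `GradientTransformation` init/update protocol that a stateful transform like SWA builds on, using a hypothetical `track_param_average()` helper that keeps a running mean of the parameters:

```python
# Hedged sketch of a stateful Optax transform that tracks a running average of
# the parameters (SWA-flavored). NOT the optax-swag API.
from typing import Any, NamedTuple
import jax
import jax.numpy as jnp
import optax

class AvgState(NamedTuple):
    count: jnp.ndarray  # number of parameter snapshots averaged so far
    average: Any        # running mean of the parameters

def track_param_average() -> optax.GradientTransformation:
    """Identity transform that additionally keeps a running mean of params."""

    def init_fn(params):
        return AvgState(count=jnp.zeros([], jnp.int32),
                        average=jax.tree_util.tree_map(jnp.copy, params))

    def update_fn(updates, state, params=None):
        if params is None:
            raise ValueError("this transform needs params passed to update()")
        count = state.count + 1
        # Incremental mean: avg += (params - avg) / count.
        # Note: these are the pre-update params; a real SWA schedule would
        # also start averaging only late in training.
        average = jax.tree_util.tree_map(
            lambda a, p: a + (p - a) / count, state.average, params)
        return updates, AvgState(count=count, average=average)

    return optax.GradientTransformation(init_fn, update_fn)

# Chained after an optimizer; optax.chain forwards params to every update().
optimizer = optax.chain(optax.adam(1e-3), track_param_average())
```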
bischtob/Opterax
A gradient-free optimization suite written in JAX. We conform to the optax interface and provide ensemble-based optimizers.
daniel-j-h/nedem
Neural implicit digital elevation model
evanatyourservice/flat-sophia
Sophia optimizer further projected towards flat areas of the loss landscape.
salfaris/PriorVGAE
Oxford MSc thesis: a variational autoencoder combined with graph convolutional networks for learning locally-aware spatial prior distributions.
elttaes/VAE-MNIST-Haiku-Jax
Simple VAE example with Jax.
NTT123/haiku_trainer
A helper library for training dm-haiku models.
DavidUlloa6310/optimizers
Simple implementations of the most popular optimizers, such as Adam, RMSProp, Adagrad, and SGD, benchmarked on the MNIST, CIFAR, and IMDB datasets as in the Adam paper.
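For reference, the Adam update rule from Kingma & Ba (2015), which implementations like this one follow, fits in a few lines of jax.numpy; a minimal sketch with the paper's default hyperparameters:

```python
# Minimal, self-contained Adam update step (Kingma & Ba, 2015) for a single
# parameter array, using the paper's default hyperparameters.
import jax.numpy as jnp

def adam_step(param, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam step; returns the updated (param, m, v)."""
    m = b1 * m + (1 - b1) * grad        # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad ** 2   # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1 ** t)           # bias correction; t starts at 1
    v_hat = v / (1 - b2 ** t)
    param = param - lr * m_hat / (jnp.sqrt(v_hat) + eps)
    return param, m, v

# Usage: initialize m and v to zeros with the parameter's shape, then call with t = 1, 2, ...
```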
p-nordmann/optax-zclip
ZClip implementation for Optax.
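optax-zclip's own API is not reproduced here. The sketch below only shows where a clipping transform sits in an Optax chain, using the built-in `optax.clip_by_global_norm` as a stand-in for an adaptive clipper like ZClip:

```python
# Hedged sketch: where a gradient-clipping transform sits in an Optax chain.
# Uses the built-in optax.clip_by_global_norm, not optax-zclip's own API.
import optax

optimizer = optax.chain(
    optax.clip_by_global_norm(1.0),  # clip gradients before the optimizer sees them
    optax.adam(3e-4),
)
# optimizer.init / optimizer.update are then used exactly like any single transform.
```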
jeertmans/HER-with-JAX
Learning Hindsight Experience Replay (HER) with JAX
mrhashemi/Neural_Net_JAX_Optax_Keras
A simple neural network built with JAX, an Optax optimizer, and a custom loss function.
qpsy/uvadlc_flax
flax.nnx implementation of the UvA Deep Learning Tutorials
suzuki-2001/evorca
Fast and minimal plmDCA in JAX.