JAX brings automatic differentiation and the XLA compiler together through a NumPy-like API for high-performance machine learning research on accelerators like GPUs and TPUs.
This is a curated list of awesome JAX libraries, projects, and other resources. Contributions are welcome!
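For a quick taste of that API, here is a minimal, illustrative sketch (the toy model and data below are made up for demonstration): JAX's NumPy-like `jax.numpy` module composes with the `grad`, `jit`, and `vmap` transformations.

```python
import jax
import jax.numpy as jnp

# A plain Python function written against the NumPy-like jax.numpy API.
def predict(w, x):
    return jnp.tanh(jnp.dot(x, w))

def loss(w, x, y):
    return jnp.mean((predict(w, x) - y) ** 2)

# grad differentiates, jit compiles via XLA, vmap vectorizes over a batch axis.
grad_loss = jax.jit(jax.grad(loss))                      # d(loss)/dw, XLA-compiled
batched_predict = jax.vmap(predict, in_axes=(None, 0))   # map predict over rows of x

w = jnp.zeros(3)
x = jnp.ones((8, 3))
y = jnp.ones(8)
print(grad_loss(w, x, y))           # gradient with respect to w
print(batched_predict(w, x).shape)  # (8,)
```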
- Neural Network Libraries
- Flax - Centered on flexibility and clarity (a minimal Flax + Optax training-step sketch follows this list).
- Haiku - Focused on simplicity, created by the authors of Sonnet at DeepMind.
- Objax - Has an object-oriented design similar to PyTorch.
- Elegy - A framework-agnostic Trainer interface for the JAX ecosystem. Supports Flax, Haiku, and Optax.
- RLax - Library for implementing reinforcement learning agents.
- Trax - "Batteries included" deep learning library focused on providing solutions for common workloads.
- Jraph - Lightweight graph neural network library.
- Neural Tangents - High-level API for specifying neural networks of both finite and infinite width.
- NumPyro - Probabilistic programming based on the Pyro library.
- Chex - Utilities to write and test reliable JAX code.
- Optax - Gradient processing and optimization library.
- JAX, M.D. - Accelerated, differentiable molecular dynamics.
- Coax - Turn RL papers into code, the easy way.
- SymJAX - Symbolic CPU/GPU/TPU programming.
- mcx - Express & compile probabilistic programs for performant inference.
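As a rough sketch of how some of the libraries above fit together (layer sizes, learning rate, and data are arbitrary placeholders, not taken from any of the listed projects), a Flax module can be trained with an Optax optimizer inside a jitted update step:

```python
import jax
import jax.numpy as jnp
import flax.linen as nn
import optax

class MLP(nn.Module):
    # Tiny illustrative network; the sizes are arbitrary.
    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(32)(x))
        return nn.Dense(1)(x)

model = MLP()
x = jnp.ones((4, 8))
y = jnp.zeros((4, 1))
params = model.init(jax.random.PRNGKey(0), x)

optimizer = optax.adam(1e-3)
opt_state = optimizer.init(params)

def loss_fn(params, x, y):
    return jnp.mean((model.apply(params, x) - y) ** 2)

@jax.jit
def train_step(params, opt_state, x, y):
    # Compute gradients, transform them with Optax, and apply the update.
    grads = jax.grad(loss_fn)(params, x, y)
    updates, opt_state = optimizer.update(grads, opt_state)
    params = optax.apply_updates(params, updates)
    return params, opt_state

params, opt_state = train_step(params, opt_state, x, y)
```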
This section contains libraries that are well-made and useful, but have not necessarily been battle-tested by a large userbase yet.
- Neural Network Libraries
- FedJAX - Federated learning in JAX, built on Optax and Haiku.
- jax-unirep - Library implementing the UniRep model for protein machine learning applications.
- jax-flows - Normalizing flows in JAX.
- sklearn-jax-kernels - scikit-learn kernel matrices using JAX.
- jax-cosmo - Differentiable cosmology library.
- efax - Exponential Families in JAX.
- mpi4jax - Combine MPI operations with your JAX code on CPUs and GPUs.
- imax - Image augmentations and transformations.
- Performer - Flax implementation of the Performer (linear transformer via FAVOR+) architecture.
- Reformer - Implementation of the Reformer (efficient transformer) architecture.
- Vision Transformer - Official implementation in Flax of An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale.
- Fourier Feature Networks - Official implementation of Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains (the core feature mapping is sketched after this list).
- Flax Models - Collection of open-sourced Flax models.
- JaxNeRF - Implementation of NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis with multi-device GPU/TPU support.
- Big Transfer (BiT) - Implementation of Big Transfer (BiT): General Visual Representation Learning.
- NuX - Normalizing flows with JAX.
- kalman-jax - Approximate inference for Markov (i.e., temporal) Gaussian processes using iterated Kalman filtering and smoothing.
- GPJax - Gaussian processes in JAX.
- jaxns - Nested sampling in JAX.
- Normalizer-Free Networks - Official Haiku implementation of NFNets.
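To illustrate the idea behind the Fourier Feature Networks entry above, here is a simplified sketch of the paper's random feature mapping (not code from the official repository; the projection matrix `B`, its scale, and the coordinate data are arbitrary choices):

```python
import jax
import jax.numpy as jnp

def fourier_features(v, B):
    # Map low-dimensional coordinates v (..., d) through random sinusoids
    # defined by the Gaussian projection B (num_features, d).
    proj = 2.0 * jnp.pi * (v @ B.T)
    return jnp.concatenate([jnp.cos(proj), jnp.sin(proj)], axis=-1)

key_B, key_v = jax.random.split(jax.random.PRNGKey(0))
B = 10.0 * jax.random.normal(key_B, (256, 2))   # the scale 10.0 is an arbitrary choice
coords = jax.random.uniform(key_v, (1024, 2))   # e.g. 2D pixel coordinates in [0, 1)
features = fourier_features(coords, B)          # (1024, 512), then fed to an MLP
```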
- NeurIPS 2020: JAX Ecosystem Meetup - JAX, its use at DeepMind, and discussion between engineers, scientists, and JAX core team.
- Introduction to JAX - Simple neural network from scratch in JAX.
- JAX: Accelerated Machine Learning Research | SciPy 2020 | VanderPlas - JAX's core design, how it's powering new research, and how you can start using it.
- Bayesian Programming with JAX + NumPyro — Andy Kitchen - Introduction to Bayesian modelling using NumPyro.
- JAX: Accelerated machine-learning research via composable function transformations in Python | NeurIPS 2019 | Skye Wanderman-Milne - JAX intro presentation in Program Transformations for Machine Learning workshop.
- JAX on Cloud TPUs | NeurIPS 2020 | Skye Wanderman-Milne and James Bradbury - Presentation of TPU host access with demo.
- Deep Implicit Layers - Neural ODEs, Deep Equilibrium Models, and Beyond | NeurIPS 2020 - Tutorial created by Zico Kolter, David Duvenaud, and Matt Johnson with Colab notebooks available in Deep Implicit Layers.
- Solving y=mx+b with Jax on a TPU Pod slice - Mat Kelcey - A four-part YouTube tutorial series with Colab notebooks that starts with JAX fundamentals and moves up to training with a data parallel approach on a v3-32 TPU Pod slice.
This section contains papers focused on JAX (e.g. JAX-based library whitepapers, research on JAX, etc). Papers implemented in JAX are listed in the Models/Projects section.
- Compiling machine learning programs via high-level tracing. Roy Frostig, Matthew James Johnson, Chris Leary. MLSys 2018. - White paper describing an early version of JAX, detailing how computation is traced and compiled.
- JAX, M.D.: A Framework for Differentiable Physics. Samuel S. Schoenholz, Ekin D. Cubuk. NeurIPS 2020. - Introduces JAX, M.D., a differentiable physics library which includes simulation environments, interaction potentials, neural networks, and more.
- Enabling Fast Differentially Private SGD via Just-in-Time Compilation and Vectorization. Pranav Subramani, Nicholas Vadivelu, Gautam Kamath. arXiv 2020. - Uses JAX's JIT and VMAP to achieve faster differentially private SGD than existing libraries (a simplified sketch of the per-example gradient pattern follows this list).
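The JIT-plus-vmap pattern that last paper builds on can be sketched roughly as follows (a simplified illustration with a toy linear model, not the authors' code): vmapping `grad` over the batch yields one gradient per example, which DP-SGD clips before noise is added.

```python
import jax
import jax.numpy as jnp

def loss(params, x, y):
    # Toy linear model; the paper's experiments use real networks.
    return (jnp.dot(x, params) - y) ** 2

# vmap over the batch axis of (x, y) produces one gradient per example.
per_example_grads = jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0, 0)))

def clip_grads(grads, max_norm=1.0):
    # Scale each per-example gradient so its norm is at most max_norm.
    norms = jnp.linalg.norm(grads, axis=-1, keepdims=True)
    return grads * jnp.minimum(1.0, max_norm / norms)

params = jnp.zeros(3)
x, y = jnp.ones((8, 3)), jnp.ones(8)
clipped = clip_grads(per_example_grads(params, x, y))  # (8, 3); noise is added afterwards
```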
- Using JAX to accelerate our research by David Budden and Matteo Hessel - Describes the state of JAX and the JAX ecosystem at DeepMind.
- Getting started with JAX (MLPs, CNNs & RNNs) by Robert Lange - Neural network building blocks from scratch with the basic JAX operators.
- Tutorial: image classification with JAX and Flax Linen by 8bitmp3 - Learn how to create a simple convolutional network with Flax's Linen API and train it to recognize handwritten digits.
- Plugging Into JAX by Nick Doiron - Compares Flax, Haiku, and Objax on the Kaggle flower classification challenge.
- Meta-Learning in 50 Lines of JAX by Eric Jang - Introduction to both JAX and Meta-Learning (a minimal inner/outer-loop sketch appears after this list).
- Normalizing Flows in 100 Lines of JAX by Eric Jang - Concise implementation of RealNVP.
- Differentiable Path Tracing on the GPU/TPU by Eric Jang - Tutorial on implementing path tracing.
- Ensemble networks by Mat Kelcey - Ensemble nets are a method of representing an ensemble of models as a single logical model.
- Out of distribution (OOD) detection by Mat Kelcey - Implements different methods for OOD detection.
- Understanding Autodiff with JAX by Srihari Radhakrishna - Understand how autodiff works using JAX.
- From PyTorch to JAX: towards neural net frameworks that purify stateful code by Sabrina J. Mielke - Showcases how to go from a PyTorch-like style of coding to a more functional style of coding.
- Extending JAX with custom C++ and CUDA code by Dan Foreman-Mackey - Tutorial demonstrating the infrastructure required to provide custom ops in JAX.
- Evolving Neural Networks in JAX by Robert Tjarko Lange - Explores how JAX can power the next generation of scalable neuroevolution algorithms.
- Exploring hyperparameter meta-loss landscapes with JAX by Luke Metz - Demonstrates how to use JAX to perform inner-loss optimization with SGD and Momentum, outer-loss optimization with gradients, and outer-loss optimization using evolutionary strategies.
- Deterministic ADVI in JAX by Martin Ingram - Walkthrough of implementing automatic differentiation variational inference (ADVI) easily and cleanly with JAX.
- Evolved channel selection by Mat Kelcey - Trains a classification model robust to different combinations of input channels at different resolutions, then uses a genetic algorithm to decide the best combination for a particular loss.
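As a rough illustration of the inner/outer-loop pattern discussed in the meta-learning posts above (a toy linear model and a hand-picked inner learning rate, not code from either post):

```python
import jax
import jax.numpy as jnp

def loss(params, x, y):
    # Toy regression loss used for both the inner and outer loops.
    return jnp.mean((jnp.dot(x, params) - y) ** 2)

def inner_update(params, x, y, lr=0.1):
    # One SGD step on a task's support set (the inner loop).
    return params - lr * jax.grad(loss)(params, x, y)

def maml_loss(params, x_support, y_support, x_query, y_query):
    # The outer loss is evaluated at the adapted parameters; taking its
    # gradient with jax.grad differentiates through the inner SGD step.
    adapted = inner_update(params, x_support, y_support)
    return loss(adapted, x_query, y_query)

meta_grad = jax.jit(jax.grad(maml_loss))
params = jnp.zeros(3)
xs, ys = jnp.ones((5, 3)), jnp.ones(5)
xq, yq = jnp.ones((5, 3)), jnp.zeros(5)
print(meta_grad(params, xs, ys, xq, yq))  # gradient through the inner update
```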
Contributions welcome! Read the contribution guidelines first.