google-deepmind/optax
Optax is a gradient processing and optimization library for JAX.
Python · Apache-2.0
Issues
- Add ACProp (#965, opened by carlosgmartin)
- Memory leak in optax.radam (#969, opened by lukekulik)
- Timeline for JaxOpt migration (#977, opened by Joshuaalbert)
- Intended usage of the Sophia optimiser (#968, opened by vvvm23)
- Documentation for cosine decay schedule (#905, opened by gjhuizing)
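For context on the cosine decay issue: `optax.cosine_decay_schedule` follows the standard cosine-annealing formula, decaying from `init_value` toward `alpha * init_value` over `decay_steps`. A minimal pure-Python sketch (the helper name is ours, not part of optax's API):

```python
import math

def cosine_decay(init_value, decay_steps, count, alpha=0.0):
    """Illustrative sketch of the standard cosine-annealing formula.

    Decays from init_value at count=0 toward alpha * init_value at
    count >= decay_steps.
    """
    progress = min(count, decay_steps) / decay_steps
    cosine = 0.5 * (1.0 + math.cos(math.pi * progress))
    return init_value * ((1.0 - alpha) * cosine + alpha)

print(cosine_decay(1.0, 100, 0))    # 1.0 at the start
print(cosine_decay(1.0, 100, 50))   # 0.5 at the midpoint (alpha=0)
print(cosine_decay(1.0, 100, 100))  # 0.0 at the end (alpha=0)
```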
- unitwise_norm fails for 3D convolutions (#906, opened by froody)
- How to cleanly specify optimizer, schedule, and gradient clipping with inject_hyperparams (#964, opened by timothyas)
- Jaxopt vs Optax (#959, opened by pushkar5586)
- Add an assignment problem solver (#954, opened by carlosgmartin)
- Add a `nesterov` flag to `radam` optimizer (#944, opened by carlosgmartin)
- Passing arguments to train multiple models in parallel (#932, opened by kclauw)
- README: dead link to "Optax 101" notebook (#937, opened by jeertmans)
- Add a mathematical description of the algorithms (#757, opened by vroulet)
- Implement Schedule-Free Learning (#910, opened by ameya98)
- Recommendation for the `eps_root` setting for differentiating through the Adam optimizer (#882, opened by itk22)
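Background on the `eps_root` issue: Adam rescales updates by roughly `m / (sqrt(v) + eps)`, and the derivative of `sqrt(v)` is unbounded as `v -> 0`, which is a problem when differentiating *through* the optimizer (e.g. for meta-gradients). Moving a small constant inside the square root keeps that derivative finite. A plain-Python sketch of the denominator (illustrative, not optax's implementation):

```python
import math

def adam_denominator(v, eps=1e-8, eps_root=0.0):
    """Sketch of Adam's rescaling denominator with eps_root inside the sqrt."""
    return math.sqrt(v + eps_root) + eps

def ddv(f, v, h=1e-12):
    # Crude forward finite-difference derivative with respect to v.
    return (f(v + h) - f(v)) / h

# Near v = 0 the derivative explodes without eps_root...
steep = ddv(lambda v: adam_denominator(v, eps_root=0.0), 0.0)
# ...but stays bounded (about 1 / (2 * sqrt(eps_root))) with a small eps_root.
flat = ddv(lambda v: adam_denominator(v, eps_root=1e-8), 0.0)
print(steep > 1e4)  # True
print(flat < 1e4)   # True
```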
- Error in the lookahead optimizer example (#860, opened by fabianp)
- Inconsistencies in schedules API (#835, opened by fabianp)
- Move linear algebra operations into utilities (#831, opened by fabianp)
- Broken tarball for release `v0.2.0` (#854, opened by daskol)
- `dadapt_adamw`, `combine`, and `prodigy` raise `TypeError: 'type' object is not subscriptable` (#849, opened by sbb2002)
- Incompatibility between zero_nans() and MultiSteps (#828, opened by TheMr33)
- Remove multi_normal from docs (in utilities.rst) or add a docstring to multi_normal (in utils.py) (#842, opened by vroulet)
- Math not being displayed in ogda_example.ipynb (#829, opened by fabianp)
- Bad docstring formatting for SAM (#826, opened by fabianp)
- Remove quickstart example (https://github.com/google-deepmind/optax/blob/main/examples/quick_start.ipynb), now that it has been merged into optax-101.ipynb (#809, opened by fabianp)
- Move DPSGD docs to the contrib/ section (#773, opened by fabianp)
- Optax is incompatible with jax==0.4 (#694, opened by wqlevi)
- Examples mlp_mnist.ipynb and differentially_private_sgd.ipynb appear in the gallery but not in the side menu (#767, opened by fabianp)
- Slow examples are not really executed (#786, opened by fabianp)
- Type annotation for `params` in function `net` (#784, opened by yixiaoer)
- Remove license statement from notebooks (#755, opened by fabianp)
- Align equations left in the docs (#758, opened by vroulet)
- Remove redundant documentation from README.md (#753, opened by fabianp)
- Statistical Adaptive Stochastic Gradient Methods (#703, opened by fabianp)
- pytype errors when running tests locally (#728, opened by fabianp)
- Add the tree_util module to the API documentation (#685, opened by fabianp)
- Replace tree_util with tree_utils in docstrings (#684, opened by fabianp)
- L1 Loss (#700, opened by MaanasArora)
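On the L1 loss request: an L1 loss is simply the mean absolute error between predictions and targets. A minimal pure-Python sketch (the function name is ours; this is not optax's API):

```python
def l1_loss(predictions, targets):
    """Mean absolute error between two equal-length sequences.

    Illustrative sketch of an L1 loss, not optax's implementation.
    """
    assert len(predictions) == len(targets)
    return sum(abs(p - t) for p, t in zip(predictions, targets)) / len(predictions)

print(l1_loss([1.0, 2.0, 3.0], [1.0, 0.0, 6.0]))  # (0 + 2 + 3) / 3
```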
- Python version for optax? (#690, opened by ahmadmustafaanis)
- Create an example using reduce_on_plateau (#679, opened by fabianp)