google-deepmind/optax
Optax is a gradient processing and optimization library for JAX.
Python · Apache-2.0
Issues
Performance issue: multi_transform and set_to_zero don't prevent computation
#993 opened by YanisJouanaud - 11
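A minimal sketch of the pattern this report concerns, assuming the standard `optax.multi_transform` API; the parameter names are illustrative:

```python
import jax.numpy as jnp
import optax

# Freeze one parameter group by routing it to `set_to_zero`. The report
# is that this does not save compute: gradients for the frozen group are
# still computed and only zeroed out afterwards.
params = {'backbone': jnp.ones(3), 'head': jnp.ones(3)}
labels = {'backbone': 'freeze', 'head': 'train'}

tx = optax.multi_transform(
    {'train': optax.adam(1e-3), 'freeze': optax.set_to_zero()},
    labels,
)
state = tx.init(params)
grads = {'backbone': jnp.ones(3), 'head': jnp.ones(3)}
updates, state = tx.update(grads, state, params)
# updates['backbone'] is all zeros, but its gradient was still computed.
```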
Timeline for JaxOpt migration
#977 opened by Joshuaalbert - 4
Passing arguments to train multiple models in parallel
#932 opened by kclauw - 1
Support for CSR-format sparse matrices in optimizers?
#994 opened by MoFHeka - 2
Conversion to TFLite failed
#1047 opened by JuanFMontesinos - 5
Allow RMSProp to use the same scaling as Adam without momentum + make schedule_free_adamw use rmsprop directly (to spare one memory slot)
#1006 opened by vroulet - 1
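A hedged illustration of the request: `optax.adam` with `b1=0.0` already gives momentum-free, RMSProp-like behaviour with Adam's scaling (epsilon added outside the square root, with bias correction), but it carries an unused momentum slot; the ask is for `rmsprop` to offer the same scaling directly.

```python
import jax.numpy as jnp
import optax

params = {'w': jnp.zeros(2)}
grads = {'w': jnp.array([0.1, -0.2])}

rms = optax.rmsprop(learning_rate=1e-3)                    # eps inside the sqrt
adam_no_momentum = optax.adam(learning_rate=1e-3, b1=0.0)  # eps outside, extra slot

for tx in (rms, adam_no_momentum):
    state = tx.init(params)
    updates, _ = tx.update(grads, state, params)
```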
Documentation for cosine decay schedule
#905 opened by gjhuizing - 2
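For context, a minimal sketch of the schedule the documentation issue concerns, using the public `optax.cosine_decay_schedule` signature:

```python
import optax

# Anneals from `init_value` down to `alpha * init_value` over
# `decay_steps` steps, following a cosine curve.
schedule = optax.cosine_decay_schedule(
    init_value=1e-3,    # starting learning rate
    decay_steps=1_000,  # steps over which to decay
    alpha=0.1,          # final value, as a fraction of init_value
)
print(schedule(0))      # ~1e-3
print(schedule(1_000))  # ~1e-4
```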
Feature request: GaLore optimizer
#1028 opened by gil2rok - 10
LBFGS not working for custom classes
#1017 opened by Michaelhess17 - 10
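A hedged sketch of `optax.lbfgs` on a plain pytree of arrays, following the pattern in the optax documentation; the report is about failures when the parameters are instances of custom classes rather than standard pytrees:

```python
import jax.numpy as jnp
import optax

def f(params):
    return jnp.sum((params['w'] - 1.0) ** 2)

params = {'w': jnp.zeros(3)}
opt = optax.lbfgs()
state = opt.init(params)
value_and_grad = optax.value_and_grad_from_state(f)

for _ in range(10):
    value, grad = value_and_grad(params, state=state)
    updates, state = opt.update(
        grad, state, params, value=value, grad=grad, value_fn=f
    )
    params = optax.apply_updates(params, updates)
```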
Intended usage of the Sophia optimiser
#968 opened by vvvm23 - 0
Remove multi_normal from docs (in utilities.rst) or add docstring to multi_normal (in utils.py).
#842 opened by vroulet - 4
Inconsistencies in schedules API
#835 opened by fabianp - 0
Add an assignment problem solver
#954 opened by carlosgmartin - 1
Stochasticity in Adam
#996 opened by Cattaneo123 - 1
Loose dependency allows a chex version without 'warn_deprecated_function' to be installed
#995 opened by phcavelar - 2
Add ACProp
#965 opened by carlosgmartin - 2
Memory leak in optax.radam
#969 opened by lukekulik - 1
unitwise_norm fails for 3D convolutions
#906 opened by froody - 3
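`unitwise_norm` is the helper behind adaptive gradient clipping; the public entry point it backs is `optax.adaptive_grad_clip`, sketched here with an illustrative clipping factor:

```python
import optax

# Adaptive gradient clipping scales each unit's gradient relative to the
# unit-wise norm of its parameters; the report is that the norm helper
# fails on the 5-D weight tensors used by 3-D convolutions.
tx = optax.chain(
    optax.adaptive_grad_clip(clipping=0.01),
    optax.sgd(1e-3),
)
```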
How to Cleanly Specify Optimizer, Schedule, & Gradient Clipping with inject_hyperparams
#964 opened by timothyas - 1
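One hedged way to combine the three, assuming the standard `inject_hyperparams` wrapper; the schedule values are illustrative:

```python
import optax

schedule = optax.warmup_cosine_decay_schedule(
    init_value=0.0, peak_value=1e-3,
    warmup_steps=100, decay_steps=1_000,
)

# Clipping runs first; wrapping the optimizer factory in
# `inject_hyperparams` keeps the resolved learning rate inspectable.
tx = optax.chain(
    optax.clip_by_global_norm(1.0),
    optax.inject_hyperparams(optax.adamw)(learning_rate=schedule),
)
# After an update, the current learning rate can be read from the
# optimizer state, e.g. state[1].hyperparams['learning_rate'].
```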
Jaxopt vs Optax
#959 opened by pushkar5586 - 21
Add a `nesterov` flag to `radam` optimizer
#944 opened by carlosgmartin - 2
README: dead link to "Optax 101" notebook
#937 opened by jeertmans - 5
Implement Schedule-Free Learning
#910 opened by ameya98 - 2
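Schedule-free learning has since landed in `optax.contrib`; a hedged sketch of the AdamW variant, whose exact names and signatures may differ between versions:

```python
import jax.numpy as jnp
import optax

params = {'w': jnp.zeros(3)}
opt = optax.contrib.schedule_free_adamw(learning_rate=1e-3, b1=0.9)
state = opt.init(params)

grads = {'w': jnp.ones(3)}
updates, state = opt.update(grads, state, params)
params = optax.apply_updates(params, updates)

# Evaluation should use the averaged parameters, not the raw ones.
eval_params = optax.contrib.schedule_free_eval_params(state, params)
```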
Recommendation for the `eps_root` setting for differentiating through the Adam optimizer
#882 opened by itk22 - 0
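For context, the knob in question: `adam` adds `eps_root` inside the square root of the second-moment term. The gradient of `sqrt(x)` blows up at `x = 0`, so a small positive `eps_root` is the usual fix when differentiating through the update; the value below is illustrative:

```python
import optax

inner_opt = optax.adam(learning_rate=1e-3, eps=1e-8, eps_root=1e-8)
```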
Error in the lookahead optimizer example
#860 opened by fabianp - 0
Move linear algebra operations into utilities
#831 opened by fabianp - 3
Broken tarball for release `v0.2.0`
#854 opened by daskol - 1
"dadapt_adamw, combine, prodigy" occur "TypeError: 'type' object is not subscriptable."
#849 opened by sbb2002 - 5
Incompatibility between zero_nans() and MultiSteps
#828 opened by TheMr33 - 0
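The combination the issue reports as incompatible, sketched with illustrative values:

```python
import jax.numpy as jnp
import optax

# `zero_nans` replaces NaN gradients with zeros; `MultiSteps` accumulates
# gradients over `every_k_schedule` calls before applying an update.
inner = optax.chain(optax.zero_nans(), optax.adam(1e-3))
tx = optax.MultiSteps(inner, every_k_schedule=4)

params = {'w': jnp.zeros(2)}
state = tx.init(params)
grads = {'w': jnp.array([jnp.nan, 1.0])}
updates, state = tx.update(grads, state, params)
```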
Math not being displayed in ogda_example.ipynb
#829 opened by fabianp - 0
Bad docstring formatting for SAM
#826 opened by fabianp - 2
Remove quickstart example (https://github.com/google-deepmind/optax/blob/main/examples/quick_start.ipynb), now that it has been merged into optax-101.ipynb
#809 opened by fabianp - 1
Move DPSGD docs to the contrib/ section
#773 opened by fabianp - 2
Slow examples are not actually executed
#786 opened by fabianp - 1
Type annotation for `params` in function `net`
#784 opened by yixiaoer