
Add new SOTA optimizers to Cerebros


Kind of issue: feature-request-or-enhancement

To the stable API, add:

tf.keras.optimizers.AdamW
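
For reference, a minimal instantiation sketch, assuming TensorFlow 2.11+ where AdamW is available in the stable optimizer namespace (the learning_rate and weight_decay values below are placeholders, not proposed defaults):

```python
import tensorflow as tf

# AdamW = Adam with decoupled weight decay; both values are placeholders.
optimizer = tf.keras.optimizers.AdamW(learning_rate=1e-3, weight_decay=1e-4)
```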

To the experimental API, add the Lion optimizer:

https://github.com/GLambard/Lion-tensorflow

Ultimately, make the selection of Adam, AdamW, or Lion a tunable param. All three take learning_rate as a param, so the same tuned learning rate can be passed to whichever optimizer is chosen. Since Katib can't directly select a class, the selection will need to be an integer key referencing which optimizer to use, as sketched below.
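
A minimal sketch of what the integer-keyed selection could look like. The registry and `build_optimizer` helper are hypothetical names for illustration, not existing Cerebros API; the Lion entry is commented out because the import path depends on how the linked implementation is packaged:

```python
import tensorflow as tf

# Hypothetical integer -> optimizer mapping so Katib can tune the choice
# as an integer hyperparameter (Katib can't select a class directly).
OPTIMIZER_REGISTRY = {
    0: tf.keras.optimizers.Adam,
    1: tf.keras.optimizers.AdamW,
    # 2: Lion,  # e.g. from the GLambard/Lion-tensorflow repo; import path TBD
}

def build_optimizer(optimizer_id: int, learning_rate: float) -> tf.keras.optimizers.Optimizer:
    """Resolve a tuned integer key to an optimizer instance.

    All candidates (Adam, AdamW, Lion) accept learning_rate, so the same
    tuned value can be passed regardless of which key Katib selects.
    """
    optimizer_cls = OPTIMIZER_REGISTRY[optimizer_id]
    return optimizer_cls(learning_rate=learning_rate)

# Example: a trial where Katib proposed optimizer_id=1 and learning_rate=3e-4
optimizer = build_optimizer(optimizer_id=1, learning_rate=3e-4)
```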