metaopt/torchopt

[Feature Request] Differentiable Adan optimizer support

Benjamin-eecs opened this issue · 0 comments

Motivation

The Adan optimizer has attracted a lot of attention recently; adding differentiable support for it would be a useful feature for meta-learning research.

Resource
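
  • Adan paper: Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models (arXiv:2208.06677)
  • Official reference implementation: https://github.com/sail-sg/Adan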

Roadmap

  • low-level functional alias
  • high-level Optimizer variant
  • high-level MetaOptimizer variant (see the usage sketch after this list)
  • low-level tests
  • high-level Optimizer tests
  • API documentation
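
A minimal usage sketch of what the proposed entry points might look like, assuming they mirror the existing Adam naming convention (torchopt.adam / torchopt.Adam / torchopt.MetaAdam). The names adan, Adan, and MetaAdan below are hypothetical and not yet implemented; only torchopt.apply_updates and the functional init/update pattern are existing TorchOpt API.

```python
# Hypothetical usage sketch: `torchopt.adan`, `torchopt.Adan`, and
# `torchopt.MetaAdan` do not exist yet; they are assumed here to follow
# the existing Adam entry points.
import torch
import torchopt

net = torch.nn.Linear(4, 1)
params = tuple(net.parameters())

# Low-level functional alias (proposed): returns a gradient transformation
# with `init`/`update`, following the existing functional API.
optim = torchopt.adan(lr=1e-3)
opt_state = optim.init(params)

loss = net(torch.randn(8, 4)).mean()
grads = torch.autograd.grad(loss, params)
updates, opt_state = optim.update(grads, opt_state)
params = torchopt.apply_updates(params, updates)

# High-level Optimizer wrapper (proposed), analogous to torchopt.Adam:
#   optimizer = torchopt.Adan(net.parameters(), lr=1e-3)
# Differentiable MetaOptimizer wrapper (proposed), analogous to torchopt.MetaAdam:
#   meta_optimizer = torchopt.MetaAdan(net, lr=1e-3)
```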

Checklist

  • I have checked that there is no similar issue in the repo (required)