A simple test suite I wrote for testing convergence of my MLX implementations of the ADOPT and AMSGrad optimizers.
To add more tests, first create an implementation of the test function (for example, the Rosenbrock function), derived from `mx.nn.Module`. Next, create a class derived from the `OptimizerTest` class, and initialise the true optimal point and true optimal loss, as well as the desired error margins, as in the sketch below.
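
A minimal sketch of what such a test might look like, assuming `OptimizerTest` takes the model, the true optimum, and the error margins through its constructor; the import path and parameter names (`true_optimal_point`, `point_margin`, etc.) are illustrative assumptions, not the suite's actual API:

```python
import mlx.core as mx
import mlx.nn as nn

# from optimizer_test import OptimizerTest  # hypothetical import path

class Rosenbrock(nn.Module):
    """Rosenbrock function f(x, y) = (1 - x)^2 + 100 * (y - x^2)^2,
    whose global minimum is 0 at (1, 1)."""

    def __init__(self):
        super().__init__()
        # Trainable point, initialised away from the optimum.
        self.point = mx.array([-1.2, 1.0])

    def __call__(self):
        x, y = self.point[0], self.point[1]
        return (1.0 - x) ** 2 + 100.0 * (y - x * x) ** 2


class RosenbrockTest(OptimizerTest):
    def __init__(self):
        super().__init__(
            model=Rosenbrock(),
            true_optimal_point=mx.array([1.0, 1.0]),  # known minimiser
            true_optimal_loss=0.0,                    # known minimum loss
            point_margin=1e-2,  # allowed error on the recovered point (assumed name)
            loss_margin=1e-3,   # allowed error on the final loss (assumed name)
        )
```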
To run the test suite, run `python main.py <optimizer> <learning_rate>`, for example: `python main.py "adam" "0.001"`.
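
For reference, the entry point presumably maps the optimizer name to an optimizer instance before running the tests. A sketch of that dispatch, under the assumption that the custom optimizers are registered by name (only `mlx.optimizers.Adam` shown here is a stock MLX optimizer):

```python
# Sketch of the main.py dispatch (assumed structure, not the repo's actual code).
import sys
import mlx.optimizers as optim

def make_optimizer(name: str, learning_rate: float):
    if name == "adam":
        return optim.Adam(learning_rate=learning_rate)  # stock MLX optimizer
    # "adopt" and "amsgrad" would map to the custom implementations under test.
    raise ValueError(f"unknown optimizer: {name}")

if __name__ == "__main__":
    name, lr = sys.argv[1], float(sys.argv[2])
    optimizer = make_optimizer(name, lr)
    # ... instantiate each OptimizerTest subclass and run it with `optimizer` ...
```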