This demo visualizes several MATLAB derivative-free optimizers at work on standard test functions. This is purely for demonstration purposes. For a proper benchmark of different MATLAB optimizers, see [1].
Follow me on Twitter for updates about other projects I am involved with, or drop me an email at luigi.acerbi@unige.ch to talk about computational modeling, optimization, and (approximate) Bayesian inference.
I have been giving seminars and tutorials on optimization, model fitting, and model comparison. If you are interested, see my webpage.
The optimization algorithms visualized here are:
- BADS (Bayesian adaptive direct search), a novel algorithm that combines a direct search approach with local Bayesian optimization (link);
- `fminsearch` (Nelder-Mead), the standard simplex method for nonlinear optimization;
- `fmincon`, a powerful method for constrained optimization based on numerical approximation of the gradient;
- `ga` (genetic algorithms), a heuristic population-based method for global optimization;
- MCS (Multi-level coordinate search), an advanced method for global optimization (link);
- CMA-ES (Covariance matrix adaptation - evolution strategies), a state-of-the-art method for nonconvex optimization (link).
We see here an example on the Rosenbrock banana function:
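For reference, here is a minimal MATLAB sketch of the Rosenbrock function in its standard two-dimensional form (coefficients 1 and 100); the demo's own implementation may be parameterized differently:

```matlab
% Rosenbrock "banana" function; the global minimum f(x) = 0 lies at x = (1, 1),
% inside a long, narrow, curved valley that is hard for many optimizers to follow.
rosenbrock = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;

rosenbrock([1, 1])   % returns 0 at the global minimum
```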
We see how the algorithms react to noise by adding unit Gaussian noise to each function evaluation:
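A noisy version of any test function can be obtained by adding a fresh draw of unit Gaussian noise per evaluation, along these lines (a sketch; the demo may implement this differently):

```matlab
% Wrap a deterministic objective so that every call adds zero-mean,
% unit-variance Gaussian noise; repeated evaluations of the same point
% then return different values.
noisy = @(fun) @(x) fun(x) + randn();

noisyRosenbrock = noisy(@(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2);
```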
We see here another noiseless example on the Ackley function:
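The Ackley function is defined for any dimension; a minimal MATLAB sketch with the standard constants (a = 20, b = 0.2, c = 2*pi):

```matlab
% Ackley function; highly multimodal, with global minimum f(0) = 0 at the origin.
% Takes a row or column vector x of any length.
ackley = @(x) -20*exp(-0.2*sqrt(mean(x.^2))) ...
              - exp(mean(cos(2*pi*x))) + 20 + exp(1);

ackley([0, 0])   % returns 0 (up to floating-point error)
```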
- BADS works well on these examples, which were chosen to show how different algorithms explore the space. More generally, BADS is best for functions with a noisy or jagged landscape, and with non-negligible computational cost (see here). BADS is available as a ready-to-use MATLAB toolbox here.
- `fminsearch` is a generic optimizer that can deal with simple functions, but it should never be the main choice, as there are always better alternatives.
- `fmincon` is generally superior to most optimizers (and in particular to `fminsearch`) on smooth functions. However, `fmincon` deals very badly with jagged or noisy landscapes.
- We are not aware of scenarios in which `ga` is a good off-the-shelf choice for continuous-valued optimization. It is often just barely better than random search.
- MCS can be a great optimizer, but it is somewhat idiosyncratic (it might converge very quickly to a solution).
- CMA-ES, despite the poor performance shown here, is a good optimizer if allowed a very large number of function evaluations.
These animated gifs can be generated via the `optimviz.m` function. You can easily test different optimizers and add other functions.
The generated animated gifs are uncompressed. We recommend compressing them before using them in any form (e.g., via an online tool).
To run some of these algorithms you will need MATLAB's Optimization Toolbox and Global Optimization Toolbox.
For more details about the benchmark comparing different MATLAB optimizers on artificial and real applied problems (fitting of computational models), see the following reference:
- Acerbi, L. & Ma, W. J. (2017). Practical Bayesian Optimization for Model Fitting with Bayesian Adaptive Direct Search. In Advances in Neural Information Processing Systems 30, pages 1834-1844. (link, arXiv preprint)
For more info about my work in machine learning and computational neuroscience, follow me on Twitter: https://twitter.com/AcerbiLuigi
OptimViz is released under the terms of the GNU General Public License v3.0.