# mle-autodiff
Experiments using automatic differentiation with DiffSharp for maximum likelihood estimation.
## Notebooks
### `01_simple_weibull.dib`

Maximum likelihood estimation for a simple Weibull model; see the accompanying blog post.
### `02_optimizers.dib`

Explores function minimization using DiffSharp's built-in SGD and Adam optimizers.
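The Adam update rule the notebook relies on can be illustrated independently of DiffSharp. The following plain-Python sketch (the quadratic test function, learning rate, and step count are illustrative assumptions, not taken from the notebooks) minimizes a simple function with a hand-rolled Adam loop:

```python
import math

def grad_f(x):
    # Gradient of f(x) = (x - 3)^2, whose minimizer is x = 3.
    return 2.0 * (x - 3.0)

def adam_minimize(grad, x0, lr=0.01, beta1=0.9, beta2=0.999,
                  eps=1e-8, steps=10_000):
    """Minimize a 1-D function given its gradient, using Adam updates."""
    x, m, v = x0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g        # first-moment (mean) estimate
        v = beta2 * v + (1 - beta2) * g * g    # second-moment estimate
        m_hat = m / (1 - beta1 ** t)           # bias correction
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

x_min = adam_minimize(grad_f, x0=0.0)          # approaches the minimizer 3.0
```

In a DiffSharp notebook the gradient would come from autodiff rather than being hand-derived; the loop above only shows the optimizer mechanics.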
## Questions
- How to terminate gradient descent when using the built-in optimizers (what stopping criteria are available)
- How to use the built-in optimizers with functions taking mixed argument types (e.g. bool and float)
- How to set the built-in optimizers' objective to minimization (for MLE, this means minimizing the negative log-likelihood)
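The first and third questions can be made concrete with a hand-rolled loop. The following sketch is plain Python rather than F#/DiffSharp, and the dataset, learning rate, and tolerance are illustrative assumptions. It minimizes the Weibull negative log-likelihood (turning the MLE maximization into a minimization) and stops on an explicit gradient-norm criterion; a central-difference gradient stands in for the exact derivatives autodiff would supply:

```python
import math

def weibull_nll(params, data):
    # Mean negative log-likelihood of Weibull(shape=k, scale=lam).
    # MLE maximizes the likelihood, so we minimize its negation.
    log_k, log_lam = params                 # log-parametrization keeps k, lam > 0
    k, lam = math.exp(log_k), math.exp(log_lam)
    n = len(data)
    ll = n * math.log(k) - n * k * math.log(lam)
    ll += (k - 1) * sum(math.log(x) for x in data)
    ll -= sum((x / lam) ** k for x in data)
    return -ll / n

def num_grad(f, p, h=1e-6):
    # Central-difference gradient: a stand-in for autodiff in this sketch.
    g = []
    for i in range(len(p)):
        hi, lo = p[:], p[:]
        hi[i] += h
        lo[i] -= h
        g.append((f(hi) - f(lo)) / (2 * h))
    return g

def fit(data, lr=0.1, tol=1e-5, max_steps=20_000):
    p = [0.0, 0.0]                          # start at k = lam = 1
    for _ in range(max_steps):
        g = num_grad(lambda q: weibull_nll(q, data), p)
        if max(abs(gi) for gi in g) < tol:  # explicit stopping criterion
            break
        p = [pi - lr * gi for pi, gi in zip(p, g)]
    return math.exp(p[0]), math.exp(p[1])

data = [0.8, 1.2, 1.5, 1.9, 2.3, 2.6, 3.1, 3.9]   # illustrative sample
k_hat, lam_hat = fit(data)
```

With DiffSharp's built-in optimizers the open question is precisely where to hook in such a tolerance check; the hand-rolled loop makes the criterion explicit at the cost of writing the iteration yourself.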