johnmyleswhite/julia_tutorials

not an issue/just a bit of discussion

PaulSoderlind opened this issue · 6 comments

Hi,

I took a quick look at your MLE tutorial. It looks nice. I just have one question and then a suggestion.

  1. Why not optimize(β -> -log_likelihood(X, y, β)...)? Because you want to reuse the nll?
  2. I also have some tutorials on MLE. They include a simple example of both traditional and robust standard errors. Maybe of interest.

  1. Yes, I wanted to reuse nll in other places.
  2. I'll add a pointer to your tutorial.
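The two call styles contrasted above can be sketched as follows. This is an illustrative stand-in, not the tutorial's exact code: log_likelihood here is a toy Gaussian linear-model log-likelihood, and X and y are made-up data. Either form of the objective can be handed to Optim.jl's optimize.

```julia
# Stand-in log-likelihood: Gaussian linear model with unit variance.
# (Illustrative only -- not the tutorial's actual model.)
function log_likelihood(X, y, β)
    r = y .- X * β
    return -0.5 * sum(abs2, r) - 0.5 * length(y) * log(2π)
end

X = [1.0 0.0; 1.0 1.0; 1.0 2.0]
y = [0.9, 2.1, 2.9]

# Style 1: a named wrapper, reusable elsewhere
# (e.g. later, for computing standard errors from its Hessian).
nll(β) = -log_likelihood(X, y, β)

# Style 2: an anonymous closure passed inline, as in
#   optimize(β -> -log_likelihood(X, y, β), β0)
inline = β -> -log_likelihood(X, y, β)

β0 = [0.0, 0.0]
nll(β0) == inline(β0)  # identical objective; only reusability differs
```

The only real difference is that the named nll can be called again elsewhere, which is the reuse motivation given above.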

Thanks!
I was merely suggesting adding something on the Hessian/gradients. Best, Paul

No worries: I think it's a great idea to link people to a discussion of robust standard errors.

This tutorial looks nice!
I am running an MLE, but the likelihood function is based on numerical integration, so I don't have analytic expressions for the Hessian. It would be great if there were an example covering this case.

Sounds like you need finite differences. Have you tried https://github.com/JuliaDiff/FiniteDiff.jl?
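A minimal sketch of the idea FiniteDiff.jl automates: central-difference second derivatives of a negative log-likelihood whose value itself comes from numerical integration, so no analytic Hessian exists. Everything here is a made-up stand-in (the integrated_density model, the trapezoid rule, the hand-rolled fd_hessian); in practice FiniteDiff.finite_difference_hessian does this with much better step-size handling.

```julia
using LinearAlgebra

# Stand-in likelihood: each observation's density requires numerical
# integration (a latent u ~ Uniform(-1, 1) is integrated out via the
# trapezoid rule), so the Hessian has no closed form.
function integrated_density(y, θ; n = 201)
    us = range(-1.0, 1.0; length = n)
    σ = exp(θ[2])
    f(u) = exp(-0.5 * ((y - θ[1] - u) / σ)^2) / (σ * sqrt(2π)) / 2
    vals = f.(us)
    return step(us) * (sum(vals) - 0.5 * (vals[1] + vals[end]))
end

nll(θ, ys) = -sum(log(integrated_density(y, θ)) for y in ys)

# Hand-rolled central-difference Hessian -- the idea behind
# FiniteDiff.finite_difference_hessian(f, θ), minus the careful
# adaptive step-size selection that library provides.
function fd_hessian(f, θ; h = 1e-4)
    k = length(θ)
    H = zeros(k, k)
    for i in 1:k, j in 1:k
        ei = zeros(k); ei[i] = h
        ej = zeros(k); ej[j] = h
        H[i, j] = (f(θ + ei + ej) - f(θ + ei - ej) -
                   f(θ - ei + ej) + f(θ - ei - ej)) / (4h^2)
    end
    return Symmetric(H)
end

ys = [0.2, -0.4, 1.1, 0.7]
θ̂ = [0.4, 0.0]              # pretend this is the MLE
H = fd_hessian(θ -> nll(θ, ys), θ̂)
# At the true optimum, inv(H) estimates the asymptotic covariance matrix.
```

The same H then feeds the usual standard-error formulas (plain or robust) exactly as in the analytic-Hessian case.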