Fisher Information Matrix computation
TorkelE opened this issue · 4 comments
The Fisher information matrix (FIM, essentially the Hessian of the cost function) is frequently used in systems biology for practical identifiability analysis. I think this would be useful to have access to for various inverse problems. In practice this should be possible to compute for most SciML inverse problems using SciMLSensitivity. I tried creating such a function myself, but didn't really manage to figure out the right sub-functions.
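For reference, under the usual regularity conditions the FIM equals the expected outer product of the score, which coincides with the expected negative Hessian of the log-likelihood; this is what motivates the "Hessian of the cost function" description above:

```latex
F(\theta)
  = \mathbb{E}\!\left[
      \nabla_\theta \log p(y \mid \theta)\,
      \nabla_\theta \log p(y \mid \theta)^\top
    \right]
  = -\,\mathbb{E}\!\left[ \nabla_\theta^2 \log p(y \mid \theta) \right].
```

So when the cost function is a negative log-likelihood, evaluating its Hessian at the fitted parameters gives the (observed) FIM used in practical identifiability analysis.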
Would it make sense to have this? In practice, we would basically want a `fisher_information_matrix` function which, when applied to a function and an input value, gives the FIM. For `OptimizationFunction`s, given the machinery already there, this should pretty much just be extracting the right fields.
Does this make sense? Also, where should the function be put?
Forward over forward/reverse just works, so `ForwardDiff.hessian` and the like all work, along with functionality like https://docs.sciml.ai/SciMLSensitivity/dev/manual/direct_adjoint_sensitivities/#Second-Order-Adjoint-Sensitivities
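As a minimal sketch of the `ForwardDiff.hessian` route: the loss below is a hypothetical least-squares negative log-likelihood for a toy linear model (the data and model are purely illustrative, not from any SciML example), and the observed FIM is just its Hessian at the parameter value of interest:

```julia
using ForwardDiff

# Illustrative data for a toy model y ≈ p[1] * x with unit-variance
# Gaussian noise (hypothetical values, for demonstration only).
xs = [0.5, 1.0, 1.5]
ys = [1.1, 2.0, 2.9]

# Negative log-likelihood up to an additive constant; this is the
# "cost function" whose Hessian gives the observed FIM.
negloglik(p) = 0.5 * sum(abs2, ys .- p[1] .* xs)

# A hypothetical wrapper of the kind discussed in this issue:
# forward-over-forward AD via ForwardDiff.hessian.
fisher_information_matrix(loss, p) = ForwardDiff.hessian(loss, p)

F = fisher_information_matrix(negloglik, [2.0])
```

For this quadratic loss the FIM is constant in `p` (here a 1×1 matrix whose entry is `sum(abs2, xs)`), and its eigenvalues/condition number are what a practical identifiability analysis would inspect.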
Would it make sense to create some pretty wrapper functions for this functionality?
Like what? The link that I posted is kind of the pretty wrapper function. Would you want another step above that, directly named `fisher_information_matrix`? That would be fine and I'd accept it.
Sounds good. I will make a tutorial on practical identifiability in the relatively near future. I will start by using this directly, and then when I am done put the wrapper in. That should make it a bit more pleasant for systems biologists working with it.