TuringLang/docs

Add an implementation of the forward-backward algorithm for HMMs

yebai opened this issue · 6 comments

yebai commented

Hidden Markov models (HMMs) are quite common in time-series analysis. Since they involve discrete latent variables, HMC is not always appropriate, although we can still apply HMC to the continuous parameters of an HMM. It would be nice to add

  1. a customised HMM distribution that accepts transition_matrix, emit_parameters, and emit_distribution as inputs, and returns a parameterised distribution;
  2. an implementation of the forward-backward algorithm, which computes the marginal likelihood of the data given transition_matrix and emit_parameters, with the discrete latent states summed out (a minimal sketch of the forward pass follows below).
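
For intuition, here is a minimal hand-rolled sketch of the forward pass in log space; the names (forward_logpdf, init, trans, emit) are illustrative only, not an existing API:

using Distributions
using LogExpFunctions: logsumexp

# Log-space forward recursion: returns log p(y | init, trans, emit) with the
# discrete latent state sequence marginalised out.
function forward_logpdf(init, trans, emit, y)
    K = length(init)
    # logα[k] = log p(y_1, x_1 = k)
    logα = [log(init[k]) + logpdf(emit[k], y[1]) for k in 1:K]
    for t in 2:length(y)
        # logα[k] = logsumexp_j(logα[j] + log trans[j, k]) + log p(y_t | x_t = k)
        logα = [logsumexp(logα .+ log.(trans[:, k])) + logpdf(emit[k], y[t]) for k in 1:K]
    end
    return logsumexp(logα)  # log p(y_1, …, y_T)
end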

References: https://mc-stan.org/docs/2_18/stan-users-guide/hmms-section.html

yebai commented

Maybe we can make use of https://github.com/maxmouchet/HMMBase.jl. If so, this is mostly a matter of writing a new tutorial on Bayesian HMMs with marginalised latent states.
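
Something like the following should give the marginal log-likelihood (a rough sketch; I am assuming HMMBase's forward returns the forward probabilities together with the total log-likelihood, as its docs describe):

using Distributions, HMMBase

# Two-state Gaussian HMM; this constructor variant (transition matrix plus
# emission distributions) uses a uniform initial distribution.
hmm = HMM([0.9 0.1; 0.2 0.8], [Normal(0, 1), Normal(10, 1)])

y = [0.1, 9.8, 10.2, -0.3]   # toy observation sequence
α, logtot = forward(hmm, y)  # logtot = log p(y), latent states summed out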

This library was just released and might provide some inspiration: https://github.com/probml/dynamax

This is now possible with the newly released https://github.com/gdalle/HiddenMarkovModels.jl.

Something like:

using Turing, Distributions, LinearAlgebra
using HiddenMarkovModels: HMM, forward
using LogExpFunctions: softmax

@model function example_hmm_marginalized(N, K, y)
    mu ~ MvNormal([3, 10], I)             # emission means, one per state
    theta1 ~ Dirichlet(softmax(ones(K)))  # transition probabilities out of state 1
    theta2 ~ Dirichlet(softmax(ones(K)))  # transition probabilities out of state 2
    θ = vcat(theta1', theta2')            # row-stochastic transition matrix (assumes K = 2)

    # Uniform initial distribution, Gaussian emissions with unit variance.
    hmm = HMM(softmax(ones(K)), θ, [Normal(mu[1], 1), Normal(mu[2], 1)])

    # forward returns the filtered state marginals together with the
    # log-likelihood of y, latent state sequence marginalised out.
    _, filtered_loglikelihood = forward(hmm, y)

    Turing.@addlogprob! only(filtered_loglikelihood)
end
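
If that holds together, sampling the marginalised model with NUTS is then routine. A sketch, with hypothetical data (two well-separated Gaussian clusters) just for illustration:

using Turing

y = vcat(randn(50) .+ 3, randn(50) .+ 10)  # hypothetical observations
chain = sample(example_hmm_marginalized(length(y), 2, y), NUTS(), 1000)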

Here is an example gist with some quick attempts at validating this against PosteriorDB reference draws; it seems correct:
https://gist.github.com/JasonPekos/82be830e4bf390fd1cc2886a7518aede

Thanks @JasonPekos — would you like to turn this example into a new tutorial?

Sure, I'll get around to it in a bit :)