Here I implement inference (evaluation and decoding) and learning algorithms for Hidden Markov Models in Python. These algorithms are fairly simple dynamic-programming procedures.
- python 2.7.*
- numpy >= 1.8.0
All computations are done in log scale for numerical stability and robustness. This way, very small probability values can be handled without underflow.
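The key trick behind log-scale computation is the log-sum-exp operation, which adds probabilities whose linear-scale values would underflow to zero. A minimal sketch (the function name is my own, not this package's API):

```python
import numpy as np

def log_sum_exp(log_probs):
    """Stable log(sum(exp(x))) for combining log-probabilities.

    Subtracting the maximum before exponentiating keeps exp() from
    underflowing to 0 (or overflowing to inf).
    """
    log_probs = np.asarray(log_probs, dtype=float)
    m = np.max(log_probs)
    return m + np.log(np.sum(np.exp(log_probs - m)))

# exp(-1000) underflows to 0.0 in double precision, yet the sum of
# two such probabilities is still representable in log scale:
print(log_sum_exp([-1000.0, -1001.0]))  # ≈ -999.687, finite
```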
The matrix B, which holds the probability of each observation in each state, is static and cannot change. For a dynamic matrix B, one can make small edits to the evaluation algorithm, but I suggest using a sum-product algorithm in more complicated cases.
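To show where the static B enters, here is a sketch of the log-space forward pass (evaluation); all names and the array layout are my own assumptions, not this package's API:

```python
import numpy as np

def forward_log(log_pi, log_A, log_B, obs):
    """Log-space forward pass for an HMM with a static emission matrix B.

    log_pi : (K,)   log initial state probabilities
    log_A  : (K, K) log transition probabilities, rows = source state
    log_B  : (K, V) log emission probabilities, columns = observation symbols
    obs    : (T,)   observed symbol indices

    Returns the (T, K) log forward messages and the log-likelihood.
    """
    T, K = len(obs), len(log_pi)
    log_alpha = np.empty((T, K))
    # B is only ever indexed by the observed symbol; a dynamic B would
    # replace these static lookups with per-time-step emission scores.
    log_alpha[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = log_alpha[t - 1][:, None] + log_A          # (K, K): prev -> cur
        log_alpha[t] = log_B[:, obs[t]] + np.logaddexp.reduce(scores, axis=0)
    return log_alpha, np.logaddexp.reduce(log_alpha[-1])
```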
In learning with the EM (Baum-Welch) algorithm, the observation model currently implemented is the multinoulli: in each state the observations are discrete, and each symbol is generated with a fixed parameter.
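For the multinoulli observation model, the Baum-Welch M-step for B reduces to soft counts of each symbol per state. A sketch under assumed names (`gamma` being the per-time posterior state probabilities from the E-step, not this package's API):

```python
import numpy as np

def multinoulli_m_step(gamma, obs, n_symbols):
    """Baum-Welch M-step for a multinoulli (categorical) emission model.

    gamma : (T, K) posterior state probabilities from the E-step
    obs   : (T,)   observed symbol indices
    Returns the re-estimated (K, V) emission matrix B, rows summing to 1.
    """
    T, K = gamma.shape
    B = np.zeros((K, n_symbols))
    for v in range(n_symbols):
        # expected number of times each state emitted symbol v
        B[:, v] = gamma[obs == v].sum(axis=0)
    return B / B.sum(axis=1, keepdims=True)
```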
- Implement evaluation using the forward algorithm.
- Implement decoding using the Viterbi algorithm.
- Implement the algorithms in log scale so that very large or very small numbers can be computed.
- Implement learning.
- Test the implementations.
- Convert to a python package.
- Create some documentation.
- Put the package on pypi.
- Implement some actually useful demos like POS tagging. (part 19.6.2.1 of Murphy's Book)
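The Viterbi decoding on the list above can be sketched in log space as follows; names and array layout are my own assumptions, not this package's API:

```python
import numpy as np

def viterbi_log(log_pi, log_A, log_B, obs):
    """Log-space Viterbi: most likely state path for a discrete-output HMM."""
    T, K = len(obs), len(log_pi)
    delta = np.empty((T, K))                # best log score ending in each state
    backptr = np.empty((T, K), dtype=int)   # best predecessor per state
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A        # (K, K): prev -> cur
        backptr[t] = np.argmax(scores, axis=0)
        delta[t] = scores[backptr[t], np.arange(K)] + log_B[:, obs[t]]
    # backtrack from the best final state
    path = np.empty(T, dtype=int)
    path[-1] = np.argmax(delta[-1])
    for t in range(T - 2, -1, -1):
        path[t] = backptr[t + 1, path[t + 1]]
    return path, delta[-1, path[-1]]
```

In log space the max-product recursion becomes a max-sum, so no probabilities are multiplied and long sequences stay numerically stable.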
You can read more about HMM algorithms in many references, including Wikipedia, the tutorial paper by Rabiner, or chapter 17 of Murphy's Book.