/transition-matrix

The transition matrix of a Markov chain is a square matrix P whose entry P[i][j] gives the probability of moving from state i to state j; because each row is a probability distribution over next states, every row sums to 1.

Primary language: C++ · License: MIT
