
Markoff chain
noun
a Markov process in which the parameter takes only discrete time values
Syn: ↑Markov chain
Hypernyms: ↑Markov process, ↑Markoff process
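
For illustration, a minimal sketch of the defining (Markov) property in discrete time, assuming states i, j drawn from a countable state space and time steps n = 0, 1, 2, …: the distribution of the next state depends only on the present state, not on the earlier history.

  P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i), \quad n = 0, 1, 2, \ldots

The discrete parameter n is the time index; in a general Markov process it may instead range over a continuum.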

* * *

noun
see Markov chain

Useful English dictionary. 2012.