Markoff process
noun
a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state (the defining property is written out after this entry)
Syn: ↑Markov process
Hypernyms: ↑stochastic process
Hyponyms: ↑Markov chain, ↑Markoff chain
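
A minimal sketch of the property the gloss describes, assuming a discrete-time process with states X_0, X_1, … taking values in a countable set (this notation is not part of the original entry):

    \[
      P(X_{n+1}=j \mid X_n=i,\, X_{n-1}=i_{n-1},\, \ldots,\, X_0=i_0) \;=\; P(X_{n+1}=j \mid X_n=i)
    \]

The left-hand side conditions on the entire history of the process, while the right-hand side conditions only on the present state; their equality is the "depends only on the present state and not on how it arrived" clause of the definition.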

* * *

noun
see Markov process

Useful english dictionary. 2012.