Markov chain
noun
a Markov process for which the parameter is discrete time values
• Syn: ↑Markoff chain
• Hypernyms: ↑Markov process, ↑Markoff process
Useful english dictionary. 2012.
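The definition above — a Markov process whose time parameter takes discrete values — can be illustrated with a minimal simulation sketch in Python. Time advances in integer steps, and the next state depends only on the current state. The two-state weather model and the function name `simulate_markov_chain` are illustrative assumptions, not part of the dictionary entry.

```python
import random

def simulate_markov_chain(transition, state, steps, seed=0):
    """Walk a discrete-time Markov chain for `steps` transitions.

    `transition` maps each state to a list of (next_state, probability)
    pairs; the choice of next state depends only on the current state.
    """
    rng = random.Random(seed)
    path = [state]
    for _ in range(steps):
        r = rng.random()
        cumulative = 0.0
        for nxt, p in transition[state]:
            cumulative += p
            if r < cumulative:
                state = nxt
                break
        path.append(state)
    return path

# Hypothetical two-state model: probabilities chosen only for illustration.
weather = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}
print(simulate_markov_chain(weather, "sunny", 10))
```

The discrete index (10 steps here) is what distinguishes a Markov chain from a general Markov process, whose parameter may range over continuous time.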