Markov process
noun
a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
Syn: ↑Markoff process
Hypernyms: ↑stochastic process
Hyponyms: ↑Markov chain, ↑Markoff chain
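The defining feature named above — that the distribution of future states depends only on the present state, not on the path taken to reach it — can be sketched in a few lines. The two-state weather chain and its transition probabilities below are purely illustrative assumptions, not part of any of the dictionary senses.

```python
import random

# Illustrative transition table (assumed, not from the entry):
# each current state maps to a probability distribution over next states.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng=random):
    """Sample the next state using ONLY the current state."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def walk(start, n, seed=0):
    """Generate a path of n steps; note that past states never
    enter the update, which is exactly the Markov property."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path
```

Because `step` receives only the current state, conditioning on any longer history would not change the sampled distribution, which is the property the definition describes.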

* * *

noun also markoff process
Usage: usually capitalized M
Etymology: after Andrei Andreevich Markov
: a stochastic process (as Brownian movement) that resembles a Markov chain except that the states are continuous; also: Markov chain

* * *

Statistics.
a process in which future values of a random variable are statistically determined by present events and dependent only on the event immediately preceding.
Also, Markoff process.
[1935-40; after Russian mathematician Andrei Andreevich Markov (1856-1922), who developed it]

* * *

Markov process,
any process based on a Markov chain: "The interconnection between classical potential theory and Brownian motion depends heavily on the fact that Brownian motion is a Markov process, that is, its present behavior is not influenced by its past behavior" (Scientific American).

Useful english dictionary. 2012.