MARKOV CHAIN
I. (noun)
Sense 1
Meaning:
A Markov process whose parameter takes discrete time values
Synonyms:
Markoff chain; Markov chain
Classified under:
Nouns denoting natural processes
Hypernyms ("Markov chain" is a kind of...):
Markoff process; Markov process (a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived at the present state)
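
The sense above can be made concrete with a small simulation. The following is a minimal sketch, not part of the dictionary entry, of a two-state discrete-time Markov chain in Python; the state names, transition probabilities, and chain length are illustrative assumptions.

import random

# Hypothetical two-state chain; the probabilities are illustrative only.
transitions = {
    "A": [("A", 0.9), ("B", 0.1)],
    "B": [("A", 0.5), ("B", 0.5)],
}

def step(state):
    # The next state depends only on the current state (the Markov property),
    # not on how the chain arrived at it.
    next_states, probs = zip(*transitions[state])
    return random.choices(next_states, weights=probs)[0]

state = "A"
chain = [state]
for _ in range(10):  # discrete time: one transition per step
    state = step(state)
    chain.append(state)
print(chain)

Each call to step draws the next state from a distribution fixed by the current state alone, which is what distinguishes a Markov chain (discrete time parameter) within the broader class of Markov processes named as its hypernym.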