MARKOFF CHAIN
I. (noun)
Sense 1
Meaning:
A Markov process whose parameter takes discrete time values
Synonyms:
Markoff chain; Markov chain
Classified under:
Nouns denoting natural processes
Hypernyms ("Markoff chain" is a kind of...):
Markoff process; Markov process (a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived at the present state)
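
The defining ("Markov") property named in the gloss above can be illustrated with a short simulation. The sketch below is a minimal Python example, not part of the dictionary entry; the states, transition probabilities, and function name are illustrative assumptions. At each discrete time step, the next state is drawn from a distribution that depends only on the current state, never on the earlier history.

    # Minimal sketch of a discrete-time Markov chain (illustrative only).
    import random

    def step(state, transitions):
        """Sample the next state using only the current state (the Markov property)."""
        next_states, probs = zip(*transitions[state].items())
        return random.choices(next_states, weights=probs)[0]

    # Hypothetical two-state weather chain: tomorrow depends only on today.
    transitions = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    state = "sunny"
    path = [state]
    for _ in range(10):
        state = step(state, transitions)  # one step per discrete time value
        path.append(state)
    print(" -> ".join(path))

Note that the `transitions` table is the chain's entire memory: the loop never consults `path`, which is exactly the sense in which the future does not depend on how the process arrived at the present state.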