# Markov chain


## Markov chain

(ˈmɑːkɒf) *n*

(Statistics) a sequence of events, the probability of each of which depends only on the event immediately preceding it. [C20: named after Andrei *Markov* (1856–1922), Russian mathematician]

## Thesaurus

**Noun**

1. **Markov chain** - a Markov process for which the parameter is discrete time values
    - **Markoff process, Markov process** - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
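To make the defining property concrete, here is a minimal sketch of a discrete-time Markov chain in Python. The state names and transition probabilities are purely illustrative, not part of the definition above; the point is that the next state is sampled from a distribution determined only by the current state.

```python
import random

# Illustrative two-state chain: each row gives the distribution of the
# next state conditional only on the current state (the Markov property).
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state given only the current one."""
    states = list(transition[state])
    weights = [transition[state][s] for s in states]
    return random.choices(states, weights=weights)[0]

state = "sunny"
chain = [state]
for _ in range(10):
    state = step(state)
    chain.append(state)
print(" -> ".join(chain))
```

Note that no history beyond the current state is consulted in `step`, which is exactly the "depends only on the present state and not on how it arrived" condition in the entry above.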
