# Markov chain


## Markov chain

(ˈmɑːkɒf) *n*

*(Statistics)* a sequence of events, the probability for each of which is dependent only on the event immediately preceding it

[C20: named after Andrei *Markov* (1856–1922), Russian mathematician]

Collins English Dictionary – Complete and Unabridged, 12th Edition 2014 © HarperCollins Publishers 1991, 1994, 1998, 2000, 2003, 2006, 2007, 2009, 2011, 2014


**Noun** 1. **Markov chain** – a Markov process for which the parameter is discrete time values. **Markoff process, Markov process** – a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state.

Based on WordNet 3.0, Farlex clipart collection. © 2003-2012 Princeton University, Farlex Inc.
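The defining property above — that the distribution of the next state depends only on the present state — can be sketched in a few lines of Python. The two-state "weather" model, its state names, and its transition probabilities below are all illustrative assumptions, not part of the dictionary entry:

```python
import random

# Hypothetical two-state transition table: each row gives the probability
# of the next state given only the current state (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state using only the current state."""
    states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in states]
    return rng.choices(states, weights=weights)[0]

def simulate(start, n, seed=0):
    """Run the chain for n steps; the full history is recorded but never
    consulted — only chain[-1] influences each transition."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain
```

Because the parameter here is discrete time (step 0, 1, 2, …), this matches the WordNet sense of a Markov *chain* as opposed to a general Markov process.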
