(Statistics) a sequence of events, the probability of each of which depends only on the event immediately preceding it
[C20: named after Andrei Markov (1856–1922), Russian mathematician]
Collins English Dictionary – Complete and Unabridged, 12th Edition 2014 © HarperCollins Publishers 1991, 1994, 1998, 2000, 2003, 2006, 2007, 2009, 2011, 2014
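The defining property in the definition above (each event's probability depends only on the immediately preceding event) can be sketched in code. The two-state weather model below is purely illustrative and not part of the dictionary entry; the state names and transition probabilities are invented for the example.

```python
import random

# Hypothetical transition probabilities (illustrative only): from each
# current state, the chance of each next state. No earlier history is used.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, n_steps, seed=0):
    """Return a list of n_steps + 1 states, sampling each next state
    from a distribution conditioned on the current state alone."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        row = TRANSITIONS[chain[-1]]  # look only at the current state
        chain.append(rng.choices(list(row), weights=row.values())[0])
    return chain
```

Because each step reads only `chain[-1]`, the sequence produced is a Markov chain in the sense of the definition.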
Noun
1. Markov chain - a Markov process for which the parameter is discrete time values
Based on WordNet 3.0, Farlex clipart collection. © 2003-2012 Princeton University, Farlex Inc.