(Statistics) a sequence of events, the probability of each of which depends only on the event immediately preceding it
[C20: named after Andrei Markov (1856–1922), Russian mathematician]
Noun 1. Markov chain - a Markov process for which the parameter takes discrete time values
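The definition above can be sketched in code: a minimal discrete-time Markov chain in which each step samples the next state using only the current state. The weather states and transition probabilities are illustrative assumptions, not part of the dictionary entry.

```python
import random

# Hypothetical two-state chain (states and probabilities are made up
# for illustration): the distribution of the next state depends only
# on the current state -- the Markov property.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps, seed=0):
    """Walk the chain for a number of discrete time steps.

    At each step the next state is drawn from the transition
    probabilities of the current state alone; earlier history
    is never consulted.
    """
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        probs = TRANSITIONS[chain[-1]]
        states = list(probs)
        weights = [probs[s] for s in states]
        chain.append(rng.choices(states, weights=weights)[0])
    return chain
```

Because the parameter is discrete time, the chain is just a list of states indexed by step number, matching the thesaurus sense above.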