Markov chain

(ˈmɑːkɒf)
n
(Statistics) a sequence of events, the probability for each of which is dependent only on the event immediately preceding it
[C20: named after Andrei Markov (1856–1922), Russian mathematician]
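
Expressed formally (a standard formulation of the definition above, with X_n denoting the state after n steps; the notation is not part of the dictionary entry), the defining property of a discrete-time Markov chain is

\Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = \Pr(X_{n+1} = x \mid X_n = x_n).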
Thesaurus
Noun 1. Markov chain - a Markov process for which the parameter is discrete time values
Markoff process, Markov process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
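
As a concrete illustration of this property (a minimal sketch in Python; the state names and transition probabilities are invented for the example and do not come from the entry), a discrete-time Markov chain over a finite state space can be simulated by sampling each next state from a distribution that depends only on the current state:

```python
import random

# Hypothetical two-state chain: a row-stochastic transition table
# mapping each state to the distribution over next states.
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(current):
    """Sample the next state; the distribution depends only on `current`."""
    next_states = list(transition[current])
    weights = [transition[current][s] for s in next_states]
    return random.choices(next_states, weights=weights, k=1)[0]

def simulate(start, n):
    """Generate a trajectory of n states, starting from `start`."""
    path = [start]
    for _ in range(n - 1):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))
```

Each call to step uses only the current state, so the simulated trajectory "forgets" how it reached that state, which is exactly the property described in the definitions above.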
References in periodicals archive
In Section 4 we shall give conditions under which a discrete-time Markov chain ([X.
Then, the three-dimensional process {s(t), b(t), z(t)} can be modelled as a discrete-time Markov chain (DTMC).
6 CSMA/CA medium access control, and Section 3 provides a discrete-time Markov chain model for the analysis of the IEEE 802.
