Markov chain


Markov chain

(ˈmɑːkɒf)
n
(Statistics) a sequence of events, the probability of each of which depends only on the event immediately preceding it
[C20: named after Andrei Markov (1856–1922), Russian mathematician]
Collins English Dictionary – Complete and Unabridged, 12th Edition 2014 © HarperCollins Publishers 1991, 1994, 1998, 2000, 2003, 2006, 2007, 2009, 2011, 2014
Noun 1. Markov chain - a Markov process for which the parameter is discrete time values
Markoff process, Markov process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
Based on WordNet 3.0, Farlex clipart collection. © 2003-2012 Princeton University, Farlex Inc.
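The defining property above - that the next state depends only on the current state - can be illustrated with a minimal simulation. This is a sketch using a made-up two-state "weather" transition matrix, not an example from any of the sources quoted below:

```python
import random

# Toy row-stochastic transition table: from each state, the probabilities
# of the possible next states. The next state depends ONLY on the current
# state, which is exactly the Markov property in the definition above.
P = {"sunny": [("sunny", 0.8), ("rainy", 0.2)],
     "rainy": [("sunny", 0.4), ("rainy", 0.6)]}

def step(state, rng):
    """Sample the next state given only the current state."""
    r = rng.random()
    cum = 0.0
    for nxt, p in P[state]:
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

rng = random.Random(0)          # fixed seed for reproducibility
path = ["sunny"]
for _ in range(10):
    path.append(step(path[-1], rng))
print(path)
```

Each call to `step` consults only `path[-1]`, never the earlier history, which is what distinguishes a Markov chain from a general stochastic process.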
References in periodicals archive:
They begin with the basics of random variables and probability theory, then go on to limit theorems, Markov chains, diffusion processes, and random fields.
First let me explain Markov chains, and then explain why HCLOS delivers a better outcome.
Using phase-type distribution results for Markov chains [10], we obtain an explicit formula for calculating the expected event rate λ_Y to reach the absorbing state Y, given the initial probability row vector π for the transient states [mathematical expression not reproducible], as
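The expected time (and hence rate) to reach an absorbing state in a finite Markov chain can be computed from the fundamental-matrix identity (I - Q)t = 1, where Q is the transient-to-transient block of the transition matrix. The following sketch uses a hypothetical chain with two transient states and one absorbing state; the matrix Q and vector π are invented for illustration and are not taken from [10]:

```python
# Hypothetical chain: transient states 0 and 1, absorbing state Y.
# Q holds transition probabilities among the transient states only.
Q = [[0.5, 0.3],
     [0.2, 0.4]]
pi = [1.0, 0.0]  # assumed initial row vector over the transient states

# Solve (I - Q) t = 1 for the expected-steps vector t (2x2 case, by hand).
a, b = 1 - Q[0][0], -Q[0][1]
c, d = -Q[1][0], 1 - Q[1][1]
det = a * d - b * c
t0 = (d * 1 - b * 1) / det  # first component of (I-Q)^{-1} @ [1, 1]
t1 = (a * 1 - c * 1) / det  # second component

expected_steps = pi[0] * t0 + pi[1] * t1  # pi . t
rate = 1.0 / expected_steps               # expected event rate lambda_Y
print(expected_steps, rate)               # -> 3.75  0.2666...
```

For larger chains the same system is solved with a general linear solver; the hand-rolled 2x2 inverse here is only to keep the sketch self-contained.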
In [26], under a switching topology governed by Markov chains, the consensus seeking problem was solved through a guaranteed cost control method.
Markov Processes and Markov Chains; Transition Probability Matrix
Therefore, it is interesting to derive formulas for p_j and d_j for Markov chains that are as close as we want to the diffusion process.
Series expansion techniques for Markov chains go by different names in literature, including perturbation techniques, the power series algorithm, and light-traffic approximations.
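One elementary instance of a series expansion for Markov chains is the Neumann series (I - Q)^{-1} = Σ_k Q^k, valid when the spectral radius of Q is below 1 (as it is for the transient block of an absorbing chain). This is only a simple illustration of the series idea, not the specific power series algorithm or perturbation techniques cited above; the matrix is made up:

```python
# Truncated Neumann series: approximate (I - Q)^{-1} by summing powers of Q.
Q = [[0.5, 0.3],
     [0.2, 0.4]]  # toy transient block; row sums < 1, so the series converges

def matmul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

N = [[1.0, 0.0], [0.0, 1.0]]  # running sum, initialized to I (the k=0 term)
P = [[1.0, 0.0], [0.0, 1.0]]  # current power Q^k
for _ in range(200):          # truncation order; error shrinks geometrically
    P = matmul(P, Q)
    N = [[N[i][j] + P[i][j] for j in range(2)] for i in range(2)]

print([[round(x, 4) for x in row] for row in N])
# -> [[2.5, 1.25], [0.8333, 2.0833]], matching (I - Q)^{-1} exactly
```

Truncating the series after a few terms is what gives "light-traffic"-style approximations their accuracy/cost trade-off: each extra term costs one matrix product and reduces the error by a factor of roughly the spectral radius of Q.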
In this article, we consider the more general setting of Markov chains. The proofs are similar to those in [18], but the results are valid in a broader context and can be formulated more clearly.
Lithofacies cyclicity determination in the Guaduas Formation (Colombia) using Markov chains. Earth Sciences Research Journal, 20(3), B1-B9.
This will require a custom tool which can solve Markov chains for state probabilities for each flight while mapping over the maintenance action effect on state probabilities from flight to flight.
The discrete-time Markov chains are used to solve the random medium access probabilities of all user priorities.
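Solving a discrete-time Markov chain for long-run state probabilities amounts to finding the stationary distribution π satisfying π = πP. A minimal sketch, using an invented two-state transition matrix rather than the multi-priority model of the paper, finds it by power iteration:

```python
# Assumed 2-state transition matrix P (rows sum to 1); not from the cited work.
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = [1.0, 0.0]  # any initial distribution works for an ergodic chain

# Power iteration: repeatedly apply pi <- pi @ P until it stops changing.
for _ in range(1000):
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]

print([round(x, 4) for x in pi])  # -> [0.8333, 0.1667], i.e. [5/6, 1/6]
```

The fixed point solves π₀ = 0.9π₀ + 0.5π₁ together with π₀ + π₁ = 1, giving π = (5/6, 1/6); in an access-probability model each component would be read off as the long-run fraction of time the channel spends in that state.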