Markov process


Noun 1. Markov process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
Markoff chain, Markov chain - a Markov process in which the time parameter takes discrete values
stochastic process - a statistical process involving a number of random variables depending on a variable parameter (which is usually time)
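
The defining (Markov) property is easy to see in code. A minimal Python sketch, with an illustrative two-state transition matrix chosen purely for this example: the next state is sampled from a distribution that depends only on the current state.

import random

# Illustrative two-state transition matrix (rows sum to 1); the
# specific probabilities are assumptions chosen for this example.
P = {
    "A": {"A": 0.9, "B": 0.1},
    "B": {"A": 0.5, "B": 0.5},
}

def step(state):
    """Draw the next state using only the current state (Markov property)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n):
    states = [start]
    for _ in range(n):
        states.append(step(states[-1]))
    return states

print(simulate("A", 10))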
References in periodicals archive:
This means that the Markov process must be reversible, or that the network is assumed to be effectively non-directional.
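
Reversibility can be checked numerically through the detailed-balance condition $\pi_i P_{ij} = \pi_j P_{ji}$. A sketch, assuming a small illustrative 3-state transition matrix (not the network from the quoted study):

import numpy as np

# Assumed 3-state transition matrix for illustration (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()

# Detailed balance: pi_i * P[i, j] == pi_j * P[j, i] for all i, j.
flux = pi[:, None] * P
reversible = np.allclose(flux, flux.T)
print("stationary distribution:", pi)
print("reversible:", reversible)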
Although the stochastic process $\{N(t), t \geq 0\}$ is not a Markov process, we can obtain a vector Markov process by introducing the supplementary variables $I_1(t)$, $I_2(t)$.
Thus, to simplify the exposition, we assume without loss of generality that $u_t$ is i.i.d., mean-zero, and independent of the Markov process $s_t$.
Frydman H (1992) A nonparametric estimation procedure for a periodically observed three-state Markov process with application to AIDS.
In other words, $\mathbb{E}_x[f(X_t)]$ denotes the expected value of $f(X_t)$ at time $t$ conditional on the Markov process starting at $x$.
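
For a finite state space this conditional expectation reduces to a matrix power: $\mathbb{E}_x[f(X_t)] = (P^t f)(x)$. A sketch with assumed values for $P$ and $f$:

import numpy as np

# Assumed two-state chain and function f, for illustration only.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
f = np.array([1.0, 5.0])     # f evaluated at each state

t = 10
Pt = np.linalg.matrix_power(P, t)
expected = Pt @ f            # entry x is E_x[f(X_t)]
print("E_x[f(X_t)] for each starting state x:", expected)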
Indeed, the Google algorithm can be applied to the transition matrix of any Markov process. This short article is therefore expository in nature and is intended for a general audience in science and technology.
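
A standard way to see this is power iteration on the damped ("Google") matrix built from any stochastic matrix; the link matrix and damping factor below are illustrative assumptions, not taken from the article.

import numpy as np

# Assumed column-stochastic link matrix for a 4-page web, illustration only.
M = np.array([[0.0, 0.5, 0.0, 0.0],
              [1/3, 0.0, 0.0, 0.5],
              [1/3, 0.0, 1.0, 0.5],
              [1/3, 0.5, 0.0, 0.0]])

d = 0.85                       # conventional damping factor
n = M.shape[0]
G = d * M + (1 - d) / n        # Google matrix: damped Markov process

r = np.full(n, 1.0 / n)        # start from the uniform distribution
for _ in range(100):           # power iteration converges geometrically
    r = G @ r
print("PageRank vector:", r)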
That is, a Markov process is a stochastic process in which the probability of the process being in a given state depends only on the previous state, not on the earlier history of how that previous state was reached.
The residual $e^{\pi}_t$ in Equation (17) is assumed to follow a normal distribution with zero mean and a state-dependent variance $\sigma^2_{s_t}$, which equals $\sigma^2_0$ if $S^{\pi}_t = 0$ and equals $\sigma^2_1$ if $S^{\pi}_t = 1$; state $S^{\pi}_t$ is given by the two-state, first-order Markov process with the transition probabilities already described.
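
A sketch of such a Markov-switching residual; the transition probabilities and regime variances below are assumptions for illustration, not the quoted paper's estimates.

import numpy as np

rng = np.random.default_rng(0)

# Assumed two-state transition probabilities and regime variances.
p00, p11 = 0.95, 0.90            # P(stay in state 0), P(stay in state 1)
sigma2 = np.array([0.5, 4.0])    # variance in state 0 and state 1

T = 500
s = np.zeros(T, dtype=int)       # regime path S_t
e = np.zeros(T)                  # residual e_t ~ N(0, sigma2[S_t])
for t in range(1, T):
    stay = p00 if s[t - 1] == 0 else p11
    s[t] = s[t - 1] if rng.random() < stay else 1 - s[t - 1]
    e[t] = rng.normal(0.0, np.sqrt(sigma2[s[t]]))
print("fraction of time in high-variance regime:", s.mean())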
If the transition probabilities are equal, then a random walk process exists and the series does not follow a first-order Markov process.
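
That criterion can be checked empirically by comparing the estimated conditional probabilities $P(\text{up} \mid \text{up})$ and $P(\text{up} \mid \text{down})$; if they are (statistically) equal, the next move does not depend on the current state. A sketch on simulated data:

import numpy as np

rng = np.random.default_rng(1)
moves = rng.integers(0, 2, 1000)     # assumed i.i.d. up/down moves (a random walk)

# Estimate transition probabilities P(up | up) and P(up | down).
prev, curr = moves[:-1], moves[1:]
p_up_given_up = curr[prev == 1].mean()
p_up_given_down = curr[prev == 0].mean()
print(p_up_given_up, p_up_given_down)  # near-equal: no first-order Markov structure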
In the early 1990s, a series of papers by Peyton Young, Dean Foster, Larry Blume, Michihiro Kandori, George Mailath, and Rafael Rob introduced perturbed Markov processes into evolutionary game theory.
Like Frumhoff and Reeve (1994), we assume $N$ species related by a polytomy (i.e., a star phylogeny; Figure 1 omitted), a single two-state character with states A and B and with state transitions modeled as a Markov process, and independent evolution in each branch leading from the common ancestor to a descendant species.
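
A sketch of this setup in Python, with assumed per-branch transition probabilities between the two states (values are illustrative only):

import numpy as np

rng = np.random.default_rng(2)

# Assumed per-branch transition probabilities between states A and B.
p_AB, p_BA = 0.1, 0.05     # P(A -> B) and P(B -> A) along one branch
N = 8                      # number of descendant species in the star phylogeny

ancestor = "A"
def evolve(state):
    """Evolve the character along one branch, independently of the others."""
    if state == "A":
        return "B" if rng.random() < p_AB else "A"
    return "A" if rng.random() < p_BA else "B"

tips = [evolve(ancestor) for _ in range(N)]
print(tips)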
Because our Markov model of obstetric patient movement was validated, the flow of patients through the hospital could also be simulated as a Markov process. The probabilities in the model were extracted from the matrix of transition probabilities developed in the Markov model.
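
A sketch of such a simulation; the ward states and transition matrix below are hypothetical placeholders, not the study's estimated probabilities.

import numpy as np

rng = np.random.default_rng(3)

# Hypothetical ward states and transition matrix (rows sum to 1);
# the study's actual estimates are not reproduced here.
states = ["admission", "labour", "postnatal", "discharge"]
P = np.array([[0.0, 0.7, 0.2, 0.1],
              [0.0, 0.1, 0.8, 0.1],
              [0.0, 0.0, 0.6, 0.4],
              [0.0, 0.0, 0.0, 1.0]])   # discharge is absorbing

def simulate_patient(max_steps=50):
    i = 0                               # start at admission
    path = [states[i]]
    for _ in range(max_steps):
        i = rng.choice(len(states), p=P[i])
        path.append(states[i])
        if states[i] == "discharge":
            break
    return path

print(simulate_patient())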