Markoff process

Noun 1. Markoff process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
Markoff chain, Markov chain - a Markov process for which the parameter is discrete time values
stochastic process - a statistical process involving a number of random variables depending on a variable parameter (which is usually time)
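The defining property above (the next state's distribution depends only on the present state, not on the path taken to reach it) can be sketched with a small simulation. The sketch below uses an invented two-state chain with discrete time steps, i.e. a Markov chain; the state names and transition probabilities are illustrative assumptions, not taken from the entry.

```python
import random

# Hypothetical two-state chain; states and probabilities are invented
# for illustration only.
STATES = ["sunny", "rainy"]
# TRANSITION[s][t] = P(next state is t | current state is s)
TRANSITION = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state. The draw depends only on `state`,
    never on earlier history: this is the Markov property."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITION[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n):
    """Run the chain for n steps from `start`, returning the path."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```

Because each call to `step` reads only the current state, two paths that happen to arrive at `"rainy"` by different routes face identical distributions over their next states.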
References in periodicals archive:
Thus its parameters depicted a tail-dependent variable structure that changed over time according to the Markoff regime-switching model, and a SWARCH (switching ARCH) model of the Markoff process was introduced for the serial variance to determine the marginal distribution (Juan, 2007).