Markoff chain

Noun 1. Markoff chain - a Markov process in which the time parameter takes discrete values
Markoff process, Markov process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
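
The defining (Markov) property - that the next state depends only on the current state - can be illustrated with a small simulation. Below is a minimal sketch assuming a hypothetical two-state chain ("sunny"/"rainy") with an invented transition matrix; the state names and probabilities are illustrative and not part of the entry.

```python
import random

# Hypothetical transition matrix for a two-state Markoff chain.
# transitions[state] maps each possible next state to its probability;
# note the next state depends only on the current state, not on the
# path by which the chain arrived there (the Markov property).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state using only the current state."""
    next_states = list(transitions[state])
    weights = [transitions[state][s] for s in next_states]
    return random.choices(next_states, weights=weights, k=1)[0]

def simulate(start, n_steps):
    """Walk the chain for n_steps discrete time steps."""
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 10))
```

Because the parameter here is discrete time (step 0, 1, 2, ...), this is a Markov chain in the sense defined above, as opposed to a continuous-time Markov process.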
Based on WordNet 3.0. © 2003-2012 Princeton University, Farlex Inc.