# Markov process




**Noun**

1. **Markov process** - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
   - *Markoff chain, Markov chain* - a Markov process for which the parameter is discrete time values
   - *stochastic process* - a statistical process involving a number of random variables depending on a variable parameter (which is usually time)
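Because a Markov chain's parameter is discrete time, the property described above can be illustrated with a short simulation: each next state is drawn from a distribution that depends only on the current state, never on the earlier path. The sketch below uses a hypothetical two-state weather model; the state names and transition probabilities are illustrative assumptions, not part of the entry.

```python
import random

# Hypothetical two-state weather model. Each row gives the distribution of
# the next state conditioned only on the current state (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state using only the current state's transition row."""
    next_states = list(TRANSITIONS[state].keys())
    weights = list(TRANSITIONS[state].values())
    return random.choices(next_states, weights=weights)[0]

def simulate(start, n_steps):
    """Simulate a discrete-time Markov chain for n_steps transitions."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))
```

Note that `simulate` only ever passes the most recent state to `step`; the rest of the history is kept for output but plays no role in the dynamics, which is exactly what distinguishes a Markov process from a general stochastic process.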

