Markov process

noun

: a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous
also : Markov chain

called also Markoff process
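The defining property behind both senses is that the next state depends only on the current state, not on the path taken to reach it. A minimal sketch of the discrete analogue named above, a Markov chain, can illustrate this; the two-state weather model below is purely hypothetical and not part of the dictionary entry:

```python
import random

# Hypothetical two-state model for illustration only: the distribution of
# tomorrow's state depends solely on today's state (the Markov property).
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state using only the current state."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights)[0]

def simulate(start, n, seed=0):
    """Run the chain for n steps from a given start state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

A Markov process in the continuous-state sense (such as Brownian motion) replaces the finite transition table with a transition kernel over a continuum of states, but the memoryless structure is the same.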

Examples of Markov process in a Sentence

Recent Examples on the Web
The layers mirrored what often happens in a mathematical system known as a Markov process, says Hovden. (Leila Sloman, Popular Mechanics, 28 Feb. 2022)

This pattern is typical of systems involving a Markov process. (Leila Sloman, Popular Mechanics, 28 Feb. 2022)

Word History

First Known Use

1938, in the meaning defined above

Cite this Entry

“Markov process.” Merriam-Webster.com Dictionary, Merriam-Webster, https://www.merriam-webster.com/dictionary/Markov%20process. Accessed 29 Nov. 2024.
