Mar′kov proc″ess



Statistics.
a process in which future values of a random variable are statistically determined by present events and dependent only on the event immediately preceding. Also, Mar′koff proc″ess.

Random House Unabridged Dictionary, Copyright © 1997, by Random House, Inc., on Infoplease.
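In symbols (a standard formulation of the Markov property, added here for clarity rather than quoted from the dictionary entry): for a discrete-time process $X_0, X_1, X_2, \ldots$,

$P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)$

That is, once the present state $X_n$ is known, earlier states carry no additional information about the next value $X_{n+1}$.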
