Definition: 'Markov Process'
Markov process (Type: Term)
1. a stochastic process such that the conditional probability distribution for the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
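The defining property — that the next state depends only on the present state, not on the path taken to reach it — can be sketched in code. The two states and transition probabilities below are purely illustrative, not part of the definition:

```python
import random

# Hypothetical two-state transition probabilities (illustrative only):
#   from "healthy": stay "healthy" with p=0.9, become "ill" with p=0.1
#   from "ill":     recover with p=0.5, stay "ill" with p=0.5
TRANSITIONS = {
    "healthy": [("healthy", 0.9), ("ill", 0.1)],
    "ill":     [("healthy", 0.5), ("ill", 0.5)],
}

def next_state(current, rng):
    """Draw the next state from the current state alone.

    Note the Markov property: this function never sees the earlier
    history of the process, only the present state.
    """
    states, weights = zip(*TRANSITIONS[current])
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, steps, seed=0):
    """Generate a sample path of the chain, starting from `start`."""
    rng = random.Random(seed)
    history = [start]
    for _ in range(steps):
        history.append(next_state(history[-1], rng))
    return history

path = simulate("healthy", 10)
```

Because each step consults only `history[-1]`, the conditional distribution of the future given the present is unaffected by any additional knowledge of the past, exactly as the definition states.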
The information shown above for Markov process is provided by Stedman's.
* Stedman's, part of Lippincott Williams & Wilkins, provides a comprehensive line of health-science publications for healthcare professionals and medical students.