Definition: 'Markov Process'
Markov process
Type: Term
1. a stochastic process such that the conditional probability distribution for the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
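The "memoryless" condition in the definition can be sketched in code. Below is a minimal, hypothetical two-state chain (the states and transition probabilities are invented for illustration, not drawn from the source):

```python
import random

# Hypothetical two-state chain; transition probabilities are assumed
# values chosen purely for illustration.
TRANSITIONS = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def next_state(current, rng):
    # The next state depends only on `current`, never on earlier
    # history: this is the Markov property in the definition above.
    probs = TRANSITIONS[current]
    return rng.choices(list(probs), weights=list(probs.values()))[0]

def simulate(start, steps, seed=0):
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1], rng))
    return path
```

Note that `next_state` receives only the present state; feeding it the full path would not change the distribution of the next step, which is what the definition asserts.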
The definition information for Markov process is provided by Stedman's.