Markov process [mar′kof]

Type: Term

Definitions
1. a stochastic process such that the conditional probability distribution for the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
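The defining ("memoryless") property above can be illustrated with a minimal sketch in Python, assuming a hypothetical two-state weather chain: each next state is sampled from a distribution that depends only on the current state, never on how that state was reached.

```python
import random

# Hypothetical transition table for a two-state Markov chain.
# The next state depends only on the current state, not on the past path.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state from the current state's distribution."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

def simulate(start, n, seed=0):
    """Run the chain for n steps from a starting state."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states
```

Note that `step` receives only the current state as input; any record of earlier states is irrelevant to the next draw, which is exactly the conditional-independence property in the definition.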
