English noun: Markov process

1. Markov process (of a process): a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
   - Terms with the same meaning (synonyms): Markoff process
   - Less specific terms: stochastic process
   - More specific terms: Markoff chain, Markov chain
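As a rough formalisation of the definition above (a minimal sketch, assuming discrete time and a countable state space; the notation X_n and x_n is not part of the dictionary entry), the "depends only on the present state" property can be written as:

```latex
% Markov property: the conditional distribution of the next state
% depends only on the current state, not on the earlier history.
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0)
  = P(X_{n+1} = x \mid X_n = x_n)
```

The more specific terms listed above (Markov chain, Markoff chain) are commonly used for exactly this discrete-time, discrete-state case.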