English dictionary



Hint: In most browsers you can look up any word by double-clicking it.

English noun: Markov chain

1. Markov chain (process) a Markov process for which the parameter is discrete time values


Synonyms: Markoff chain


Broader (hypernym): Markoff process, Markov process
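The definition above (a Markov process whose time parameter takes discrete values) can be illustrated with a minimal sketch. The two-state "weather" model, its state names, and its transition probabilities below are hypothetical examples, not part of the dictionary entry: the next state depends only on the current state, and time advances in integer steps.

```python
import random

# Hypothetical two-state Markov chain. Each row gives the
# probabilities of moving to each next state from the current one.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Pick the next state using the current state's transition row."""
    states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in states]
    return random.choices(states, weights=weights)[0]

def simulate(start, n_steps, seed=0):
    """Walk the chain for n_steps discrete time values."""
    random.seed(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1]))
    return chain
```

Because the walk uses only the current state to choose the next one, the sequence returned by `simulate` satisfies the Markov property, and the integer step count is the discrete time parameter the definition refers to.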
Based on WordNet 3.0 copyright © Princeton University.
Web design: Orcapia v/Per Bang. English edition: .
2024 onlineordbog.dk