English dictionary

English noun: Markov chain

1. Markov chain (process) a Markov process in which the parameter takes discrete time values


Synonyms: Markoff chain


Broader (hypernym): Markoff process, Markov process
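The definition above can be made concrete with a short sketch: a Markov chain evolves in discrete time steps, and the next state depends only on the current state. The two-state "weather" model and its transition probabilities below are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical two-state Markov chain: the transition table P gives the
# probability of moving from the current state to each next state in one
# discrete time step (the "discrete time values" of the definition).
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(dist):
    """Advance the probability distribution over states by one time step."""
    out = {s: 0.0 for s in P}
    for cur, p_cur in dist.items():
        for nxt, p in P[cur].items():
            out[nxt] += p_cur * p
    return out

# Start fully in "sunny" and iterate; the distribution converges toward
# the chain's stationary distribution (here sunny = 5/6, rainy = 1/6).
dist = {"sunny": 1.0, "rainy": 0.0}
for _ in range(50):
    dist = step(dist)
print(dist)
```

Because each step depends only on the current distribution, repeated application of `step` is all that is needed to evolve the chain; no earlier history is consulted.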

Based on WordNet 3.0 copyright © Princeton University.
© 2026 onlineordbog.dk