English dictionary



Info: This website is based on WordNet 3.0 from Princeton University.

English noun: Markov chain

1. Markov chain (process): a Markov process whose time parameter takes only discrete values (see the sketch below)


Synonyms: Markoff chain


Broader (hypernym): Markoff process, Markov process
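
As an illustration of sense 1, here is a minimal sketch of a discrete-time Markov chain in Python. The two-state "weather" model and its transition probabilities are invented for this example; the point is that the next state depends only on the current state, and the time parameter advances in discrete steps.

import random

# Hypothetical two-state model; each row of transition probabilities sums to 1.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    # Markov property: the next state depends only on the current state.
    successors = TRANSITIONS[state]
    return random.choices(list(successors), weights=list(successors.values()))[0]

def simulate(start, n_steps):
    # Time is a discrete parameter: the chain advances one step at a time.
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))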








