Markov chain: meaning and definition
Definition of Markov chain (noun)
- a Markov process for which the parameter takes discrete time values
- synonyms: Markoff chain
Source: Princeton University WordNet
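The definition above can be illustrated with a minimal sketch: a chain over two hypothetical states ("sunny" and "rainy") whose next state depends only on the current one, advanced at discrete time steps. The states and probabilities here are invented for illustration, not part of the dictionary entry.

```python
import random

# Hypothetical two-state chain: each row lists (next_state, probability)
# pairs; the probabilities in each row sum to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng):
    """Advance the chain one discrete time step from `state`."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Return the sequence of n+1 states visited, starting from `start`."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(n):
        state = step(state, rng)
        path.append(state)
    return path

print(simulate("sunny", 5))
```

Because the parameter (time) is discrete, the chain moves in whole steps; the "Markov" property is that `step` consults only the current state, never the earlier history.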