Markoff chain

Definitions of Markoff chain
  1. noun
    a Markov process in which the time parameter takes discrete values
    synonyms: Markov chain
    type of:
    Markoff process, Markov process
    a stochastic process in which the distribution of future states depends only on the present state and not on how the process arrived at that state
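
The property behind this definition can be written as P(X_{n+1} = x | X_0, ..., X_n) = P(X_{n+1} = x | X_n): the next state depends only on the current one. As a minimal illustration, the Python sketch below simulates a two-state chain over discrete time steps; the state names and transition probabilities are invented for the example and are not part of the entry.

  import random

  # Hypothetical two-state chain; the probabilities below are
  # illustrative assumptions, not taken from the definition above.
  TRANSITIONS = {
      "sunny": {"sunny": 0.8, "rainy": 0.2},
      "rainy": {"sunny": 0.4, "rainy": 0.6},
  }

  def step(state):
      # Sample the next state. The distribution depends only on
      # the current state (the Markov property), not on history.
      probs = TRANSITIONS[state]
      states = list(probs)
      weights = [probs[s] for s in states]
      return random.choices(states, weights=weights)[0]

  state = "sunny"
  for t in range(10):  # discrete time values t = 0, 1, 2, ...
      state = step(state)
      print(t, state)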