Markov chain
English
Noun
Markov chain (plural Markov chains)
- (probability theory) A discrete-time stochastic process with the Markov property; see the sketch below the quotation.
- 2004 July 27, F. Keith Barker et al., “Phylogeny and diversification of the largest avian radiation”, in PNAS, page 11040, column 2:
- The probability density of the Bayesian posterior was estimated by Metropolis-coupled Markov chain Monte Carlo, with multiple incrementally heated chains.
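A minimal sketch, in Python, of what this definition describes: the Markov property means the next state depends only on the current state, i.e. P(X_{n+1} = x | X_n, …, X_0) = P(X_{n+1} = x | X_n). The two-state chain and its transition probabilities below are hypothetical values chosen purely for illustration.

```python
import random

# Hypothetical two-state chain: each row gives the probabilities of the
# next state conditioned only on the current state (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state using only the current state's transition row."""
    nxt = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in nxt]
    return random.choices(nxt, weights=weights)[0]

def simulate(start, n):
    """Generate a length-n trajectory of the chain from a starting state."""
    chain = [start]
    for _ in range(n - 1):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 10))  # e.g. ['sunny', 'sunny', 'rainy', ...]
```

Because `step` consults only the current state, the earlier history of the simulated trajectory has no effect on the next draw, which is exactly the property the definition names.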
Hypernyms
- stochastic process
Hyponyms
Translations
probability theory