Markov chain

A stochastic process with a finite number of states in which the probability of occurrence of a future state is conditional only upon the current state; past states are inconsequential. In meteorology, Markov chains have been used to describe the evolution of a raindrop size distribution, in which the state at time step n + 1 is determined only by collisions between pairs of drops comprising the distribution at time step n.
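
Formally, the Markov property can be written P(X(n+1) = x | X(n), X(n-1), ..., X(0)) = P(X(n+1) = x | X(n)): conditioning on the full history adds nothing beyond the current state. The short Python sketch below makes this concrete; the two weather states ("dry", "rain") and their transition probabilities are hypothetical choices for illustration, not values from this entry.

    import random

    # Hypothetical two-state chain; the states and probabilities are
    # illustrative, not taken from the definition above.
    transition = {
        "dry":  {"dry": 0.8, "rain": 0.2},
        "rain": {"dry": 0.4, "rain": 0.6},
    }

    def step(state):
        # Markov property: the next state is sampled using only the
        # current state; no earlier history is consulted.
        nxt = list(transition[state])
        weights = [transition[state][s] for s in nxt]
        return random.choices(nxt, weights=weights)[0]

    state = "dry"
    history = [state]
    for _ in range(10):
        state = step(state)
        history.append(state)
    print(history)

Each run draws a different realization, but the distribution of the state at step n + 1 is fixed entirely by the state at step n, which is exactly the property the definition describes.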

  • Part of speech: noun
  • Industry/Domain: Weather
  • Category: Meteorology
  • Company: AMS

Creator

  • Kevin Bowles