In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that, once entered, cannot be left.
Definition:
A Markov chain is an absorbing chain if:
1) there is at least one absorbing state, and
2) it is possible to go from any state to at least one absorbing state in a finite number of steps.
In an absorbing Markov chain, a state that is not absorbing is called transient.
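As a quick illustration of this definition, here is a minimal sketch (assuming NumPy; the function name is illustrative, not a standard API) that tests both conditions for a given transition matrix by searching backwards from the absorbing states:

```python
import numpy as np

def is_absorbing_chain(P):
    """Return True if transition matrix P defines an absorbing Markov chain.

    P is a square row-stochastic matrix: P[i, j] is the probability of
    moving from state i to state j in one step.
    """
    P = np.asarray(P, dtype=float)
    n = P.shape[0]

    # Condition 1: at least one absorbing state, i.e. P[i, i] == 1.
    absorbing = {i for i in range(n) if np.isclose(P[i, i], 1.0)}
    if not absorbing:
        return False

    # Condition 2: every state can reach some absorbing state in finitely
    # many steps.  Walk backwards from the absorbing states along edges
    # with positive probability until no new states are found.
    reachable = set(absorbing)
    frontier = set(absorbing)
    while frontier:
        frontier = {i for i in range(n)
                    if i not in reachable
                    and any(P[i, j] > 0 for j in reachable)}
        reachable |= frontier
    return len(reachable) == n

# Example: a random walk on {0, 1, 2, 3, 4} whose two end states absorb.
P = [[1.0, 0.0, 0.0, 0.0, 0.0],
     [0.5, 0.0, 0.5, 0.0, 0.0],
     [0.0, 0.5, 0.0, 0.5, 0.0],
     [0.0, 0.0, 0.5, 0.0, 0.5],
     [0.0, 0.0, 0.0, 0.0, 1.0]]
print(is_absorbing_chain(P))  # True
```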
Canonical form
Let an absorbing Markov chain with transition matrix P have t transient states and r absorbing states. If the states are reordered so that the transient states come first, then P has the canonical form

P = \begin{pmatrix} Q & R \\ \mathbf{0} & I_r \end{pmatrix}

where Q is a t-by-t matrix, R is a nonzero t-by-r matrix, 0 is an r-by-t zero matrix, and I_r is the r-by-r identity matrix. Thus, Q describes the probability of transitioning from some transient state to another, while R describes the probability of transitioning from some transient state to some absorbing state.
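As a small illustration of this decomposition, the following sketch (again assuming NumPy; canonical_form is an illustrative name) reorders the states so the transient ones come first and reads the Q and R blocks off the permuted matrix:

```python
import numpy as np

def canonical_form(P):
    """Reorder P to canonical form and return (P_canonical, Q, R)."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    absorbing = [i for i in range(n) if np.isclose(P[i, i], 1.0)]
    transient = [i for i in range(n) if i not in absorbing]
    order = transient + absorbing            # transient states first
    P_canon = P[np.ix_(order, order)]        # permute rows and columns
    t = len(transient)
    Q = P_canon[:t, :t]    # transient -> transient block (t-by-t)
    R = P_canon[:t, t:]    # transient -> absorbing block (t-by-r)
    return P_canon, Q, R

# Same 5-state random walk as above: states 0 and 4 are absorbing, so
# t = 3 transient states {1, 2, 3} and r = 2 absorbing states {0, 4}.
P = [[1.0, 0.0, 0.0, 0.0, 0.0],
     [0.5, 0.0, 0.5, 0.0, 0.0],
     [0.0, 0.5, 0.0, 0.5, 0.0],
     [0.0, 0.0, 0.5, 0.0, 0.5],
     [0.0, 0.0, 0.0, 0.0, 1.0]]
P_canon, Q, R = canonical_form(P)
print(Q.shape, R.shape)  # (3, 3) (3, 2)
```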