Reference paper
1. The original text
HMM is also a maneuver-based method that uses a Markov chain.
state transition probability:
$P(S_{n+1}=s \mid S_n=s_n)$
In real life, we can only observe the distinct states exposed on the surface; there is no intuitive representation of the hidden states. ---- The hidden states cannot be observed directly.
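For instance, with two hidden maneuver states (say, keep-lane and change-lane; the states and numbers here are illustrative, not from the paper), these transition probabilities can be collected into a row-stochastic matrix:

```latex
% Illustrative 2-state transition matrix (hypothetical values, rows sum to 1).
% State 1 = keep-lane, state 2 = change-lane.
A = \begin{pmatrix}
P(S_{n+1}=1 \mid S_n=1) & P(S_{n+1}=2 \mid S_n=1) \\
P(S_{n+1}=1 \mid S_n=2) & P(S_{n+1}=2 \mid S_n=2)
\end{pmatrix}
=
\begin{pmatrix}
0.9 & 0.1 \\
0.3 & 0.7
\end{pmatrix}
```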
2. Hidden Markov Model
An HMM is represented by the tuple $(S, O, A, B, \pi)$:
- $S=\{S_1,S_2,...,S_N\}$: hidden state sequence
- $O=\{O_1,O_2,...,O_N\}$: observation sequence
- $A$: transition probability matrix between hidden states
- $B$: output (emission) matrix, representing the probability of each hidden state producing each observed state
- $\pi$: initial probability vector, representing the initial probability distribution over the hidden states
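A minimal sketch of what such a tuple looks like concretely, assuming a discrete HMM with two hidden maneuver states and three observation symbols; all names and probability values below are illustrative, not taken from the paper:

```python
import numpy as np

# Hypothetical discrete HMM (all values are illustrative, not from the paper).
# Hidden states:  0 = keep-lane, 1 = change-lane
# Observations:   0 = no lateral drift, 1 = slight drift, 2 = strong drift
A = np.array([[0.9, 0.1],       # A: transition probabilities between hidden states (rows sum to 1)
              [0.3, 0.7]])
B = np.array([[0.7, 0.2, 0.1],  # B: emission probabilities P(observation | hidden state)
              [0.1, 0.4, 0.5]])
pi = np.array([0.8, 0.2])       # pi: initial distribution over hidden states

O = [0, 1, 2, 2]                # O: an example observation sequence (symbol indices)
```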
Application in trajectory prediction
$O$: historical states of traffic participants
The paper formulates intent recognition as an HMM problem and solves it with the forward algorithm.
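A minimal sketch of that scheme, assuming one trained HMM per candidate maneuver and observations already discretized into symbols; the `forward_prob` helper, the model names, and all parameter values are illustrative assumptions, not the paper's actual models:

```python
import numpy as np

def forward_prob(pi, A, B, obs):
    """Forward algorithm: return P(obs | lambda) for a discrete HMM lambda = (A, B, pi)."""
    alpha = pi * B[:, obs[0]]          # initialization: alpha_1(i) = pi_i * b_i(o_1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # recursion: alpha_{t+1}(j) = (sum_i alpha_t(i) a_ij) * b_j(o_{t+1})
    return alpha.sum()                 # termination: P(O | lambda) = sum_i alpha_T(i)

# One hypothetical HMM (pi, A, B) per candidate maneuver; the numbers are made up.
models = {
    "keep_lane":   (np.array([0.8, 0.2]),
                    np.array([[0.9, 0.1], [0.4, 0.6]]),
                    np.array([[0.7, 0.2, 0.1], [0.3, 0.4, 0.3]])),
    "change_lane": (np.array([0.5, 0.5]),
                    np.array([[0.6, 0.4], [0.2, 0.8]]),
                    np.array([[0.2, 0.3, 0.5], [0.1, 0.3, 0.6]])),
}

obs = [0, 1, 2, 2]  # discretized historical states of one traffic participant
scores = {name: forward_prob(pi_m, A_m, B_m, obs)
          for name, (pi_m, A_m, B_m) in models.items()}
print(scores, max(scores, key=scores.get))  # the maneuver with the highest P(O | lambda)
```

The maneuver whose model assigns the highest likelihood $P(O \mid \lambda)$ to the observed history is taken as the recognized intent.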
3. Forward algorithm
Reference: https://zhuanlan.zhihu.com/p/359831957
Meaning of the variables
$Q=\{q_1,q_2,...,q_N\}$: the set of all possible hidden states
Three basic problems
(1) Probability calculation (evaluation) problem: given the model $\lambda=(A,B,\pi)$ and an observation sequence $O$, compute the probability $P(O\mid\lambda)$ of observing the sequence under the model.
(2) Learning problem: given an observation sequence, estimate the model parameters $\lambda=(A,B,\pi)$ that maximize $P(O\mid\lambda)$.
(3) Prediction (decoding) problem: given $\lambda$ and an observation sequence, find the most probable sequence of hidden states.
Definition
Given a hidden Markov model $\lambda$, the forward probability is defined as the probability that the partial observation sequence up to time $t$ is $o_1,o_2,...,o_t$ and the state at time $t$ is $q_i$; it is written as
$\alpha_t(i)=P(o_1,o_2,...,o_t,\, i_t=q_i \mid \lambda)$.
The forward probability $\alpha_t(i)$ and the observation sequence probability $P(O\mid\lambda)$ can then be computed recursively.

Explanation:
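Concretely, the recursion referred to above is the standard three-step forward algorithm (initialization, recursion, termination):

```latex
% Standard forward algorithm for an HMM with N hidden states, transition
% probabilities a_{ij}, emission probabilities b_j(o_t), and initial distribution pi.

% 1. Initialization
\alpha_1(i) = \pi_i \, b_i(o_1), \qquad i = 1, \dots, N

% 2. Recursion, for t = 1, 2, \dots, T-1
\alpha_{t+1}(j) = \Big[ \sum_{i=1}^{N} \alpha_t(i)\, a_{ij} \Big] b_j(o_{t+1}), \qquad j = 1, \dots, N

% 3. Termination
P(O \mid \lambda) = \sum_{i=1}^{N} \alpha_T(i)
```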


