Mutual information: an information-theoretic measure quantifying the non-linear dependence of two random variables
First, we evaluate the similarity between each connectome harmonic and each of the 7 resting-state networks using mutual information, an information-theoretic measure quantifying the non-linear dependence of two random variables, computed as
$$
\operatorname{MI}\left(\psi_{j}, f_{\mathrm{RSN}}\right)=\sum_{v_{i} \in \mathcal{V}} p\left(\psi_{j}\left(v_{i}\right), f_{\mathrm{RSN}}\left(v_{i}\right)\right) \log \frac{p\left(\psi_{j}\left(v_{i}\right), f_{\mathrm{RSN}}\left(v_{i}\right)\right)}{p\left(\psi_{j}\left(v_{i}\right)\right) \cdot p\left(f_{\mathrm{RSN}}\left(v_{i}\right)\right)}
$$
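As a minimal MATLAB sketch of this computation, the joint and marginal probabilities can be estimated with a histogram over vertices. The variable names, vertex count, bin count, and the random stand-in data below are illustrative assumptions, not taken from the original study.

```matlab
% Minimal sketch: mutual information between one connectome harmonic (psi_j)
% and one resting-state network map (f_rsn), both sampled on the same set of
% cortical vertices. Data and parameters here are illustrative placeholders.

nVertices = 20484;                         % example vertex count (assumption)
psi_j = randn(nVertices, 1);               % stand-in for a connectome harmonic
f_rsn = double(rand(nVertices, 1) > 0.8);  % stand-in binary RSN mask

nBins = 64;                                % discretization for the continuous map

% Discretize the harmonic into bin indices; the binary RSN map maps to bins 1/2
psiBins = discretize(psi_j, linspace(min(psi_j), max(psi_j), nBins + 1));
rsnBins = f_rsn + 1;

% Joint histogram over vertices, normalized to a joint probability table
jointCounts = accumarray([psiBins, rsnBins], 1, [nBins, 2]);
pJoint = jointCounts / sum(jointCounts(:));

% Marginal distributions
pPsi = sum(pJoint, 2);
pRsn = sum(pJoint, 1);

% Mutual information: sum over nonzero joint-probability cells
pProd = pPsi * pRsn;                       % outer product of the marginals
nz = pJoint > 0;
MI = sum(pJoint(nz) .* log(pJoint(nz) ./ pProd(nz)));

fprintf('MI(psi_j, f_RSN) = %.4f nats\n', MI);
```

In practice, psi_j and f_rsn would be the vertex-wise values of a connectome harmonic and an RSN map rather than random data, and the result can be compared across harmonics to find the one most similar to each network.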

This post introduces mutual information as an information-theoretic tool for measuring the non-linear dependence of two random variables, and describes computing it in MATLAB, in particular for evaluating the similarity between connectome harmonics and resting-state networks.