Bayesian Neural Networks and IoT Antenna Research
1. Related Research on Bayesian Neural Networks
1.1 Multilayer Neural Network Structure
In a multilayer neural network, the hidden layers use the ReLU activation function, $a(x) = \max(x, 0)$. The likelihood is

$$p(y \mid W, X, \gamma) = \prod_{n=1}^{N} \mathcal{N}\!\left(y_n \mid z_L(x_n \mid W), \gamma^{-1}\right),$$

where each factor is denoted $f_n$. The priors are

$$p(W \mid \lambda) = \prod_{l=1}^{L} \prod_{i=1}^{V_l} \prod_{j=1}^{V_{l-1}+1} \mathcal{N}\!\left(w_{ij,l} \mid 0, \lambda^{-1}\right),$$

with each factor denoted $g_k$, together with the hyperpriors

$$p(\lambda) = \mathrm{Gamma}(\lambda \mid \alpha_{\lambda 0}, \beta_{\lambda 0}) \equiv h, \qquad p(\gamma) = \mathrm{Gamma}(\gamma \mid \alpha_{\gamma 0}, \beta_{\gamma 0}) \equiv s.$$

The network has $L$ layers, weight matrices $W = \{W_l\}_{l=1}^{L}$, and output $z_L$. The posterior is approximated by the factorized distribution

$$q(W, \gamma, \lambda) = \prod_{l=1}^{L} \prod_{i=1}^{V_l} \prod_{j=1}^{V_{l-1}+1} \mathcal{N}\!\left(w_{ij,l} \mid m_{ij,l}, v_{ij,l}\right) \cdot \mathrm{Gamma}(\gamma \mid \alpha_{\gamma}, \beta_{\gamma}) \cdot \mathrm{Gamma}(\lambda \mid \alpha_{\lambda}, \beta_{\lambda}).$$
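To make the structure above concrete, here is a minimal Python sketch (not the original implementation) of the factorized Gaussian posterior: each weight $w_{ij,l}$ gets a mean $m_{ij,l}$ and variance $v_{ij,l}$, hidden layers apply $a(x)=\max(x,0)$, and predictions are formed by Monte Carlo sampling with observation noise variance $\gamma^{-1}$. The layer sizes, initial variances, fixed $\gamma$, and function names are illustrative assumptions; in the model $\gamma$ and $\lambda$ carry Gamma posteriors rather than fixed values.

```python
import numpy as np

# Minimal sketch, assuming illustrative layer sizes and hyperparameters:
# a factorized Gaussian posterior q(w_{ij,l}) = N(m_{ij,l}, v_{ij,l}) over the
# weights of an L-layer ReLU network, used for Monte Carlo prediction.

rng = np.random.default_rng(0)
layer_sizes = [4, 50, 50, 1]          # V_0 .. V_L (input, hidden, output) -- assumed

def init_posterior(sizes):
    """One (mean, variance) pair per weight, including the bias column (+1)."""
    q = []
    for v_prev, v_next in zip(sizes[:-1], sizes[1:]):
        m = rng.normal(0.0, 1.0 / np.sqrt(v_prev + 1), size=(v_next, v_prev + 1))
        v = np.full((v_next, v_prev + 1), 0.1)   # posterior variances v_{ij,l} (assumed init)
        q.append((m, v))
    return q

def sample_weights(q):
    """Draw W ~ q(W) by sampling each weight independently."""
    return [m + np.sqrt(v) * rng.standard_normal(m.shape) for m, v in q]

def forward(W, x):
    """z_L(x | W): ReLU hidden layers a(x) = max(x, 0), linear output layer."""
    z = x
    for l, W_l in enumerate(W):
        z = W_l @ np.append(z, 1.0)              # append 1 for the bias term
        if l < len(W) - 1:
            z = np.maximum(z, 0.0)               # ReLU activation
    return z

def predict(q, x, gamma, n_samples=100):
    """Monte Carlo predictive mean/variance; observation noise variance is 1/gamma."""
    outs = np.array([forward(sample_weights(q), x) for _ in range(n_samples)])
    return outs.mean(axis=0), outs.var(axis=0) + 1.0 / gamma

q = init_posterior(layer_sizes)
gamma = 10.0                                     # noise precision, held fixed here for simplicity
x = rng.standard_normal(layer_sizes[0])
mean, var = predict(q, x, gamma)
print(mean, var)
```

The predictive variance decomposes into a model-uncertainty term (from sampling $W \sim q$) plus the observation noise $\gamma^{-1}$, which is the practical payoff of keeping a distribution over weights rather than point estimates.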