Paper Reading - A Fast Learning Algorithm for Deep Belief Nets

Authors: G. E. Hinton et al.
Year: 2006
Type: article
Source: Neural Computation
Assessment: the eve of Deep Learning
Paper link: http://www.cs.toronto.edu/~hinton/absps/ncfast.pdf
The paper is quite "hardcore": it is dense with algorithmic derivations, mathematical formulas, and terminology. The authors are also fond of describing the model with biological vocabulary such as "synapse strength" and "mind", which made the reading rather confusing. It assumes familiarity with RBMs (restricted Boltzmann machines) and the wake-sleep algorithm, which I happened to lack, so the reading was difficult; these notes are only a brief outline with excerpts.

1 Purpose

  1. To design a generative model that surpasses discriminative models.
  2. To train a deep, densely connected belief network efficiently.
    The explaining-away effect makes inference difficult in densely connected belief nets that have many hidden layers.
  • challenges
  1. It is difficult to infer the conditional distribution of the hidden activities given a data vector.
  2. Variational methods use simple approximations to the true conditional distribution, but the approximation may be poor, especially at the deepest hidden layer, where the prior assumes independence.
  3. Variational learning still requires all of the parameters to be learned together, which makes the learning time scale poorly (extremely time-consuming?) as the number of parameters increases.
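The explaining-away effect mentioned above can be seen in a tiny toy network: two independent binary causes of a common observed effect become anti-correlated once the effect is observed. This is a minimal sketch; the prior probabilities and the deterministic-OR effect are made-up numbers chosen only to show the phenomenon, not anything from the paper.

```python
import itertools

# Two independent binary causes a, b with small prior probabilities.
p_a, p_b = 0.1, 0.1

def p_e_given(a, b):
    # The effect e fires if either cause is on (a noise-free OR, for simplicity).
    return 1.0 if (a or b) else 0.0

# Joint over (a, b) with e = 1 observed, then normalize to the posterior P(a, b | e=1).
joint = {}
for a, b in itertools.product([0, 1], repeat=2):
    prior = (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b)
    joint[(a, b)] = prior * p_e_given(a, b)
z = sum(joint.values())
post = {k: v / z for k, v in joint.items()}

# Posterior marginals of each cause.
pa1 = post[(1, 0)] + post[(1, 1)]
pb1 = post[(0, 1)] + post[(1, 1)]

# Explaining away: the joint posterior of both causes being on is much
# smaller than the product of the marginals, so the causes are now
# (negatively) dependent even though their priors were independent.
print(round(post[(1, 1)], 4), round(pa1 * pb1, 4))
```

Because observing the effect couples the causes, the posterior over hidden units no longer factorizes, which is exactly what makes exact inference in densely connected belief nets hard.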

2 The previous work

  1. Backpropagation nets
  2. Support vector machines

3 The proposed method

  1. The authors designed a hybrid model: its top two hidden layers form an undirected associative memory, and the remaining hidden layers form a directed acyclic graph that converts the representations in the associative memory into observable variables such as the pixels of an image.
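    The lower directed layers of such a model are obtained by greedy layer-wise training: each layer is learned as an RBM on the activities of the layer below, then frozen. Below is a minimal numpy sketch of this recipe using one-step contrastive divergence (CD-1); the layer sizes, learning rate, and epoch count are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """One restricted Boltzmann machine layer trained with CD-1."""
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)  # visible biases
        self.b_h = np.zeros(n_hidden)   # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0):
        # Positive phase: hidden activities driven by the data.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one step of Gibbs sampling (CD-1 reconstruction).
        pv1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(pv1)
        # Approximate log-likelihood gradient: data stats minus reconstruction stats.
        n = len(v0)
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.b_v += self.lr * (v0 - pv1).mean(axis=0)
        self.b_h += self.lr * (ph0 - ph1).mean(axis=0)

def train_dbn(data, layer_sizes, epochs=5):
    """Greedy layer-wise training: each RBM learns on the hidden
    activities of the layer below, then is frozen."""
    rbms, x = [], data
    for n_hidden in layer_sizes:
        rbm = RBM(x.shape[1], n_hidden)
        for _ in range(epochs):
            rbm.cd1_step(x)
        rbms.append(rbm)
        x = rbm.hidden_probs(x)  # pass representations upward
    return rbms

# Toy binary data: 20 samples of 6 visible units (illustrative only).
data = (rng.random((20, 6)) < 0.5).astype(float)
dbn = train_dbn(data, layer_sizes=[4, 3])
print([r.W.shape for r in dbn])  # → [(6, 4), (4, 3)]
```

    In the full model of the paper, the topmost pair of layers would instead be kept as an undirected associative memory, and the lower RBMs' weights would seed the directed generative connections.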