1. Motivation of Inception: making a CNN deeper and wider
comes with two drawbacks: a limited number of training examples (risking overfitting) and a growing computational budget. ==> need to move from fully connected to sparsely connected architectures.
2. Solutions:
i. use 1 x 1 convolutions before the 3 x 3 and 5 x 5 convolutions ==> remove redundant information and reduce channel dimensionality ('keep the network sparse at most places and compress the signals only whenever they have
to be aggregated')
ii. introduce auxiliary loss functions at hidden layers during the training process
   a. the model converges faster (a claim reportedly revised in later work?)
   b. the auxiliary classifiers' losses are added to the final layer's loss with weight 1 : 0.3 (final : auxiliary); the auxiliary heads are discarded at inference
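To see why the 1 x 1 reduction in solution i pays off, here is a back-of-the-envelope multiply count for a single 5 x 5 convolution with and without the bottleneck; the channel sizes (192 in, 16 reduced, 32 out) are illustrative assumptions, not figures from these notes.

```python
# Rough multiply counts for one 5x5 conv layer on a 28x28 feature map,
# with and without a 1x1 "bottleneck" (channel sizes are illustrative).
H = W = 28          # spatial size of the feature map
C_in, C_out = 192, 32

# Direct 5x5 convolution: every output pixel mixes all input channels.
direct = H * W * C_out * C_in * 5 * 5

# 1x1 reduction to 16 channels first, then the 5x5 on the reduced map.
C_mid = 16
reduced = H * W * C_mid * C_in * 1 * 1 + H * W * C_out * C_mid * 5 * 5

print(direct, reduced, round(direct / reduced, 1))
# → 120422400 12443648 9.7
```

Roughly a 10x saving in multiplies for the 5 x 5 branch, which is what lets the module go wider without blowing up the compute budget.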
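The 1 : 0.3 weighting in solution ii amounts to a simple weighted sum at training time; a minimal sketch (the function name and example loss values are made up for illustration):

```python
def training_loss(main_loss, aux_losses, aux_weight=0.3):
    """Combine the final classifier's loss with the auxiliary
    classifiers' losses, each discounted by aux_weight (0.3 here,
    matching the 1 : 0.3 ratio). At inference the auxiliary heads
    are simply discarded, so only main_loss matters then."""
    return main_loss + aux_weight * sum(aux_losses)

# e.g. cross-entropy values from the final head and two auxiliary heads:
print(training_loss(1.2, [1.5, 1.8]))  # 1.2 + 0.3 * (1.5 + 1.8)
```

Because the auxiliary terms are discounted, the gradient signal injected at the hidden layers nudges intermediate features toward being discriminative without dominating the final objective.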
This post discusses the motivation behind the Inception network design: how to make a convolutional neural network both deep and wide while avoiding resource limits and overfitting. Two solutions are proposed: first, using 1x1 convolutions to remove redundant information and compress features; second, introducing auxiliary loss functions to speed up training.