In deep learning one frequently encounters the terms Bottleneck Layer, Bottleneck Features, and Bottleneck Block. The intuition is easy enough: the input and output dimensions differ substantially, like the neck of a bottle, narrow at one end and wide at the other. Yet few sources trace the term back to any authoritative origin. This post tracks these terms to their sources, so as to build a more complete picture of what a bottleneck actually is.
Let us start with how a survey on efficient processing of deep neural networks [1] describes the Bottleneck Building Block: "In order to reduce the number of weights, 1x1 filters are applied as a 'bottleneck' to reduce the number of channels for each filter". The 1x1 filter itself originates from "Network In Network"; its role is to change the number of output channels, performing either dimensionality elevation or reduction. Next, consider how the ResNet paper describes the Bottleneck Building Block:
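The weight savings described in the quote above can be checked with simple arithmetic. The sketch below compares a plain pair of 3x3 convolutions at 256 channels against a 1x1-reduce / 3x3 / 1x1-expand bottleneck; the channel sizes (256 and 64) follow the ResNet paper's example, and biases are ignored for simplicity.

```python
# Compare weight counts for a plain stack of 3x3 convolutions
# versus a bottleneck design (1x1 reduce -> 3x3 -> 1x1 expand).

def conv_weights(c_in, c_out, k):
    """Number of weights in a k x k convolution (biases ignored)."""
    return c_in * c_out * k * k

# Plain block: two 3x3 convolutions at 256 channels.
plain = 2 * conv_weights(256, 256, 3)

# Bottleneck block: 1x1 down to 64, 3x3 at 64, 1x1 back up to 256.
bottleneck = (conv_weights(256, 64, 1)
              + conv_weights(64, 64, 3)
              + conv_weights(64, 256, 1))

print(plain)       # 1179648
print(bottleneck)  # 69632
```

The bottleneck version uses roughly 17x fewer weights while keeping the same input and output channel counts, which is exactly the "reduce the number of channels for each filter" trick the survey refers to.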
The corresponding figure is shown below:
As the right-hand figure shows, the 1x1 filters raise the channel dimension back up, so the input and output dimensions of the intermediate layer differ substantially. Continuing with the figure below:
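To make the structure in the figure concrete, here is a minimal numpy sketch of a ResNet-style bottleneck block: a 1x1 convolution squeezes 256 channels down to 64, a 3x3 convolution operates at the narrow width, and a final 1x1 convolution expands back to 256 before the identity shortcut is added. This is an illustrative forward pass only (no batch dimension, no batch norm, random weights), not the reference ResNet implementation.

```python
import numpy as np

def conv1x1(x, w):
    """1x1 convolution: a per-pixel linear map over channels.
    x: (C_in, H, W), w: (C_out, C_in) -> (C_out, H, W)."""
    return np.tensordot(w, x, axes=([1], [0]))

def conv3x3(x, w):
    """Naive 3x3 convolution with zero padding 1.
    x: (C_in, H, W), w: (C_out, C_in, 3, 3) -> (C_out, H, W)."""
    c_in, h, wd = x.shape
    c_out = w.shape[0]
    xp = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros((c_out, h, wd))
    for i in range(3):
        for j in range(3):
            # Each kernel tap contributes a shifted 1x1 convolution.
            out += np.tensordot(w[:, :, i, j], xp[:, i:i + h, j:j + wd],
                                axes=([1], [0]))
    return out

def relu(x):
    return np.maximum(x, 0.0)

def bottleneck_block(x, w_reduce, w_mid, w_expand):
    """256 -> 64 (1x1) -> 64 (3x3) -> 256 (1x1), plus the skip connection."""
    out = relu(conv1x1(x, w_reduce))   # squeeze channels
    out = relu(conv3x3(out, w_mid))    # cheap 3x3 at the narrow width
    out = conv1x1(out, w_expand)       # restore channels
    return relu(out + x)               # identity shortcut

rng = np.random.default_rng(0)
x = rng.standard_normal((256, 8, 8))
y = bottleneck_block(x,
                     rng.standard_normal((64, 256)) * 0.01,
                     rng.standard_normal((64, 64, 3, 3)) * 0.01,
                     rng.standard_normal((256, 64)) * 0.01)
print(y.shape)  # (256, 8, 8)
```

Note that the expensive 3x3 convolution only ever sees 64 channels; the surrounding 1x1 layers are what make the wide 256-channel input and output affordable.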
The intuitive reading is that this structure exists to reduce dimensionality. That is correct, but why is it a good way to do so? Another paper [2] offers an explanation:
If we do not want to use the dimensionality reduction techniques, and want to obtain the features suitable for the classification as outcome of neural net training process, a bottle-neck has to be created in the neural net structure. The neural net has the ability of nonlinear compression of the input features and of classification of such compressed features. If the trained neural net with bottle-neck has a good classification accuracy, we know that the bottle-neck outputs represents the underlying speech well. (Readers curious about the details may want to look at the paper's background section, which makes this passage easier to follow.)
[1] Vivienne Sze, Yu-Hsin Chen, Tien-Ju Yang, Joel Emer. Efficient Processing of Deep Neural Networks: A Tutorial and Survey.
[2] František Grézl, Martin Karafiát, Stanislav Kontár and Jan Černocký. Probabilistic and Bottle-Neck Features for LVCSR of Meetings. Speech@FIT group, Brno University of Technology, Czech Republic.
The paper "Improved Bottleneck Features Using Pretrained Deep Neural Networks" gives a concise description of bottleneck features: "Bottleneck features are generated from a multi-layer perceptron in which one of the internal layers has a small number of hidden units, relative to the size of the other layers."
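That description can be sketched directly: train (or here, just run) an MLP whose third layer is deliberately narrow, and take the activations of that narrow layer as the bottleneck features. All sizes below are illustrative assumptions (39 acoustic input features, a 30-unit bottleneck, 42 output classes), not values from the paper, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(42)

def layer(n_in, n_out):
    """Random weight matrix standing in for a trained layer."""
    return rng.standard_normal((n_in, n_out)) * 0.1

# An MLP whose middle layer is deliberately narrow (the "bottleneck").
# Sizes are illustrative: 39 features in, 30-unit bottleneck, 42 classes out.
sizes = [39, 500, 30, 500, 42]
weights = [layer(a, b) for a, b in zip(sizes[:-1], sizes[1:])]

def forward(x, bottleneck_index=2):
    """Run the net; also return activations of the bottleneck layer."""
    bottleneck = None
    for i, w in enumerate(weights):
        x = np.tanh(x @ w)
        if i + 1 == bottleneck_index:  # output of the narrow layer
            bottleneck = x
    return x, bottleneck

batch = rng.standard_normal((16, 39))
logits, feats = forward(batch)
print(logits.shape, feats.shape)  # (16, 42) (16, 30)
```

After training such a net as a classifier, the 30-dimensional `feats` would be kept as a compressed representation of the input and the layers after the bottleneck discarded, which is how bottleneck features are used in the speech-recognition literature cited above.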
Reposted from: https://blog.youkuaiyun.com/u011501388/article/details/80389164