
ECCV-2018
Facebook AI Research
For more paper walkthroughs, see the [Paper Reading] series.
1 Background and Motivation

Batch Normalization (BN) degrades noticeably when the batch size is small. In tasks such as object detection and segmentation, the input resolution is high and the networks are large, so the batch size is usually small and the benefit of BN is weakened.

Motivated by the observation that many classical features like SIFT and HOG are group-wise features and involve group-wise normalization, the authors propose Group Normalization (GN) to reduce the impact of small batch sizes on normalization.
2 Related Work

- Normalization: LRN / BN / LN / IN / WN (weight normalization). LN and IN are the two extreme cases of GN; they are effective for training sequential models (RNN/LSTM) or generative models (GAN), but have had limited success in visual recognition.
- Addressing small batches: Batch Renormalization (still degrades when the batch size is too small).
- Group-wise computation: AlexNet / ResNeXt / MobileNet / Xception / ShuffleNet.
3 Advantages / Contributions
Proposes Group Normalization (GN).
4 Method
GN's computation is independent of the batch size.

LN, IN, and GN all perform independent computations along the batch axis. LN and IN are the two extreme cases of GN: with all channels in a single group (G = 1), GN becomes LN; with one channel per group (G = C), GN becomes IN.
The formula is the familiar one: subtract the mean, then divide by the standard deviation.
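Written out, this is the standard per-index formulation from the paper, where $\mathcal{S}_i$ is the set of positions over which the statistics are computed (the only thing that differs among BN, LN, IN, and GN) and $m$ is its size:

$$\hat{x}_i = \frac{1}{\sigma_i}\,(x_i - \mu_i), \qquad \mu_i = \frac{1}{m}\sum_{k \in \mathcal{S}_i} x_k, \qquad \sigma_i = \sqrt{\frac{1}{m}\sum_{k \in \mathcal{S}_i}(x_k - \mu_i)^2 + \epsilon}$$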

As usual, the slap is followed by candy: two learnable parameters $\gamma$ and $\beta$ are added to recover the representational power, $y_i = \gamma \hat{x}_i + \beta$.

Here $i = (i_N, i_C, i_H, i_W)$ is a 4D index over the feature map in $(N, C, H, W)$ order.
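For GN specifically, $\mathcal{S}_i$ contains the positions that share the same sample index and the same group of $C/G$ channels. Below is a minimal NumPy sketch of that computation; the function name, the default of G = 32, and the epsilon value are illustrative choices following the paper's defaults, not the authors' reference code:

```python
import numpy as np

def group_norm(x, G=32, gamma=None, beta=None, eps=1e-5):
    """Group Normalization over an (N, C, H, W) feature map.

    Mean and variance are computed per sample over each group of
    C // G channels, so the result does not depend on the batch size.
    """
    N, C, H, W = x.shape
    assert C % G == 0, "number of channels must be divisible by the group count"

    # Split the channels into G groups and normalize within each group.
    x = x.reshape(N, G, C // G, H, W)
    mean = x.mean(axis=(2, 3, 4), keepdims=True)
    var = x.var(axis=(2, 3, 4), keepdims=True)
    x = (x - mean) / np.sqrt(var + eps)
    x = x.reshape(N, C, H, W)

    # Per-channel learnable scale and shift (gamma, beta).
    if gamma is not None:
        x = x * gamma.reshape(1, C, 1, 1)
    if beta is not None:
        x = x + beta.reshape(1, C, 1, 1)
    return x

# Normalizing one sample alone gives the same result as normalizing it
# inside a batch of eight, since the statistics are per-sample.
feat = np.random.randn(8, 64, 16, 16).astype(np.float32)
out_full = group_norm(feat, G=32)
out_single = group_norm(feat[:1], G=32)
print(np.allclose(out_full[:1], out_single, atol=1e-5))  # True
```

The usage example at the bottom shows the batch-independence claim directly. For practical use, PyTorch provides this layer as torch.nn.GroupNorm(num_groups, num_channels).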
