How should one understand the benefit of normalization for neural networks (deep learning)?
Link: https://www.zhihu.com/question/326034346/answer/730051338
Source: Zhihu
Copyright belongs to the author. For commercial reproduction, please contact the author for authorization; for non-commercial reproduction, please credit the source.
A quick review of the work around normalization, from the most recent back to the oldest (BatchNorm):
2019, Weight Standardization (not formally published, but backed by Alan Yuille)
Weight Standardization 2019
WS (Weight Standardization) builds on the view that BN works by smoothing the loss landscape / smoothing the activations. It targets the setting where GN falls short of BN under micro-batch training, and pushes GN's accuracy up to match BN. The mechanism behind WS is to reduce the Lipschitz constants of the loss and of the gradients.
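A minimal sketch of the idea in PyTorch (the class name `WSConv2d` and the pairing with GroupNorm are illustrative, not the paper's official code): each output filter is standardized to zero mean and unit standard deviation over its fan-in before the convolution is applied.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WSConv2d(nn.Conv2d):
    """Conv2d whose weights are standardized per output channel
    (zero mean, unit std over in_channels x kH x kW) before use."""
    def forward(self, x):
        w = self.weight
        mean = w.mean(dim=[1, 2, 3], keepdim=True)
        std = w.std(dim=[1, 2, 3], keepdim=True) + 1e-5
        w = (w - mean) / std
        return F.conv2d(x, w, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

# Typical use: pair WS with GroupNorm for micro-batch training.
layer = nn.Sequential(WSConv2d(64, 128, 3, padding=1),
                      nn.GroupNorm(32, 128), nn.ReLU())
```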
2019, Dynamic Normalization
Differentiable Dynamic Normalization for Learning Deep Representation ICML 2019
Similar to SN, but adds GN to the set of candidate normalizers.
2019, Switchable Normalization
Differentiable Learning-to-Normalize via Switchable Normalization ICLR 2019
SN selects/learns the appropriate normalizer (among IN, LN, and BN) for each layer. Experiments cover ImageNet, COCO, Cityscapes, ADE20K, and Kinetics, spanning image classification, object detection, semantic segmentation, and video classification.
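A minimal sketch of the core idea, using training-mode statistics only (the published version shares computation between the three sets of statistics and keeps running BN estimates for inference, which this sketch omits):

```python
import torch
import torch.nn as nn

class SwitchNorm2d(nn.Module):
    """Minimal Switchable Normalization sketch: learn softmax weights
    over IN / LN / BN statistics."""
    def __init__(self, num_channels, eps=1e-5):
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(num_channels))
        self.bias = nn.Parameter(torch.zeros(num_channels))
        self.mean_w = nn.Parameter(torch.ones(3))  # weights over IN/LN/BN means
        self.var_w = nn.Parameter(torch.ones(3))   # weights over IN/LN/BN variances

    def forward(self, x):  # x: (N, C, H, W)
        mean_in = x.mean((2, 3), keepdim=True)                  # per sample, per channel
        var_in = x.var((2, 3), keepdim=True, unbiased=False)
        mean_ln = x.mean((1, 2, 3), keepdim=True)               # per sample
        var_ln = x.var((1, 2, 3), keepdim=True, unbiased=False)
        mean_bn = x.mean((0, 2, 3), keepdim=True)               # per channel over the batch
        var_bn = x.var((0, 2, 3), keepdim=True, unbiased=False)

        mw = torch.softmax(self.mean_w, 0)
        vw = torch.softmax(self.var_w, 0)
        mean = mw[0] * mean_in + mw[1] * mean_ln + mw[2] * mean_bn
        var = vw[0] * var_in + vw[1] * var_ln + vw[2] * var_bn

        x_hat = (x - mean) / torch.sqrt(var + self.eps)
        return x_hat * self.weight.view(1, -1, 1, 1) + self.bias.view(1, -1, 1, 1)
```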
2019, Iterative Normalization (CVPR)
Iterative Normalization Beyond Standardization towards Efficient Whitening CVPR 2019
An efficient version of DBN (Decorrelated Batch Normalization).
2019, Spatially-Adaptive Normalization (CVPR)
Semantic Image Synthesis with Spatially-Adaptive Normalization CVPR 2019
Used for image synthesis.
2018, Gradient Normalization (ICML)
GradNorm Gradient Normalization for Adaptive Loss Balancing in Deep Multitask Networks ICML 2018
2018, Kalman Normalization
Kalman Normalization Normalizing Internal Representations Across Network Layers NIPS 2018
2018, Decorrelated Batch Normalization
Decorrelated Batch Normalization CVPR 2018
BN plus whitening: the activations are decorrelated, not just standardized per dimension.
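A toy sketch of the whitening step (ZCA whitening of a flattened mini-batch; the actual method whitens in channel groups for stability and maintains running estimates, which this omits):

```python
import torch

def zca_whiten_batch(x, eps=1e-5):
    """Sketch of the whitening behind Decorrelated BN: instead of scaling
    each feature by its own std (plain BN), whiten the whole feature
    vector so features are also decorrelated.  x: (N, C) activations."""
    mean = x.mean(0, keepdim=True)
    xc = x - mean
    cov = xc.t() @ xc / x.shape[0]                        # (C, C) covariance
    evals, evecs = torch.linalg.eigh(cov)                 # eigendecomposition
    whiten = evecs @ torch.diag((evals.clamp_min(0.0) + eps).rsqrt()) @ evecs.t()
    return xc @ whiten                                    # ZCA-whitened batch
```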
2018, Spectral Normalization (ICLR)
Spectral Normalization for Generative Adversarial Networks ICLR 2018
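Spectral Normalization constrains the Lipschitz constant of a GAN discriminator by dividing each weight matrix by an estimate of its largest singular value, obtained with power iteration. A rough sketch (the real implementation persists the power-iteration vector across training steps; PyTorch ships it as `torch.nn.utils.spectral_norm`):

```python
import torch
import torch.nn.functional as F

def spectral_normalize(w, n_iters=1, eps=1e-12):
    """Estimate the largest singular value of w (out, in) with power
    iteration and divide the weight by it."""
    w_mat = w.reshape(w.shape[0], -1)
    u = torch.randn(w_mat.shape[0])
    for _ in range(n_iters):
        v = F.normalize(w_mat.t() @ u, dim=0, eps=eps)
        u = F.normalize(w_mat @ v, dim=0, eps=eps)
    sigma = u @ w_mat @ v
    return w / sigma

# Built-in equivalent:
# layer = torch.nn.utils.spectral_norm(torch.nn.Conv2d(3, 64, 4))
```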
2018, Group Normalization (ECCV)
Group Normalization ECCV 2018
Used when the batch size is very small, e.g. in object detection and semantic segmentation.
GroupNorm sits between InstanceNorm and LayerNorm: with one channel per group it reduces to InstanceNorm, and with a single group it reduces to LayerNorm.
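A minimal functional sketch (affine parameters omitted; PyTorch also ships this as `torch.nn.GroupNorm`):

```python
import torch

def group_norm(x, num_groups, eps=1e-5):
    """Minimal GroupNorm: normalize each sample over groups of channels.
    num_groups == C gives InstanceNorm, num_groups == 1 gives LayerNorm.
    x: (N, C, H, W)."""
    n, c, h, w = x.shape
    x = x.reshape(n, num_groups, c // num_groups, h, w)
    mean = x.mean((2, 3, 4), keepdim=True)            # per sample, per group
    var = x.var((2, 3, 4), keepdim=True, unbiased=False)
    x = (x - mean) / torch.sqrt(var + eps)
    return x.reshape(n, c, h, w)

x = torch.randn(2, 64, 32, 32)
y = group_norm(x, num_groups=32)                      # independent of batch size
```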
2018, Batch-Instance Normalization
Batch-Instance Normalization for Adaptively Style-Invariant Neural Networks NIPS 2018
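BIN learns a per-channel gate rho in [0, 1] that mixes the BN-normalized and IN-normalized responses, letting the network keep or discard style information channel by channel. A rough sketch under that reading (the class name and the simple clamping of the gate are illustrative, not the paper's code):

```python
import torch
import torch.nn as nn

class BatchInstanceNorm2d(nn.Module):
    """Rough Batch-Instance Normalization sketch: a learnable per-channel
    gate rho mixes BN-normalized and IN-normalized activations."""
    def __init__(self, num_channels):
        super().__init__()
        self.bn = nn.BatchNorm2d(num_channels, affine=False)
        self.inorm = nn.InstanceNorm2d(num_channels, affine=False)
        self.rho = nn.Parameter(torch.full((1, num_channels, 1, 1), 0.5))
        self.weight = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.bias = nn.Parameter(torch.zeros(1, num_channels, 1, 1))

    def forward(self, x):
        rho = self.rho.clamp(0.0, 1.0)        # keep the gate in [0, 1]
        x_hat = rho * self.bn(x) + (1.0 - rho) * self.inorm(x)
        return x_hat * self.weight + self.bias
```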
2018, Instance-Batch Normalization
Two at Once Enhancing Learning and Generalization Capacities via IBN-Net ECCV 2018
2016, Layer Normalization (not formally published)
Used for RNNs.
2016, Instance Normalization (not formally published, but well validated in practice)
Used for style transfer.
2016, Weight Normalization (NIPS)
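Weight Normalization reparameterizes each weight vector as w = g * v / ||v||, decoupling its length g from its direction v. A minimal sketch (PyTorch also provides `torch.nn.utils.weight_norm`):

```python
import torch

def weight_norm_reparam(v, g):
    """Weight Normalization reparameterization: w = g * v / ||v||,
    one direction v and one scalar length g per output neuron."""
    return g * v / v.norm(dim=1, keepdim=True)

v = torch.randn(128, 64)          # (out_features, in_features) direction
g = torch.ones(128, 1)            # per-output-neuron length
w = weight_norm_reparam(v, g)

# Built-in equivalent:
# layer = torch.nn.utils.weight_norm(torch.nn.Linear(64, 128))
```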
2015, Batch Normalization (ICML)
Used for convolutional networks (ConvNets) and image classification.
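A minimal training-time sketch of BN for (N, C, H, W) activations, showing the per-channel mini-batch statistics and the running estimates used at inference (a hypothetical helper, not the original implementation):

```python
import torch

def batch_norm_train(x, gamma, beta, running_mean, running_var,
                     momentum=0.1, eps=1e-5):
    """Training-time BatchNorm: normalize with per-channel mini-batch
    statistics and update the running estimates used at inference."""
    mean = x.mean((0, 2, 3))                      # per-channel batch mean
    var = x.var((0, 2, 3), unbiased=False)        # per-channel batch variance
    running_mean.mul_(1 - momentum).add_(momentum * mean.detach())
    running_var.mul_(1 - momentum).add_(momentum * var.detach())
    x_hat = (x - mean.view(1, -1, 1, 1)) / torch.sqrt(var.view(1, -1, 1, 1) + eps)
    return gamma.view(1, -1, 1, 1) * x_hat + beta.view(1, -1, 1, 1)

c = 64
x = torch.randn(8, c, 16, 16)
y = batch_norm_train(x, torch.ones(c), torch.zeros(c),
                     torch.zeros(c), torch.ones(c))
```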
Source: oschina
Link: https://my.oschina.net/u/4290613/blog/3396356