How is the number of parameters of the BatchNormalization layer 2048?

孤城傲影 asked on 2021-01-31 18:48

I have the following code.

x = keras.layers.Input(batch_shape = (None, 4096))
hidden = keras.layers.Dense(512, activation = 'relu')(x)
hidden = keras.layers.BatchNormalization()(hidden)
2 Answers
  •  独厮守ぢ
    2021-01-31 19:08

    The batch normalization layer in Keras implements the batch-normalization paper by Ioffe & Szegedy (2015).

    As you can read there, to make batch normalization work during training, the layer needs to keep track of the distribution of each normalized dimension. Since you are in mode=0 by default, it keeps 4 parameters per feature of the previous layer: a learned scale (gamma) and offset (beta), which are trainable, plus a moving mean and moving variance, which are updated during training rather than learned by backpropagation. These parameters ensure the information is properly propagated and backpropagated.

    So 4 * 512 = 2048, which should answer your question.
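    The count above can be checked with plain arithmetic. A minimal sketch (no TensorFlow needed; the variable names are illustrative, not Keras API names):

    ```python
    # BatchNormalization after a Dense(512) layer keeps 4 vectors,
    # each of length equal to the number of features (512):
    #   gamma (scale), beta (offset)      -> trainable parameters
    #   moving_mean, moving_variance      -> non-trainable, updated during training
    features = 512

    trainable = 2 * features        # gamma + beta
    non_trainable = 2 * features    # moving mean + moving variance
    total = trainable + non_trainable

    print(total)  # 2048, matching the parameter count in model.summary()
    ```

    Note that a Keras model summary reports only gamma and beta (2 * 512 = 1024) as trainable; the other 1024 appear under non-trainable parameters.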
