Why is the number of parameters associated with the BatchNormalization layer 2048?

孤城傲影 2021-01-31 18:48

I have the following code.

import keras

x = keras.layers.Input(batch_shape=(None, 4096))
hidden = keras.layers.Dense(512, activation='relu')(x)
hidden = keras.layers.BatchNormalization()(hidden)

Why does this BatchNormalization layer have 2048 parameters?

2 Answers
  •  后悔当初
    2021-01-31 19:35

    These 2048 parameters are in fact [gamma (trainable), beta (trainable), moving_mean (non-trainable), moving_variance (non-trainable)], each a vector of 512 elements (the size of the layer's input, i.e. the output of the preceding Dense(512) layer), so 4 × 512 = 2048.
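    A minimal sketch to verify this breakdown (assuming tf.keras; the variable names `bn` and `model` are just for illustration):

    from tensorflow import keras

    x = keras.layers.Input(batch_shape=(None, 4096))
    hidden = keras.layers.Dense(512, activation='relu')(x)
    bn = keras.layers.BatchNormalization()
    hidden = bn(hidden)
    model = keras.Model(inputs=x, outputs=hidden)

    # Four vectors of shape (512,): gamma, beta, moving_mean, moving_variance.
    for w in bn.weights:
        print(w.name, w.shape)

    print(bn.count_params())              # 4 * 512 = 2048
    print(len(bn.trainable_weights))      # 2 (gamma, beta)
    print(len(bn.non_trainable_weights))  # 2 (moving_mean, moving_variance)

    The two non-trainable vectors still count toward the layer's total parameters, which is why the summary reports 2048 rather than 1024.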
