How do you apply layer normalization in an RNN using tf.keras?
I would like to apply layer normalization to a recurrent neural network using tf.keras. In TensorFlow 2.0, there is a LayerNormalization class in tf.layers.experimental, but it's unclear how to use it within a recurrent layer such as LSTM at each time step (as it was designed to be used). Should I create a custom cell, or is there a simpler way? For example, applying dropout at each time step is as easy as setting the recurrent_dropout argument when creating an LSTM layer, but there is no equivalent argument for layer normalization.
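To clarify what I mean by the custom-cell option, here is a rough sketch of what I imagine it would look like (the LNSimpleRNNCell name is my own; the idea is to wrap tf.keras.layers.SimpleRNNCell with activation disabled, apply tf.keras.layers.LayerNormalization to the linear outputs, then apply the activation, and feed the cell to the generic tf.keras.layers.RNN wrapper):

```python
import tensorflow as tf

class LNSimpleRNNCell(tf.keras.layers.Layer):
    """A simple RNN cell that applies layer normalization at each time step,
    just before the activation function (hypothetical sketch)."""
    def __init__(self, units, activation="tanh", **kwargs):
        super().__init__(**kwargs)
        # state_size and output_size are required by tf.keras.layers.RNN.
        self.state_size = units
        self.output_size = units
        # Compute the linear combination only; the activation is applied
        # after layer normalization.
        self.simple_rnn_cell = tf.keras.layers.SimpleRNNCell(units,
                                                             activation=None)
        self.layer_norm = tf.keras.layers.LayerNormalization()
        self.activation = tf.keras.activations.get(activation)

    def call(self, inputs, states):
        outputs, new_states = self.simple_rnn_cell(inputs, states)
        norm_outputs = self.activation(self.layer_norm(outputs))
        return norm_outputs, [norm_outputs]

# The custom cell is wrapped in the generic RNN layer:
model = tf.keras.Sequential([
    tf.keras.layers.RNN(LNSimpleRNNCell(20), return_sequences=True,
                        input_shape=[None, 1]),
])
```

This handles a SimpleRNN-style cell, but writing the equivalent for an LSTM (with its multiple gates and two state tensors) seems considerably more involved, which is why I'm asking whether there is a simpler, built-in way.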