Add attention layer to Seq2Seq model
Question: I have built a Seq2Seq encoder-decoder model, and I want to add an attention layer to it. I tried adding an attention layer through this, but it didn't help. Here is my initial code without attention:

```python
from tensorflow.keras.layers import Input, Embedding, LSTM

# Encoder
encoder_inputs = Input(shape=(None,))
enc_emb = Embedding(num_encoder_tokens, latent_dim, mask_zero=True)(encoder_inputs)
encoder_lstm = LSTM(latent_dim, return_state=True)
encoder_outputs, state_h, state_c = encoder_lstm(enc_emb)
# We discard `encoder_outputs` and only keep the states.
```
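For context, here is a minimal sketch of one way an attention layer could be wired into this kind of model, using Keras's built-in dot-product `Attention` layer. Since only the encoder appears above, the decoder side, `num_decoder_tokens`, and the concrete dimension values are assumptions; the key change is that the encoder must return its full output sequence (`return_sequences=True`) so attention has something to attend over:

```python
from tensorflow.keras.layers import (Input, Embedding, LSTM, Dense,
                                     Attention, Concatenate)
from tensorflow.keras.models import Model

# Hypothetical sizes so the sketch is self-contained.
num_encoder_tokens = 5000
num_decoder_tokens = 5000
latent_dim = 256

# Encoder: return_sequences=True exposes every timestep to the attention layer.
encoder_inputs = Input(shape=(None,))
enc_emb = Embedding(num_encoder_tokens, latent_dim, mask_zero=True)(encoder_inputs)
encoder_outputs, state_h, state_c = LSTM(
    latent_dim, return_sequences=True, return_state=True)(enc_emb)

# Decoder, initialized with the encoder's final states.
decoder_inputs = Input(shape=(None,))
dec_emb = Embedding(num_decoder_tokens, latent_dim, mask_zero=True)(decoder_inputs)
decoder_outputs = LSTM(latent_dim, return_sequences=True)(
    dec_emb, initial_state=[state_h, state_c])

# Dot-product (Luong-style) attention: query = decoder outputs,
# value = encoder outputs. Output is a context vector per decoder step.
context = Attention()([decoder_outputs, encoder_outputs])

# Concatenate each context vector with the decoder output for that step,
# then project to the target vocabulary.
decoder_concat = Concatenate()([decoder_outputs, context])
outputs = Dense(num_decoder_tokens, activation='softmax')(decoder_concat)

model = Model([encoder_inputs, decoder_inputs], outputs)
# Assumes integer token-id targets; use one-hot targets with
# 'categorical_crossentropy' instead if that matches your pipeline.
model.compile(optimizer='rmsprop', loss='sparse_categorical_crossentropy')
```

In this sketch the states are no longer discarded outright: the final states still seed the decoder, while the full `encoder_outputs` sequence feeds the attention layer.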