LSTM+Attention in Keras Error when Initialising states to zeros

梦如初夏 2021-01-31 21:17

I am trying to implement this paper, specifically the Encoder with input attention section. In essence, this is manipulating the input sequence with attention b
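
Below is a minimal sketch of what such an input-attention encoder might look like in Keras, assuming the paper is a dual-stage / input-attention style model (e.g. DA-RNN): at each step the input features are re-weighted by attention computed from the previous LSTM hidden and cell states, and both states start as zeros. The names here (`InputAttentionEncoder`, `n_units`, `w`, `v`) are illustrative, not from the original question. Note that the zero states are built from the *dynamic* batch size via `tf.shape`; creating them from a static (possibly `None`) batch dimension is a common cause of the kind of shape error mentioned in the title, though the exact traceback is not shown here.

```python
# Minimal sketch, not the paper's exact equations.
import tensorflow as tf
from tensorflow.keras import layers


class InputAttentionEncoder(layers.Layer):
    def __init__(self, n_units, **kwargs):
        super().__init__(**kwargs)
        self.n_units = n_units
        self.cell = layers.LSTMCell(n_units)
        # Scores each driving series from [h; s] and the series itself.
        self.w = layers.Dense(n_units, activation="tanh")
        self.v = layers.Dense(1)

    def call(self, inputs):
        # inputs: (batch, T, n_features); T and n_features must be static
        # because of the Python loop below.
        time_steps = inputs.shape[1]
        n_features = inputs.shape[2]

        # Initialise both LSTM states to zeros with a *dynamic* batch size.
        batch_size = tf.shape(inputs)[0]
        h = tf.zeros((batch_size, self.n_units))
        s = tf.zeros((batch_size, self.n_units))

        # Each driving series over the whole window: (batch, n_features, T).
        x_series = tf.transpose(inputs, [0, 2, 1])

        outputs = []
        for t in range(time_steps):
            # Attention score per feature, conditioned on the previous states.
            hs = tf.concat([h, s], axis=-1)                      # (batch, 2*n_units)
            hs_tiled = tf.tile(hs[:, None, :], [1, n_features, 1])
            scores = self.v(self.w(tf.concat([hs_tiled, x_series], axis=-1)))
            alpha = tf.nn.softmax(scores[:, :, 0], axis=-1)      # (batch, n_features)

            # Re-weight the raw input at step t and advance the LSTM.
            x_tilde = alpha * inputs[:, t, :]
            _, (h, s) = self.cell(x_tilde, [h, s])
            outputs.append(h)
        return tf.stack(outputs, axis=1)                         # (batch, T, n_units)
```

Quick shape check: `InputAttentionEncoder(64)(tf.random.normal((8, 10, 5)))` should return a tensor of shape `(8, 10, 64)`.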
