Wrapping TensorFlow for Use in Keras

谎友^ 2021-01-17 22:03

I'm using Keras for the rest of my project, but I'm also hoping to make use of the Bahdanau attention module that TensorFlow has implemented (see tf.contrib.seq2seq.BahdanauAttention). How can I wrap this TensorFlow module for use in a Keras model?

1 Answer
  • 2021-01-17 22:25

    Newer versions of Keras provide tf.keras.layers.AdditiveAttention(), which implements Bahdanau-style (additive) attention and should work off the shelf.
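    For example, here is a minimal sketch of dropping that layer into a functional model; the tensor shapes and input names are assumptions for illustration, not something given in the question:

        import tensorflow as tf
        from tensorflow.keras import layers

        # Assumed shapes: 64-dim decoder states as the query, encoder outputs as the value.
        query = layers.Input(shape=(None, 64), name="decoder_states")   # [batch, Tq, 64]
        value = layers.Input(shape=(None, 64), name="encoder_outputs")  # [batch, Tv, 64]

        # AdditiveAttention scores query/value pairs with the additive (Bahdanau) formula
        # and returns the attention-weighted context, shape [batch, Tq, 64].
        context = layers.AdditiveAttention()([query, value])

        model = tf.keras.Model(inputs=[query, value], outputs=context)
        model.summary()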

    Alternatively, a custom Bahdanau layer can be written in about half a dozen lines of code, as described in Custom Attention Layer in Keras; see the sketch below.
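    For reference, a sketch of such a custom layer is shown here; the class name and the units argument are illustrative, and the layer follows the standard additive score v^T tanh(W1 q + W2 k):

        import tensorflow as tf

        class BahdanauAttention(tf.keras.layers.Layer):
            """Additive attention: score = v^T tanh(W1 * query + W2 * values)."""

            def __init__(self, units):
                super().__init__()
                self.W1 = tf.keras.layers.Dense(units)
                self.W2 = tf.keras.layers.Dense(units)
                self.V = tf.keras.layers.Dense(1)

            def call(self, query, values):
                # query: [batch, dim], values (encoder outputs): [batch, Tv, dim]
                query_with_time = tf.expand_dims(query, 1)                # [batch, 1, dim]
                score = self.V(tf.nn.tanh(
                    self.W1(query_with_time) + self.W2(values)))          # [batch, Tv, 1]
                weights = tf.nn.softmax(score, axis=1)                    # [batch, Tv, 1]
                context = tf.reduce_sum(weights * values, axis=1)         # [batch, dim]
                return context, weights

    Returning the weights alongside the context vector makes it easy to visualise which encoder steps the decoder attends to at each output step.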
