I'm using Keras for the rest of my project, but I'm also hoping to make use of the Bahdanau attention module that TensorFlow has implemented (see tf.contrib.seq2seq.BahdanauAttention).
The newer version of Keras uses tf.keras.layers.AdditiveAttention(). This should work off the shelf.
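A minimal sketch of using the built-in layer, assuming toy shapes (batch of 2, query length 4, value length 6, feature dim 8) just for illustration:

```python
import tensorflow as tf

# Toy inputs: query (batch, Tq, dim) and value (batch, Tv, dim).
query = tf.random.normal((2, 4, 8))
value = tf.random.normal((2, 6, 8))

# AdditiveAttention implements Bahdanau-style (additive) attention.
attention = tf.keras.layers.AdditiveAttention()

# Output has shape (batch, Tq, dim): one context vector per query timestep.
context = attention([query, value])
print(context.shape)
```

By default the layer uses the values as keys; pass a third tensor in the list if you need separate keys.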
Alternatively, a custom Bahdanau attention layer can be written in half a dozen lines of code, as shown in Custom Attention Layer in Keras.
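A sketch of such a custom layer, following the standard additive-attention formulation (the layer and variable names here are illustrative, not from the linked post):

```python
import tensorflow as tf

class BahdanauAttention(tf.keras.layers.Layer):
    """Additive (Bahdanau) attention: score = V * tanh(W1*query + W2*values)."""

    def __init__(self, units):
        super().__init__()
        self.W1 = tf.keras.layers.Dense(units)  # projects the query
        self.W2 = tf.keras.layers.Dense(units)  # projects the values
        self.V = tf.keras.layers.Dense(1)       # scores each timestep

    def call(self, query, values):
        # query: (batch, hidden); values: (batch, timesteps, hidden)
        query_with_time_axis = tf.expand_dims(query, 1)
        score = self.V(tf.nn.tanh(
            self.W1(query_with_time_axis) + self.W2(values)))
        # Normalize scores over the time axis.
        attention_weights = tf.nn.softmax(score, axis=1)
        # Weighted sum of the values gives the context vector.
        context_vector = tf.reduce_sum(attention_weights * values, axis=1)
        return context_vector, attention_weights

layer = BahdanauAttention(units=10)
context, weights = layer(tf.random.normal((2, 8)), tf.random.normal((2, 6, 8)))
```

Returning the attention weights alongside the context vector makes it easy to visualize what the decoder attends to at each step.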