I have followed the steps in https://machinelearningmastery.com/return-sequences-and-return-states-for-lstms-in-keras/, but when it comes to the Bidirectional LSTM, I tried to unpack the return values into three variables (lstm, state_h, state_c) the same way as for the single LSTM, and I get an error.
The call Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input)
returns 5 tensors:
- the whole output sequence (since return_sequences=True)
- h for the forward LSTM
- c for the forward LSTM
- h for the backward LSTM
- c for the backward LSTM

The line you've posted would raise an error, because it tries to unpack the returned value into just three variables (lstm, state_h, state_c).
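In other words, an unpacking along these lines (a sketch of the posted call; input stands for whatever Keras tensor you feed in) fails with a ValueError (too many values to unpack):

# Bidirectional with return_state=True returns five tensors here,
# so unpacking them into three names raises a ValueError.
lstm, state_h, state_c = Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input)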
To correct it, simply unpack the returned value into five variables. If you want to merge the states, you can concatenate the forward and backward states with Concatenate layers:
from keras.layers import LSTM, Bidirectional, Concatenate

# input is assumed to be a Keras Input tensor defined elsewhere
lstm, forward_h, forward_c, backward_h, backward_c = Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input)
state_h = Concatenate()([forward_h, backward_h])
state_c = Concatenate()([forward_c, backward_c])
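As a quick sanity check, here is a minimal end-to-end sketch (the input shape of 10 timesteps with 32 features is only an example, and the imports assume the standalone keras package; use tensorflow.keras if that is what you have):

from keras.models import Model
from keras.layers import Input, LSTM, Bidirectional, Concatenate

inp = Input(shape=(10, 32))  # example: 10 timesteps, 32 features per step
lstm, forward_h, forward_c, backward_h, backward_c = Bidirectional(
    LSTM(128, return_sequences=True, return_state=True))(inp)

state_h = Concatenate()([forward_h, backward_h])  # (None, 256)
state_c = Concatenate()([forward_c, backward_c])  # (None, 256)

model = Model(inputs=inp, outputs=[lstm, state_h, state_c])
model.summary()  # lstm is (None, 10, 256): forward and backward outputs concatenated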