how can i get both the final hidden state and the full sequence from an LSTM layer when using a Bidirectional wrapper

野的像风 2021-01-04 13:23

I have followed the steps in https://machinelearningmastery.com/return-sequences-and-return-states-for-lstms-in-keras/, but when it comes to the Bidirectional LSTM, I tried the same three-variable unpacking (lstm, state_h, state_c = ...) and it raises an error.

1 Answer
  • 2021-01-04 13:40

    The call Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input) returns 5 tensors:

    1. The entire sequence of hidden states; by default this is the concatenation of the forward and backward states at each timestep.
    2. The last hidden state h for the forward LSTM
    3. The last cell state c for the forward LSTM
    4. The last hidden state h for the backward LSTM
    5. The last cell state c for the backward LSTM

    The line you've posted would raise an error since you want to unpack the returned value into just three variables (lstm, state_h, state_c).

    To correct it, simply unpack the returned value into 5 variables. If you want to merge the states, you can concatenate the forward and backward states with a Concatenate layer:

    lstm, forward_h, forward_c, backward_h, backward_c = Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input)
    state_h = Concatenate()([forward_h, backward_h])
    state_c = Concatenate()([forward_c, backward_c])
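
    The snippet above can be sketched as a complete, runnable example. This is a minimal sketch assuming TensorFlow 2.x Keras; the input shape (10, 8) — 10 timesteps, 8 features — is an arbitrary placeholder, not from the original question. It also shows the resulting tensor shapes, which confirm the sequence output and the merged states both carry 2 × 128 = 256 units:

    ```python
    import tensorflow as tf
    from tensorflow.keras.layers import Input, LSTM, Bidirectional, Concatenate

    # Hypothetical input: batches of 10 timesteps with 8 features each.
    inputs = Input(shape=(10, 8))

    # Unpack all 5 returned tensors from the Bidirectional LSTM.
    lstm, forward_h, forward_c, backward_h, backward_c = Bidirectional(
        LSTM(128, return_sequences=True, return_state=True)
    )(inputs)

    # Merge the per-direction final states into single tensors.
    state_h = Concatenate()([forward_h, backward_h])
    state_c = Concatenate()([forward_c, backward_c])

    print(lstm.shape)       # (None, 10, 256) — forward + backward concatenated
    print(forward_h.shape)  # (None, 128)     — one direction only
    print(state_h.shape)    # (None, 256)     — merged final hidden state
    ```

    Note that the full sequence `lstm` is already merged (the wrapper's default merge_mode is 'concat'), so only the final states need explicit concatenation.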
    