What are the connections between two stacked LSTM layers?

Submitted by 回眸只為那壹抹淺笑 on 2020-06-01 05:12:17

Question


This question is similar to What's the input of each LSTM layer in a stacked LSTM network?, but focuses more on implementation details.

For simplicity, consider a structure with 4 units followed by 2 units, like the following:

from tensorflow.keras.layers import LSTM  # `model` is a Sequential model defined beforehand
model.add(LSTM(4, input_shape=input_shape, return_sequences=True))
model.add(LSTM(2, input_shape=input_shape))  # input_shape only matters for the first layer

So I know the output of LSTM_1 has length 4, but how do the 2 units in the next layer handle these 4 inputs? Are they fully connected to the next layer of nodes?

I guess they are fully connected, but I'm not sure; it is not stated in the Keras documentation.

Thanks!


Answer 1:


It's not a length of 4; it's 4 "features".

The length comes from the input shape and it never changes; there is absolutely no difference between giving a regular input to an LSTM and giving the output of one LSTM to another LSTM.

You can just look at the model's summary to see the shapes and understand what is going on. You never change the length using LSTMs.
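For example (a minimal sketch, assuming tf.keras; the input shape (10, 8), meaning 10 time steps with 8 features each, is just a made-up example):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM

model = Sequential()
model.add(LSTM(4, input_shape=(10, 8), return_sequences=True))  # hypothetical: 10 steps, 8 features
model.add(LSTM(2))
model.summary()

# The summary reports output shapes like:
#   lstm    (None, 10, 4)   -> length 10 unchanged, 4 features per step
#   lstm_1  (None, 2)       -> return_sequences=False, only the final step's 2 features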

The two LSTM layers don't communicate at all. Each one takes the length dimension and processes it recurrently, independently of the other. When one finishes and outputs a tensor, the next one takes that tensor and processes it alone, following the same rules.
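To make that concrete, here is a small sketch (again assuming tf.keras and a made-up (1, 10, 8) input): the second LSTM is simply called on the tensor the first one produced, exactly as it would be called on any other (timesteps, features) input. Its input kernel has shape (4, 4 * 2), so each of the 4 incoming features connects to every gate of both units, just as it would for any other input.

import numpy as np
from tensorflow.keras.layers import LSTM

lstm1 = LSTM(4, return_sequences=True)
lstm2 = LSTM(2)

x = np.random.rand(1, 10, 8).astype("float32")  # one made-up sequence: 10 steps, 8 features

seq = lstm1(x)    # shape (1, 10, 4): a new sequence, same length, 4 features per step
out = lstm2(seq)  # shape (1, 2): the second LSTM processes that sequence on its own

# The second LSTM's input kernel is a dense (input_dim, 4 * units) matrix,
# i.e. (4, 8) here: each of the 4 features feeds every gate of both units.
print(lstm2.get_weights()[0].shape)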



来源:https://stackoverflow.com/questions/61844967/what-is-the-connections-between-two-stacked-lstm-layers
