Multiple embedding layers in keras

Submitted by 有些话、适合烂在心里 on 2019-12-11 05:14:02

Question


With pretrained embeddings, we can specify them as weights in Keras' embedding layer. To use multiple embeddings, would specifying multiple embedding layers be suitable? i.e.

embedding_layer1 = Embedding(len(word_index) + 1,
                             EMBEDDING_DIM,
                             weights=[embedding_matrix_1],
                             input_length=MAX_SEQUENCE_LENGTH,
                             trainable=False)

embedding_layer2 = Embedding(len(word_index) + 1,
                             EMBEDDING_DIM,
                             weights=[embedding_matrix_2],
                             input_length=MAX_SEQUENCE_LENGTH,
                             trainable=False)

model.add(embedding_layer1)
model.add(embedding_layer2)

This seems to sum them up and represent them as a single layer, which is not what I am after.


Answer 1:


Here is an example of using multiple embedding layers through multiple inputs by leveraging Keras' functional API. It is for a Kaggle competition, so you'll have to read through the code. The network is fed a dictionary with a key for each data input. It's quite clever, and I was able to build a separate model using this framework that performed well.

deep-learning-support-9663
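The idea above can be sketched as follows. This is a minimal, hypothetical example, not the code from the linked competition kernel: the vocabulary size, sequence length, embedding dimension, and the random matrices standing in for `embedding_matrix_1` and `embedding_matrix_2` are all placeholders. Each pretrained embedding gets its own `Input` and `Embedding` layer, and their outputs are concatenated along the feature axis rather than stacked sequentially.

```python
import numpy as np
from tensorflow.keras.initializers import Constant
from tensorflow.keras.layers import (Concatenate, Dense, Embedding,
                                     Flatten, Input)
from tensorflow.keras.models import Model

# Placeholder sizes standing in for the question's variables.
VOCAB_SIZE = 100          # len(word_index) + 1
EMBEDDING_DIM = 8
MAX_SEQUENCE_LENGTH = 10

# Stand-ins for the two pretrained matrices; real code would load
# e.g. GloVe and word2vec weights here.
embedding_matrix_1 = np.random.rand(VOCAB_SIZE, EMBEDDING_DIM)
embedding_matrix_2 = np.random.rand(VOCAB_SIZE, EMBEDDING_DIM)

# One input per embedding (they could also share a single input
# if both embeddings index the same token sequence).
input_1 = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype="int32")
input_2 = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype="int32")

# Constant initializer plays the role of weights=[...] in the
# question's snippet; trainable=False freezes the embeddings.
emb_1 = Embedding(VOCAB_SIZE, EMBEDDING_DIM,
                  embeddings_initializer=Constant(embedding_matrix_1),
                  trainable=False)(input_1)
emb_2 = Embedding(VOCAB_SIZE, EMBEDDING_DIM,
                  embeddings_initializer=Constant(embedding_matrix_2),
                  trainable=False)(input_2)

# Concatenate along the feature axis: each token is now represented
# by a 2 * EMBEDDING_DIM vector combining both embeddings.
merged = Concatenate(axis=-1)([emb_1, emb_2])
output = Dense(1, activation="sigmoid")(Flatten()(merged))

model = Model(inputs=[input_1, input_2], outputs=output)
```

Feeding the model a list (or dictionary keyed by input name) of two integer sequence arrays yields one prediction per example, with both embeddings kept as distinct, frozen representations instead of being collapsed into one layer.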



Source: https://stackoverflow.com/questions/49805424/multiple-embedding-layers-in-keras
