Keras Embedding, where is the “weights” argument?

Submitted by 感情迁移 on 2021-02-18 22:33:28

Question


I have seen such kind of code as follow:

embed_word = Embedding(params['word_voc_size'], params['embed_dim'],
                       weights=[word_embed_matrix],
                       input_length=params['word_max_size'],
                       trainable=False, mask_zero=True)
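
To make the role of that matrix concrete, here is a minimal sketch in plain NumPy (no Keras) of what the layer does with the matrix passed through `weights`: an embedding layer is essentially a row lookup. The matrix and token ids below are illustrative, not taken from the question.

```python
import numpy as np

# Hypothetical pre-trained embedding matrix: a vocabulary of 4 tokens,
# each mapped to a 3-dimensional vector.
word_embed_matrix = np.array([
    [0.0, 0.0, 0.0],   # row 0: padding token (masked when mask_zero=True)
    [0.1, 0.2, 0.3],
    [0.4, 0.5, 0.6],
    [0.7, 0.8, 0.9],
])

token_ids = np.array([2, 1, 0])          # a toy input sequence of token indices
embedded = word_embed_matrix[token_ids]  # shape (3, 3): one matrix row per token
print(embedded[0])                       # → [0.4 0.5 0.6]
```

With `trainable=False`, Keras simply keeps these rows frozen during training instead of updating them by gradient descent.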

When I look up the documentation on the Keras website (https://faroit.github.io/keras-docs/2.1.5/layers/embeddings/), I don't see a weights argument:

keras.layers.Embedding(input_dim, output_dim, embeddings_initializer='uniform', embeddings_regularizer=None, activity_regularizer=None, embeddings_constraint=None, mask_zero=False, input_length=None)

So I am confused: why can we use the weights argument when it is not defined in the Keras documentation?

My Keras version is 2.1.5. I hope someone can help me.


Answer 1:


Keras' Embedding layer subclasses the Layer class (every Keras layer does). The weights attribute is implemented in this base class, so every subclass accepts a weights argument that sets this attribute. This is also why you won't find it in the documentation or in the implementation of the Embedding layer itself.

You can check the base Layer implementation here (Ctrl + F for 'weight').
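
If you prefer to stay within the documented API, the same effect can be achieved with the documented embeddings_initializer argument and a Constant initializer. This is a sketch assuming a TensorFlow 2.x Keras install; the vocabulary size and dimensions here are illustrative.

```python
import numpy as np
import tensorflow as tf

# Hypothetical pre-trained matrix: 100 tokens, 8-dimensional vectors.
word_embed_matrix = np.random.rand(100, 8).astype("float32")

# Initialize the layer's weights from the matrix via the documented
# `embeddings_initializer` argument instead of the undocumented `weights`.
embed_word = tf.keras.layers.Embedding(
    input_dim=100,
    output_dim=8,
    embeddings_initializer=tf.keras.initializers.Constant(word_embed_matrix),
    trainable=False,
    mask_zero=True,
)

out = embed_word(tf.constant([[1, 2, 3]]))
print(out.shape)  # (1, 3, 8): batch of 1 sequence of 3 tokens
```

Both routes end up calling set_weights on the layer's embedding matrix; the initializer route just has the advantage of appearing in the Embedding layer's own signature.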



Source: https://stackoverflow.com/questions/53627251/keras-embedding-where-is-the-weights-argument
