Question
I have seen code like the following:
from keras.layers import Embedding

embed_word = Embedding(params['word_voc_size'], params['embed_dim'],
                       weights=[word_embed_matrix],
                       input_length=params['word_max_size'],
                       trainable=False, mask_zero=True)
When I looked up the documentation on the Keras website (https://faroit.github.io/keras-docs/2.1.5/layers/embeddings/), I didn't see a weights argument:
keras.layers.Embedding(input_dim, output_dim, embeddings_initializer='uniform', embeddings_regularizer=None, activity_regularizer=None, embeddings_constraint=None, mask_zero=False, input_length=None)
So I am confused: why can we use the weights argument when it is not defined in the Keras documentation?
My Keras version is 2.1.5. I hope someone can help me.
Answer 1:
Keras' Embedding layer subclasses the Layer class (every Keras layer does this). The weights attribute is implemented in this base class, so every subclass allows setting it through a weights argument. This is also why you won't find it in the documentation or in the implementation of the Embedding layer itself.
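To illustrate, here is a minimal sketch (assuming Keras 2.1.5; the sizes and the random word_embed_matrix are made-up placeholders) showing that the matrix passed through weights does end up as the layer's weights once it is built:

import numpy as np
from keras.layers import Input, Embedding

voc_size, embed_dim, max_len = 10, 4, 5   # hypothetical sizes
word_embed_matrix = np.random.rand(voc_size, embed_dim)

emb = Embedding(voc_size, embed_dim,
                weights=[word_embed_matrix],  # accepted via the base Layer class
                input_length=max_len,
                trainable=False)

x = Input(shape=(max_len,))
_ = emb(x)  # calling the layer builds it and applies the initial weights

assert np.allclose(emb.get_weights()[0], word_embed_matrix)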
You can check the base Layer implementation in the Keras source (Ctrl + F for 'weight').
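In case it helps, the pattern looks roughly like this; note this is a stripped-down illustration of the mechanism, not Keras' actual code:

import numpy as np

class Layer(object):
    def __init__(self, **kwargs):
        # The base class accepts 'weights' on behalf of every subclass.
        allowed_kwargs = {'name', 'trainable', 'weights'}
        for kwarg in kwargs:
            if kwarg not in allowed_kwargs:
                raise TypeError('Keyword argument not understood:', kwarg)
        # Stash the passed weights until the layer is built.
        self._initial_weights = kwargs.get('weights')
        self._weights = []

    def set_weights(self, weights):
        self._weights = list(weights)

    def get_weights(self):
        return list(self._weights)

    def __call__(self, inputs):
        self.build()
        # Once the layer is built, the stashed weights override the initializer.
        if self._initial_weights is not None:
            self.set_weights(self._initial_weights)
        return inputs

class Embedding(Layer):
    def __init__(self, input_dim, output_dim, **kwargs):
        # No explicit 'weights' parameter here; it rides along in **kwargs.
        super(Embedding, self).__init__(**kwargs)
        self.input_dim = input_dim
        self.output_dim = output_dim

    def build(self):
        # The "real" initializer: start from zeros.
        self._weights = [np.zeros((self.input_dim, self.output_dim))]

matrix = np.ones((10, 4))
emb = Embedding(10, 4, weights=[matrix])  # 'weights' is handled by Layer
emb(np.arange(5))
assert np.allclose(emb.get_weights()[0], matrix)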
Source: https://stackoverflow.com/questions/53627251/keras-embedding-where-is-the-weights-argument