How to import word2vec into TensorFlow Seq2Seq model?

太阳男子 2021-02-11 05:23

I am playing with the TensorFlow sequence-to-sequence translation model. I was wondering if I could import my own word2vec vectors into this model, rather than using its original 'dense representation'?

2 Answers
    醉话见心 2021-02-11 06:21

    I guess with the scope style, which Matthew mentioned, you can get the variable (vocab_size and embedding_size below are placeholders that must match the shape the model was built with):

        with tf.variable_scope("embedding_attention_seq2seq"):
            with tf.variable_scope("RNN"):
                with tf.variable_scope("EmbeddingWrapper", reuse=True):
                    # To keep the vectors frozen, the variable has to be
                    # created with trainable=False in the first place.
                    embedding = tf.get_variable(
                        "embedding", [vocab_size, embedding_size])
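
    Once you have the variable, the injection itself could be a minimal sketch like this, assuming `pretrained` is a float32 NumPy array of shape [vocab_size, embedding_size] built from your word2vec model (the names here are illustrative, not from the original answer):

        import tensorflow as tf

        # `pretrained` is a hypothetical [vocab_size, embedding_size] array
        # filled from a word2vec model (see the loop sketched further down).
        assign_op = embedding.assign(pretrained)
        with tf.Session() as sess:
            sess.run(tf.global_variables_initializer())
            sess.run(assign_op)  # overwrite the random init with word2vec rows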
    

    Also, I would imagine you would want to inject embeddings into the decoder as well; the key (or scope) for it would be something like:

    "embedding_attention_seq2seq/embedding_attention_decoder/embedding"


    Thanks for your answer, Lukasz!

    I was wondering, what exactly does model.vocab[word] stand for in the code snippet? Just the position of the word in the vocabulary?

    In that case, wouldn't it be faster to iterate through the vocabulary and inject w2v vectors only for the words that exist in the w2v model?
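
    For what it's worth, that loop could look roughly like this, assuming a gensim word2vec model with the pre-4.0 API (where model.vocab[word].index is the word's row in word2vec's own matrix) and a hypothetical rev_vocab list mapping each seq2seq row index to its word:

        import numpy as np
        from gensim.models import KeyedVectors

        # Hypothetical path and vocabulary list.
        w2v = KeyedVectors.load_word2vec_format("vectors.bin", binary=True)
        embedding_size = 300  # must match the seq2seq model's embedding size
        pretrained = np.random.uniform(
            -0.1, 0.1, (len(rev_vocab), embedding_size)).astype(np.float32)
        for idx, word in enumerate(rev_vocab):
            if word in w2v.vocab:        # inject only words word2vec knows
                pretrained[idx] = w2v[word]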
