Feature Column Pre-trained Embedding

渐次进展 2021-01-24 14:46

How to use a pre-trained embedding with tf.feature_column.embedding_column?

I want to use a pre-trained embedding in tf.feature_column.embedding_column.

2 Answers
  • 2021-01-24 15:34

    You can also wrap your array in a function like this:

    import numpy as np
    import tensorflow as tf
    
    some_matrix = np.array([[0, 1, 2], [0, 2, 3], [5, 6, 7]])
    
    def custom_init(shape, dtype=None, partition_info=None):
        # Ignore the requested shape and return the pre-trained matrix;
        # it must already have shape (vocab_size, dimension).
        return some_matrix
    
    embedding_feature = tf.feature_column.embedding_column(itemx_vocab,
                                                           dimension=3,
                                                           initializer=custom_init)
    
    

    It's a hacky way, but it does the job.
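
    A slightly safer variant of this trick (a NumPy-only sketch; the matrix values here are made up) checks the requested shape inside the initializer, so a vocabulary-size or dimension mismatch fails loudly instead of silently producing a wrong-shaped embedding table:

    ```python
    import numpy as np

    # Pre-trained matrix: one row per vocabulary entry, one column per dimension.
    some_matrix = np.array([[0, 1, 2], [0, 2, 3], [5, 6, 7]], dtype=np.float32)

    def custom_init(shape, dtype=None, partition_info=None):
        # The initializer contract: it is called with the (vocab_size, dimension)
        # shape the embedding_column expects, and must return an array of
        # exactly that shape. Verify it instead of blindly returning the matrix.
        if tuple(shape) != some_matrix.shape:
            raise ValueError(
                f"pre-trained matrix has shape {some_matrix.shape}, "
                f"but the embedding column requested {tuple(shape)}")
        return some_matrix

    print(custom_init((3, 3)).shape)  # (3, 3)
    ```

    With this guard, setting dimension=4 in the embedding column (or using a vocabulary of a different size) raises a clear error at graph-construction time.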

  • 2021-01-24 15:35

    I also raised an issue about this: https://github.com/tensorflow/tensorflow/issues/20663

    I finally found a way to solve it, although I'm not sure why the answer above is not effective. If you know why, please give me some suggestions!

    Here is the current solution. It actually comes from here: Feature Columns Embedding lookup

    code:

    itemx_vocab = tf.feature_column.categorical_column_with_vocabulary_file(
        key='itemx',
        vocabulary_file=FLAGS.vocabx)
    
    # Loads the checkpoint tensor 'w_in' and remaps its rows from the old
    # vocabulary order to the new one.
    embedding_initializer_x = tf.contrib.framework.load_embedding_initializer(
        ckpt_path='model.ckpt',
        embedding_tensor_name='w_in',
        new_vocab_size=itemx_vocab.vocabulary_size,
        embedding_dim=emb_size,
        old_vocab_file=FLAGS.vocab_emb,  # pass the flag value, not the string 'FLAGS.vocab_emb'
        new_vocab_file=FLAGS.vocabx
    )
    itemx_emb = tf.feature_column.embedding_column(itemx_vocab,
                                                   dimension=128,  # must equal emb_size above
                                                   initializer=embedding_initializer_x,
                                                   trainable=False)
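
    Conceptually, what load_embedding_initializer does is remap rows of the checkpointed matrix from the old vocabulary's order to the new one, initializing words absent from the old vocabulary some other way. A minimal NumPy sketch of that remapping idea (the vocabularies and values here are made up; the contrib helper works on checkpoint files, not in-memory arrays):

    ```python
    import numpy as np

    # Hypothetical vocabularies: the checkpoint's row order vs. the new order.
    old_vocab = ["apple", "banana", "cherry"]
    new_vocab = ["cherry", "apple", "durian"]  # "durian" was not in the old vocab

    emb_size = 4
    old_matrix = np.arange(len(old_vocab) * emb_size,
                           dtype=np.float32).reshape(len(old_vocab), emb_size)

    def remap_embeddings(old_matrix, old_vocab, new_vocab, emb_size):
        # Row i of the result holds the old vector for new_vocab[i]; words
        # missing from old_vocab get a zero row here (the real helper would
        # fall back to its initializer for those rows).
        old_index = {w: i for i, w in enumerate(old_vocab)}
        new_matrix = np.zeros((len(new_vocab), emb_size), dtype=old_matrix.dtype)
        for i, w in enumerate(new_vocab):
            if w in old_index:
                new_matrix[i] = old_matrix[old_index[w]]
        return new_matrix

    new_matrix = remap_embeddings(old_matrix, old_vocab, new_vocab, emb_size)
    ```

    Row 0 of new_matrix is the old "cherry" vector, row 1 the old "apple" vector, and row 2 stays zero; this is why old_vocab_file and new_vocab_file must both be passed, so the helper can compute the index mapping.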
    