TensorBoard Embedding Example?

谎友^ 2020-12-24 02:55

I'm looking for a TensorBoard embedding example with iris data, similar to the embedding projector at http://projector.tensorflow.org/.

But unfortunately I couldn't find one.

7 Answers
  • 2020-12-24 03:44

    I've used FastText's pre-trained word vectors with TensorBoard.

    import os
    import tensorflow as tf
    import numpy as np
    import fasttext
    # TF 1.x: the projector plugin still lives under tf.contrib
    from tensorflow.contrib.tensorboard.plugins import projector
    
    # load the pre-trained FastText model
    word2vec = fasttext.load_model('wiki.en.bin')
    
    # copy every word vector into one (num_words, dim) matrix
    embedding = np.empty((len(word2vec.words), word2vec.get_dimension()), dtype=np.float32)
    for i, word in enumerate(word2vec.words):
        embedding[i] = word2vec[word]
    
    # setup a TensorFlow session
    tf.reset_default_graph()
    sess = tf.InteractiveSession()
    X = tf.Variable([0.0], name='embedding')
    place = tf.placeholder(tf.float32, shape=embedding.shape)
    # validate_shape=False lets the tiny initial variable take on the full embedding shape
    set_x = tf.assign(X, place, validate_shape=False)
    sess.run(tf.global_variables_initializer())
    sess.run(set_x, feed_dict={place: embedding})
    
    # write one word per line as point labels for the projector
    os.makedirs('log', exist_ok=True)
    with open('log/metadata.tsv', 'w') as f:
        for word in word2vec.words:
            f.write(word + '\n')
    
    # create a TensorFlow summary writer
    summary_writer = tf.summary.FileWriter('log', sess.graph)
    config = projector.ProjectorConfig()
    embedding_conf = config.embeddings.add()
    embedding_conf.tensor_name = 'embedding:0'
    embedding_conf.metadata_path = os.path.join('log', 'metadata.tsv')
    projector.visualize_embeddings(summary_writer, config)
    
    # save the model
    saver = tf.train.Saver()
    saver.save(sess, os.path.join('log', "model.ckpt"))
    

    Then run this command in your terminal:

    tensorboard --logdir=log
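    
    As a side note for anyone on TensorFlow 2.x, where tf.contrib no longer exists: below is a minimal sketch of the same idea using the standalone tensorboard.plugins.projector API with the iris data the question mentions. It assumes tensorboard>=2.0 and scikit-learn are installed; the directory, variable, and file names are only illustrative.
    
    import os
    import numpy as np
    import tensorflow as tf
    from sklearn import datasets
    from tensorboard.plugins import projector
    
    log_dir = 'iris_log'  # illustrative log directory
    os.makedirs(log_dir, exist_ok=True)
    
    # 150 iris samples, 4 features each, used directly as the "embedding"
    iris = datasets.load_iris()
    vectors = tf.Variable(iris.data.astype(np.float32), name='iris_embedding')
    
    # one label per row so the projector can colour the points by species
    with open(os.path.join(log_dir, 'metadata.tsv'), 'w') as f:
        for label in iris.target:
            f.write(iris.target_names[label] + '\n')
    
    # the projector reads the vectors back from a checkpoint
    checkpoint = tf.train.Checkpoint(embedding=vectors)
    checkpoint.save(os.path.join(log_dir, 'embedding.ckpt'))
    
    # point the plugin at the checkpointed variable and the metadata file
    config = projector.ProjectorConfig()
    emb = config.embeddings.add()
    emb.tensor_name = 'embedding/.ATTRIBUTES/VARIABLE_VALUE'
    emb.metadata_path = 'metadata.tsv'
    projector.visualize_embeddings(log_dir, config)
    
    Running tensorboard --logdir=iris_log and opening the Projector tab should then show the 150 points, coloured by species via the metadata file.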
    