Issue feeding a list into feed_dict in TensorFlow

[愿得一人] 2020-12-04 19:54

I'm trying to pass a list into feed_dict; however, I'm having trouble doing so. Say I have:

inputs = 10 * [tf.placeholder(tf.float32, shape=(batch_size, input_size))]


        
3 Answers
  • 2020-12-04 20:16

    Here is a correct example:

    import numpy as np
    import tensorflow as tf

    batch_size, input_size, n = 2, 3, 2
    # in your case n = 10
    x = tf.placeholder(tf.float32, shape=(n, batch_size, input_size))
    y = tf.add(x, x)

    data = np.random.rand(n, batch_size, input_size)

    sess = tf.Session()
    print(sess.run(y, feed_dict={x: data}))
    

    And here is a strange thing I see in your approach: for some reason you use 10 * [tf.placeholder(...)], which gives you a list of 10 entries of shape (batch_size, input_size). I see no reason to do this when you can just create one tensor of rank 3 (where the first dimension is 10).

    Because you have a list of tensors (and not a single tensor), you cannot feed your data to that list directly (whereas in my example I can feed the data straight to my tensor).
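
    As a side note (not part of the original answer), if your data already exists as a Python list of 10 arrays, np.stack can turn it into the single rank-3 array that such a placeholder expects. A minimal sketch, with assumed sizes:

    import numpy as np
    import tensorflow as tf

    n, batch_size, input_size = 10, 2, 3
    x = tf.placeholder(tf.float32, shape=(n, batch_size, input_size))
    y = tf.add(x, x)

    # a Python list of n arrays, as in the question
    data_list = [np.random.rand(batch_size, input_size) for _ in range(n)]
    data = np.stack(data_list)  # shape (n, batch_size, input_size)

    with tf.Session() as sess:
        print(sess.run(y, feed_dict={x: data}))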

  • 2020-12-04 20:21

    feed_dict can be provided by preparing a dictionary beforehand, as follows:

    n = 10
    input_1 = [tf.placeholder(...) for _ in range(n)]
    input_2 = tf.placeholder(...)
    data_1 = [np.array(...) for _ in range(n)]
    data_2 = np.array(...)
    
    
    feed_dictionary = {}
    for i in range(n):
        feed_dictionary[input_1[i]] = data_1[i]
    feed_dictionary[input_2] = data_2
    sess.run(y, feed_dict=feed_dictionary)
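
    A minimal runnable version of the same idea (the dtypes, shapes, and the y op below are illustrative assumptions, not from the original answer):

    import numpy as np
    import tensorflow as tf

    n = 10
    batch_size, input_size = 2, 3  # assumed shapes, for illustration only
    input_1 = [tf.placeholder(tf.float32, shape=(batch_size, input_size))
               for _ in range(n)]
    input_2 = tf.placeholder(tf.float32, shape=(batch_size, input_size))
    y = tf.add_n(input_1 + [input_2])  # some op that consumes all the inputs

    data_1 = [np.random.rand(batch_size, input_size) for _ in range(n)]
    data_2 = np.random.rand(batch_size, input_size)

    feed_dictionary = {}
    for i in range(n):
        feed_dictionary[input_1[i]] = data_1[i]
    feed_dictionary[input_2] = data_2

    with tf.Session() as sess:
        print(sess.run(y, feed_dict=feed_dictionary))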
    
  • 2020-12-04 20:25

    There are two issues that are causing problems here:

    The first issue is that the Session.run() call only accepts a small number of types as the keys of the feed_dict. In particular, lists of tensors are not supported as keys, so you must use each tensor as a separate key.* One convenient way to do this is with a dictionary comprehension:

    inputs = [tf.placeholder(...), ...]
    data = [np.array(...), ...]
    sess.run(y, feed_dict={i: d for i, d in zip(inputs, data)})
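
    (Equivalently, feed_dict=dict(zip(inputs, data)) builds the same mapping.)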
    

    The second issue is that the 10 * [tf.placeholder(...)] syntax in Python creates a list with ten elements, where each element is the same tensor object (i.e. has the same name property, the same id property, and is reference-identical if you compare two elements from the list using inputs[i] is inputs[j]). This explains why, when you tried to create a dictionary using the list elements as keys, you ended up with a dictionary with a single element - because all of the list elements were identical.
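
    A quick check (a sketch, not part of the original answer) makes the aliasing visible:

    import tensorflow as tf

    inputs = 10 * [tf.placeholder(tf.float32)]
    print(inputs[0] is inputs[9])        # True: the same object, repeated
    print(len({id(t) for t in inputs}))  # 1: only one distinct placeholder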

    To create 10 different placeholder tensors, as you intended, you should instead do the following:

    inputs = [tf.placeholder(tf.float32, shape=(batch_size, input_size))
              for _ in range(10)]
    

    If you print the elements of this list, you'll see that each element is a tensor with a different name.
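
    For example (the exact names depend on what else is already in the graph, but they will all differ):

    for t in inputs:
        print(t.name)  # e.g. Placeholder:0, Placeholder_1:0, ..., Placeholder_9:0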


    EDIT: * You can now pass tuples as the keys of a feed_dict, because these may be used as dictionary keys.
