Question
I am trying to build a fairly complicated computational graph in TensorFlow and to compute symbolic gradients with respect to the function's parameters. However, I run into trouble when my function/graph involves gather operations on some of those parameters. The problem is that the gradient returned by Session.run is not a plain tensor but an IndexedSlices object, and I don't know how to properly convert it to a tensor.
Here is a toy example that illustrates the issue:
import tensorflow as tf
import numpy as np
from tensorflow.python.ops import gradients_impl as GI
T_W = tf.placeholder(tf.float32, [2], 'W') # parameter vector
T_data = tf.placeholder(tf.float32, [10], 'data') # data vector
T_Di = tf.placeholder(tf.int32, [10], 'Di') # indices vector
T_pred = tf.gather(T_W,T_Di)
T_loss = tf.reduce_sum(tf.square(T_data-T_pred)) # loss function
T_grad = tf.gradients(T_loss,[T_W])
#T_grad=GI._IndexedSlicesToTensor(T_grad)
init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    feed_dict = {T_W: [1., 2.],
                 T_data: np.arange(10)**2,
                 T_Di: np.arange(10) % 2}
    dl, dgrad = sess.run([T_loss, T_grad], feed_dict=feed_dict)
    grad = np.array(dgrad)
    print(grad)
Which outputs
[[array([ 4., 4., -4., -12., -28., -44., -68., -92., -124.,
-156.], dtype=float32)
array([0, 1, 0, 1, 0, 1, 0, 1, 0, 1], dtype=int32)
array([2], dtype=int32)]]
Here, instead of a gradient that should be a vector of two elements, I get this IndexedSlices object.
I see that the internal module tensorflow.python.ops.gradients_impl has some kind of internal converter, _IndexedSlicesToTensor, but I find it weird that there is no 'official' way to get the gradient as a tensor. In Theano, for example, there was no such issue.
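For context, the three arrays printed above are just the sparse encoding that IndexedSlices uses: the per-element gradient values, the index of the parameter each value belongs to, and the dense shape of the parameter. A minimal NumPy sketch (using the numbers from the printed output; np.add.at is a stand-in for the scatter-add that densification performs) shows how this collapses to the expected two-element gradient:

```python
import numpy as np

# The three arrays printed above: per-element gradients, the index of
# T_W each one belongs to, and the dense shape of T_W.
values = np.array([4., 4., -4., -12., -28., -44., -68., -92., -124., -156.],
                  dtype=np.float32)
indices = np.array([0, 1, 0, 1, 0, 1, 0, 1, 0, 1])
dense_shape = (2,)

# Densify: scatter-add each slice into its position in the dense gradient.
dense_grad = np.zeros(dense_shape, dtype=np.float32)
np.add.at(dense_grad, indices, values)
print(dense_grad)  # [-220. -300.]
```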
Answer 1:
The answer was very simple. I just needed to use the tf.convert_to_tensor() function:
T_grad = tf.gradients(T_loss,[T_W])
T_grad = tf.convert_to_tensor(T_grad[0])
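Note that tf.gradients returns a list (one entry per tensor in its second argument), and the entries can be a mix of plain tensors and IndexedSlices, so in general each entry needs converting. A framework-agnostic sketch of that conversion (the densify helper and the Slices namedtuple are hypothetical stand-ins, illustrating in NumPy what tf.convert_to_tensor does to an IndexedSlices):

```python
from collections import namedtuple
import numpy as np

# Hypothetical stand-in for tf.IndexedSlices: (values, indices, dense_shape).
Slices = namedtuple("Slices", ["values", "indices", "dense_shape"])

def densify(grad):
    """Return a dense array; scatter-add gradients given in slice form."""
    if not isinstance(grad, Slices):
        return np.asarray(grad)  # already dense, pass through
    out = np.zeros(grad.dense_shape, dtype=np.float32)
    np.add.at(out, grad.indices, grad.values)
    return out

# A mixed gradient list, like what tf.gradients can return:
grads = [Slices(values=np.array([1., 2., 3.], dtype=np.float32),
                indices=np.array([0, 1, 0]),
                dense_shape=(2,)),
         np.array([5., 6.], dtype=np.float32)]
dense = [densify(g) for g in grads]
print(dense[0])  # [4. 2.]
```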
Source: https://stackoverflow.com/questions/50301565/function-gradients-with-gather-operations-in-tensorflow