tf.gradients() returns None
ChengYeung1222 commented
I'm trying to implement Integrated Gradients on a toy example. The generated data and placeholders are defined as follows:
```python
import numpy as np
import tensorflow as tf

x_pure = np.random.randint(-10, 10, [10, 227, 227, 11])
y_test = np.sum(x_pure ** 2, axis=(1, 2, 3))

x = tf.placeholder(tf.float32, name='x_input', shape=[None, 227, 227, 11])
y = tf.placeholder(tf.float32, name='y_input', shape=[None, 1])
```
The failure occurs in this function:
```python
res = ig.build_ig(inter, stepsize, pred, num_steps=50)

def build_ig(samples, stepsizes, _output, num_steps=50):
    grads = tf.gradients(ys=_output, xs=samples)
```
In the debugger:
```
_output = Tensor("Sum:0", shape=(?,), dtype=float32)
samples = Tensor("Reshape:0", shape=(?, 227, 227, 11), dtype=float32)
grads   = <class 'list'>: [None]
```
Since tf.gradients() returns [None], the next operation cannot be performed. Where should I even start looking to fix this error?
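For context: in TF 1.x, tf.gradients() yields [None] whenever xs does not lie on the computational path to ys, which matches the debugger output above (samples is a Reshape:0 tensor, while pred is built directly from the x placeholder). A minimal sketch reproducing the symptom, assuming TF 1.x as in the question:

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 227, 227, 11])
pred = tf.reduce_sum(tf.pow(x, 2), [1, 2, 3])

# A tensor that is NOT on the path from x to pred -- a hypothetical
# stand-in for the "Reshape:0" samples tensor from the debugger output.
unrelated = tf.placeholder(tf.float32, shape=[None, 227 * 227 * 11])
samples = tf.reshape(unrelated, [-1, 227, 227, 11])

print(tf.gradients(ys=pred, xs=samples))  # [None] -- pred does not depend on samples
print(tf.gradients(ys=pred, xs=x))        # [<tf.Tensor ...>] -- gradient exists
```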
ChengYeung1222 commented
`pred` is computed as `pred = tf.reduce_sum(tf.pow(x, 2), [1, 2, 3])`. I can't tell which operation isn't differentiable...
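Every op in that expression (tf.pow, tf.reduce_sum) is in fact differentiable, so differentiability is not the issue. A quick numerical check, as a sketch on a deliberately tiny shape, confirms the gradient w.r.t. x evaluates to 2*x, pointing to graph connectivity rather than a non-differentiable op:

```python
import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 2, 2, 1])  # tiny shape for a quick check
pred = tf.reduce_sum(tf.pow(x, 2), [1, 2, 3])
(grad,) = tf.gradients(ys=pred, xs=x)  # x is connected to pred, so this is a real tensor

with tf.Session() as sess:
    x_val = np.random.randn(3, 2, 2, 1).astype(np.float32)
    g = sess.run(grad, feed_dict={x: x_val})
    print(np.allclose(g, 2 * x_val))  # True: d/dx sum(x^2) = 2x
```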
ChengYeung1222 commented
Problem solved. See the answer on Stack Overflow.
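The resolution itself is only linked, not quoted in the thread. A common fix for this class of error, offered here as an assumption rather than the thread's actual answer, is to differentiate with respect to the same placeholder the model graph was built from and feed the interpolated samples through it:

```python
import numpy as np
import tensorflow as tf

# Illustrative Integrated Gradients sketch (assumed fix, all names
# illustrative): take gradients w.r.t. the placeholder x that pred
# was built from, so tf.gradients never sees a disconnected tensor.
x = tf.placeholder(tf.float32, shape=[None, 227, 227, 11])
pred = tf.reduce_sum(tf.pow(x, 2), [1, 2, 3])
(grad,) = tf.gradients(ys=pred, xs=x)  # connected -> not None

num_steps = 50
baseline = np.zeros((1, 227, 227, 11), dtype=np.float32)
sample = np.random.randint(-10, 10, (1, 227, 227, 11)).astype(np.float32)

# Interpolate between baseline and sample in numpy, then feed every
# interpolation step through the x placeholder.
alphas = np.linspace(0.0, 1.0, num_steps + 1).reshape(-1, 1, 1, 1)
interpolated = baseline + alphas * (sample - baseline)

with tf.Session() as sess:
    grads = sess.run(grad, feed_dict={x: interpolated})
    avg_grad = grads.mean(axis=0)
    ig = (sample[0] - baseline[0]) * avg_grad  # integrated gradients attribution
```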