Accessing gradient values of keras model outputs with respect to inputs

Backend · Unresolved · 1 answer · 889 views
旧巷少年郎 2020-12-31 17:37

I made a pretty simple NN model to do some non-linear regressions for me in Keras, as an introduction exercise. I uploaded my jupyter notebook as a gist here (renders properly).

1 Answer
  •  生来不讨喜
    2020-12-31 18:25

    As you mention, Theano and TF are symbolic, so doing a derivative should be quite easy:

    import theano
    import theano.tensor as T
    import keras.backend as K

    # Symbolic gradient of the first output element w.r.t. the model input
    J = T.grad(model.output[0, 0], model.input)
    # Compile a callable; learning_phase() is needed because some layers
    # (e.g. Dropout) behave differently at train vs. test time
    jacobian = K.function([model.input, K.learning_phase()], [J])
    

    First you compute the symbolic gradient (T.grad) of the output with respect to the input, then you build a callable function that performs the computation. Note that sometimes this is not trivial due to shape problems: you get one derivative for each element of the input, so the full result is a Jacobian rather than a single gradient vector.
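    For more recent TensorFlow/Keras versions (TF 2.x, eager mode), the same idea can be sketched with tf.GradientTape instead of the symbolic Theano graph. The small regression model below is a hypothetical stand-in for the one in the question:

    ```python
    import numpy as np
    import tensorflow as tf

    # Hypothetical stand-in model: 3 inputs -> 1 regression output
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(3,)),
        tf.keras.layers.Dense(8, activation="tanh"),
        tf.keras.layers.Dense(1),
    ])

    x = tf.constant(np.random.rand(5, 3).astype("float32"))
    with tf.GradientTape() as tape:
        tape.watch(x)  # inputs are not trainable variables, so watch them explicitly
        y = model(x)

    # Full Jacobian: one derivative per (output element, input element) pair,
    # so its shape is y.shape + x.shape = (5, 1, 5, 3)
    jac = tape.jacobian(y, x)
    ```

    tape.jacobian handles the shape issue mentioned above by returning the complete derivative tensor; for a single per-sample gradient you can use tape.gradient(y, x) instead, which sums over the output dimensions.
    
    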
