Keras retrieve value of node before activation function

Asked by 感动是毒 on 2021-02-07 00:48 · 5 answers · 1229 views

Imagine a fully-connected neural network with its last two layers of the following structure:

[Dense]
    units = 612
    activation = softplus

[Dense]
    units = ...
    activation = sigmoid
5 Answers
  •  失恋的感觉 · 2021-02-07 01:16

    Since you're using get_value(), I'll assume that you're using Theano backend. To get the value of the node before the sigmoid activation, you can traverse the computation graph.

    The graph can be traversed starting from outputs (the result of some computation) down to its inputs using the owner field.

    In your case, what you want is the input x of the sigmoid activation op. The output of the sigmoid op is model.output. Putting these together, the variable x is model.output.owner.inputs[0].

    If you print out this value, you'll see Elemwise{add,no_inplace}.0, which is an element-wise addition op. It can be verified from the source code of Dense.call():
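The `owner`/`inputs` linkage described above can be sketched in plain Python, with no Theano installation required. The `Apply` and `Variable` classes below are hypothetical stand-ins for Theano's graph node types, kept only detailed enough to show how walking one step back from an output reaches the pre-activation node:

```python
class Apply:
    """Stand-in for a Theano Apply node: an op applied to some inputs."""
    def __init__(self, op, inputs, output_name):
        self.op = op
        self.inputs = inputs
        self.output = Variable(output_name, owner=self)

class Variable:
    """Stand-in for a Theano Variable: `owner` is the Apply node that computed it."""
    def __init__(self, name, owner=None):
        self.name = name
        self.owner = owner

x = Variable('x')
pre_act = Apply('add_bias', [x], 'pre_activation').output
out = Apply('sigmoid', [pre_act], 'output').output

# Walk one step back from the final output, exactly like
# model.output.owner.inputs[0] in the answer above:
print(out.owner.op)              # prints "sigmoid"
print(out.owner.inputs[0].name)  # prints "pre_activation"
```

In the real Theano graph the op would print as `Elemwise{add,no_inplace}.0` rather than a string, but the traversal pattern is identical.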

    def call(self, inputs):
        output = K.dot(inputs, self.kernel)
        if self.use_bias:
            output = K.bias_add(output, self.bias)
        if self.activation is not None:
            output = self.activation(output)
        return output
    

    The input to the activation function is the output of K.bias_add().
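Concretely, that pre-activation value is just `inputs @ kernel + bias`. A minimal NumPy sketch (the function name `dense_pre_activation` and the shapes are illustrative, not part of Keras):

```python
import numpy as np

def dense_pre_activation(inputs, kernel, bias):
    # Mirrors Dense.call() up to, but not including, the activation:
    # output = K.bias_add(K.dot(inputs, kernel), bias)
    return inputs @ kernel + bias

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))   # one sample, 4 features
W = rng.normal(size=(4, 3))   # kernel of a Dense(3) layer
b = rng.normal(size=(3,))     # bias

pre = dense_pre_activation(x, W, b)       # the node the answer extracts
post = 1.0 / (1.0 + np.exp(-pre))         # sigmoid applied afterwards
```

Given a trained model, the same arithmetic with `W, b = layer.get_weights()` reproduces the value that the graph traversal retrieves.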

    With a small modification of your code, you can get the value of the node before activation:

    x = model.output.owner.inputs[0]
    func = K.function([model.input] + [K.learning_phase()], [x])
    print(func([test_input, 0.]))
    

    For anyone using the TensorFlow backend: use x = model.output.op.inputs[0] instead.
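If you only need the value itself rather than the graph node, there is also a backend-independent shortcut: the sigmoid is invertible, so applying the logit function to the model's post-activation output recovers the pre-activation exactly. A NumPy sketch:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit(p):
    # Inverse of the sigmoid: log(p / (1 - p)), written with log1p
    # for better numerical behaviour near p = 1.
    return np.log(p) - np.log1p(-p)

z = np.array([-2.0, 0.0, 3.5])   # pretend pre-activation values
p = sigmoid(z)                   # what model.predict() would return
recovered = logit(p)             # matches z up to floating-point error
```

This avoids any graph traversal, but it only works because the final activation here is sigmoid; it does not generalize to non-invertible activations such as relu.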
