Keras: retrieve value of node before activation function

Asked by 感动是毒, 2021-02-07 00:48

Imagine a fully-connected neural network with its last two layers of the following structure:

[Dense]
    units = 612
    activation = softplus

[Dense]
    units = 1
    activation = sigmoid

How can I retrieve the value of this last node before the sigmoid activation is applied?
5 Answers
  • 2021-02-07 01:16

    Since you're using get_value(), I'll assume that you're using the Theano backend. To get the value of the node before the sigmoid activation, you can traverse the computation graph.

    The graph can be traversed starting from outputs (the result of some computation) down to its inputs using the owner field.

    In your case, what you want is the input x of the sigmoid activation op. The output of the sigmoid op is model.output. Putting these together, the variable x is model.output.owner.inputs[0].

    If you print out this value, you'll see Elemwise{add,no_inplace}.0, which is an element-wise addition op. It can be verified from the source code of Dense.call():

    def call(self, inputs):
        output = K.dot(inputs, self.kernel)
        if self.use_bias:
            output = K.bias_add(output, self.bias)
        if self.activation is not None:
            output = self.activation(output)
        return output
    

    The input to the activation function is the output of K.bias_add().

    With a small modification of your code, you can get the value of the node before activation:

    x = model.output.owner.inputs[0]
    func = K.function([model.input] + [K.learning_phase()], [x])
    print(func([test_input, 0.]))
    

    For anyone using the TensorFlow backend: use x = model.output.op.inputs[0] instead, as sketched below.
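
    A minimal sketch of that TensorFlow-backend variant, assuming the same hypothetical model and test_input as above:

    x = model.output.op.inputs[0]  # the tensor feeding the final activation op
    func = K.function([model.input] + [K.learning_phase()], [x])
    print(func([test_input, 0.]))  # 0. selects the test/inference phase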

  • 2021-02-07 01:19

    For fellow Googlers: the Keras API has changed significantly since the accepted answer was posted. The working code for extracting a layer's output before activation (with the TensorFlow backend) is:

    model = Your_Keras_Model()
    the_tensor_you_need = model.output.op.inputs[0]  # the op's inputs are indexable; if the node has several inputs, pick the right one by index
    

    In my case, the final layer was a dense layer with activation softmax, so the tensor output I needed was <tf.Tensor 'predictions/BiasAdd:0' shape=(?, 1000) dtype=float32>.
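
    To actually evaluate that tensor for a given input, here is a minimal sketch (model and test_images are hypothetical names; TF 1.x-style graph mode is assumed):

    from keras import backend as K

    pre_softmax = model.output.op.inputs[0]          # e.g. predictions/BiasAdd:0
    func = K.function([model.input], [pre_softmax])
    logits = func([test_images])[0]                  # raw values before the softmax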

  • 2021-02-07 01:28

    An easy way to define a new layer with a different activation function:

    import keras

    def change_layer_activation(layer):
        """Clone a Conv2D or Dense layer with its activation replaced by 'linear'."""

        if isinstance(layer, keras.layers.Conv2D):
            config = layer.get_config()
            config["activation"] = "linear"
            new = keras.layers.Conv2D.from_config(config)

        elif isinstance(layer, keras.layers.Dense):
            config = layer.get_config()
            config["activation"] = "linear"
            new = keras.layers.Dense.from_config(config)

        else:
            raise ValueError("expected a Conv2D or Dense layer")

        # Return the original weights too; set them on the clone once it has
        # been built (i.e. after calling it on an input tensor).
        weights = [x.numpy() for x in layer.weights]

        return new, weights
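
    A hypothetical usage sketch, swapping out a model's final layer (the model name and layer indexing are assumptions):

    from keras.models import Model

    new_layer, weights = change_layer_activation(model.layers[-1])
    pre_act = new_layer(model.layers[-2].output)  # calling the clone builds it
    new_layer.set_weights(weights)                # copy the trained weights over
    pre_act_model = Model(model.input, pre_act)   # outputs pre-activation values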
    
  • 2021-02-07 01:32

    (TF backend) Solution for Conv layers.

    I had the same question, and rewriting the model's configuration was not an option. A simple hack is to perform the layer's call function manually, which gives you control over the activation.

    Copy-paste from the Keras source, with self changed to layer. You can do the same with any other layer.

    def conv_no_activation(layer, inputs, activation=False):
    
        if layer.rank == 1:
            outputs = K.conv1d(
                inputs,
                layer.kernel,
                strides=layer.strides[0],
                padding=layer.padding,
                data_format=layer.data_format,
                dilation_rate=layer.dilation_rate[0])
        if layer.rank == 2:
            outputs = K.conv2d(
                inputs,
                layer.kernel,
                strides=layer.strides,
                padding=layer.padding,
                data_format=layer.data_format,
                dilation_rate=layer.dilation_rate)
        if layer.rank == 3:
            outputs = K.conv3d(
                inputs,
                layer.kernel,
                strides=layer.strides,
                padding=layer.padding,
                data_format=layer.data_format,
                dilation_rate=layer.dilation_rate)
    
        if layer.use_bias:
            outputs = K.bias_add(
                outputs,
                layer.bias,
                data_format=layer.data_format)
    
        if activation and layer.activation is not None:
            outputs = layer.activation(outputs)
    
        return outputs
    

    Now we need to modify the main function a little: first, identify the layer by its name; then retrieve the activations from the previous layer; finally, compute the output of the target layer.

    def get_output_activation_control(model, images, layername, activation=False):
        """Get activations for the input from the specified layer."""

        inp = model.input

        layer_id, layer = [(n, l) for n, l in enumerate(model.layers) if l.name == layername][0]
        prev_layer = model.layers[layer_id - 1]
        conv_out = conv_no_activation(layer, prev_layer.output, activation=activation)
        functor = K.function([inp] + [K.learning_phase()], [conv_out])

        return functor([images, 0.])  # 0. = test phase, to match K.learning_phase()
    

    Here is a tiny test. I'm using the VGG16 model.
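
    A hypothetical setup for the test (the image file name and size are assumptions):

    import numpy as np
    from keras.applications.vgg16 import VGG16, preprocess_input
    from keras.preprocessing import image

    vgg_model = VGG16(weights='imagenet')
    img = image.img_to_array(image.load_img('cat.jpg', target_size=(224, 224)))
    img = preprocess_input(img[np.newaxis])  # batch of one: (1, 224, 224, 3)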

    a_relu = get_output_activation_control(vgg_model, img, 'block4_conv1', activation=True)[0]
    a_no_relu = get_output_activation_control(vgg_model, img, 'block4_conv1', activation=False)[0]
    
    print(np.sum(a_no_relu < 0))
    > 245293
    

    Set all negative values to zero and compare with the results retrieved after the ReLU operation embedded in VGG16:

    a_no_relu[a_no_relu < 0] = 0
    print(np.allclose(a_relu, a_no_relu))
    > True
    
  • 2021-02-07 01:39

    I can see a simple way that just changes the model structure a little. (See at the end how to reuse an existing model, changing only its ending.)

    The advantages of this method are:

    • You don't have to guess if you're doing the right calculations
    • You don't need to care about the dropout layers and how to implement a dropout calculation
    • This is a pure Keras solution (it applies to any backend, either Theano or TensorFlow).

    There are two possible solutions below:

    • Option 1 - Create a new model from start with the proposed structure
    • Option 2 - Reuse an existing model changing only its ending

    Model structure

    You could just split the last dense layer into two layers at the end:

    [Dense]
        units = 612
        activation = softplus
    
    [Dense]
        units = 1
        #no activation
    
    [Activation]
        activation = sigmoid
    

    Then you simply get the output of the last dense layer.

    I'd say you should create two models, one for training, the other for checking this value.

    Option 1 - Building the models from the beginning:

    from keras.models import Model
    from keras.layers import Dense, Activation

    #build the initial part of the model the same way you would
    #add the Dense layer without an activation:

    #if using the functional Model API
        denseOut = Dense(1)(outputFromThePreviousLayer)
        sigmoidOut = Activation('sigmoid')(denseOut)

    #if using the sequential model - you will need the functional API afterwards
        model.add(Dense(1))
        sigmoidOut = Activation('sigmoid')(model.output)
    

    Create two models from that, one for training, one for checking the output of dense:

    #if using the functional API
        checkingModel = Model(yourInputs, denseOut)
    
    #if using the sequential model:
        checkingModel = model   
    
    trainingModel = Model(checkingModel.inputs, sigmoidOut)   
    

    Use trainingModel for training normally. The two models share weights, so training one also trains the other.

    Use checkingModel just to see the outputs of the Dense layer, using checkingModel.predict(X).
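
    Putting Option 1 together, a minimal end-to-end sketch (the input size and hidden layer are assumptions):

    from keras.models import Model
    from keras.layers import Input, Dense, Activation

    inputs = Input(shape=(20,))                        # hypothetical input size
    hidden = Dense(612, activation='softplus')(inputs)
    denseOut = Dense(1)(hidden)                        # no activation here
    sigmoidOut = Activation('sigmoid')(denseOut)

    checkingModel = Model(inputs, denseOut)            # exposes pre-sigmoid values
    trainingModel = Model(inputs, sigmoidOut)          # full model for training
    trainingModel.compile(optimizer='adam', loss='binary_crossentropy')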

    Option 2 - Building this from an existing model:

    from keras.models import Model
    
    #find the softplus dense layer and get its output:
    softplusOut = oldModel.layers[indexForSoftplusLayer].output
        #or should this be the output from the dropout? Whichever comes immediately after the last Dense(1)
    
    #recreate the dense layer
    outDense = Dense(1, name='newDense', ...)(softplusOut)
    
    #create the new model
    checkingModel = Model(oldModel.inputs,outDense)
    

    It's important, since you created a new Dense layer, to get the weights from the old one:

    wgts = oldModel.layers[indexForDense].get_weights()
    checkingModel.get_layer('newDense').set_weights(wgts)
    

    In this case, training the old model will not update the last dense layer in the new model, so, let's create a trainingModel:

    outSigmoid = Activation('sigmoid')(checkingModel.output)
    trainingModel = Model(checkingModel.inputs,outSigmoid)
    

    Use checkingModel to inspect the values you want, with checkingModel.predict(X), and train the trainingModel, as sketched below.
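
    A hypothetical usage sketch (the training-data names are assumptions):

    trainingModel.compile(optimizer='adam', loss='binary_crossentropy')
    trainingModel.fit(X_train, y_train, epochs=5)

    pre_sigmoid = checkingModel.predict(X_test)  # raw Dense(1) outputs, before the sigmoid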
