Keras: how to use max_value in the ReLU activation function

Submitted by 不打扰是莪最后的温柔 on 2019-12-05 04:16:03

You can use the ReLU function of the Keras backend. Therefore, first import the backend:

from keras import backend as K

Then, you can pass your own function as activation using backend functionality. This would look like

def relu_advanced(x):
    return K.relu(x, max_value=250)

Then you can use it like

model.add(Dense(512, input_dim=1, activation=relu_advanced))

or

model.add(Activation(relu_advanced))
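
For reference, a minimal end-to-end sketch of this approach (the second layer, optimizer, and dummy data are illustrative additions, not part of the original answer):

import numpy as np
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense

def relu_advanced(x):
    # ReLU clipped at 250, as above
    return K.relu(x, max_value=250)

model = Sequential()
model.add(Dense(512, input_dim=1, activation=relu_advanced))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')

# Dummy data just to confirm the model builds and trains
X = np.random.rand(32, 1)
y = np.random.rand(32, 1)
model.fit(X, y, epochs=1, verbose=0)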

Unfortunately, this way you must hard-code the additional arguments. It is therefore better to use a factory function that returns your activation with the custom values baked in:

def create_relu_advanced(max_value=1.):        
    def relu_advanced(x):
        return K.relu(x, max_value=K.cast_to_floatx(max_value))
    return relu_advanced

Then you can pass your arguments by either

model.add(Dense(512, input_dim=1, activation=create_relu_advanced(max_value=250)))

or

model.add(Activation(create_relu_advanced(max_value=250)))
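
A quick sanity check of the factory version might look like this (the sample values are made up; it only confirms that outputs above 250 are clipped, assuming create_relu_advanced from above):

from keras import backend as K

clipped = create_relu_advanced(max_value=250)
out = K.eval(clipped(K.constant([-10., 100., 300., 1000.])))
print(out)  # expected: [  0. 100. 250. 250.]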
Hongye Yang

This is what I did, using a Lambda layer to implement a clipped ReLU. Step 1: define a function that performs the clipping:

def reluclip(x, max_value=20):
    return K.relu(x, max_value=max_value)

Step 2: add a Lambda layer to the model: y = Lambda(function=reluclip)(y)
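
Put together, a small functional-API sketch of this Lambda approach could look like the following (input shape and layer sizes are placeholders):

from keras import backend as K
from keras.layers import Input, Dense, Lambda
from keras.models import Model

def reluclip(x, max_value=20):
    return K.relu(x, max_value=max_value)

inputs = Input(shape=(10,))
y = Dense(64)(inputs)               # linear layer, no activation yet
y = Lambda(function=reluclip)(y)    # apply the clipped ReLU
outputs = Dense(1)(y)
model = Model(inputs=inputs, outputs=outputs)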

This is as easy as a single lambda:

from keras.activations import relu
clipped_relu = lambda x: relu(x, max_value=3.14)

Then use it like this:

model.add(Conv2D(64, (3, 3)))
model.add(Activation(clipped_relu))

When loading a model saved to HDF5, pass the lambda in the custom_objects dictionary:

model = load_model(model_file, custom_objects={'<lambda>': clipped_relu})
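
A possible save/load round trip, assuming the model above (the file name is a hypothetical placeholder):

from keras.models import load_model

# '<lambda>' matches the name Keras records for an anonymous lambda activation
model.save('model_with_clipped_relu.h5')
model = load_model('model_with_clipped_relu.h5',
                   custom_objects={'<lambda>': clipped_relu})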

The snippet below is tested and works:

import keras

def clip_relu(x):
    return keras.activations.relu(x, max_value=1.)

predictions = Dense(num_classes, activation=clip_relu, name='output')
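
To actually wire this into a functional-API model, a sketch could look like this (num_classes, the input shape, and the hidden layer size are placeholders):

from keras.layers import Input, Dense
from keras.models import Model

num_classes = 10  # placeholder
inputs = Input(shape=(784,))
x = Dense(128, activation=clip_relu)(inputs)
predictions = Dense(num_classes, activation=clip_relu, name='output')(x)
model = Model(inputs=inputs, outputs=predictions)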