Question
The activation function in my CNN has the form:
f(x) = 1.716 * tanh(0.667 * x)                                  if |x| < tau
f(x) = 1.716 * [tanh(2*tau/3) + tanh'(2*tau/3) * (x - tau)]     if x >= tau
f(x) = 1.716 * [tanh(-2*tau/3) + tanh'(-2*tau/3) * (x + tau)]   if x <= -tau

where tanh' is the derivative of tanh and tau is a constant.
I know it is possible to make your own activation function in TensorFlow, but I don't want to write it in C++ and recompile the whole of TensorFlow.
How can I build it from the functions already available in TensorFlow?
Answer 1:
In TensorFlow it is easy to write your own activation function as long as it is composed of already existing ops. For your case you can use tf.case (tou below is the threshold tau from the question):
# tanh'(u) = 1 - tanh(u)**2 gives the slope of the linear parts
f = tf.case({tf.less(tf.abs(x), tou): lambda: 1.716 * tf.tanh(0.667 * x),
             tf.greater_equal(x, tou): lambda: 1.716 * (tf.tanh(2 * tou / 3) + (1 - tf.tanh(2 * tou / 3) ** 2) * (x - tou))},
            default=lambda: 1.716 * (tf.tanh(-2 * tou / 3) + (1 - tf.tanh(-2 * tou / 3) ** 2) * (x + tou)),
            exclusive=True)
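Note that tf.case expects scalar boolean predicates, so the snippet above only works when x is a scalar. If x is a tensor and the activation should be applied element-wise, here is a minimal sketch using tf.where instead; the function name piecewise_tanh and the assumption that tou is a Python float are mine, not from the answer:

import tensorflow as tf

def piecewise_tanh(x, tou):
    # middle region: |x| < tou
    center = 1.716 * tf.tanh(0.667 * x)
    # slope of the linear tails: tanh'(u) = 1 - tanh(u)**2, identical for +tou and -tou
    slope = 1.0 - tf.tanh(2 * tou / 3) ** 2
    upper = 1.716 * (tf.tanh(2 * tou / 3) + slope * (x - tou))   # x >= tou
    lower = 1.716 * (tf.tanh(-2 * tou / 3) + slope * (x + tou))  # x <= -tou
    # pick the right branch element-wise
    return tf.where(tf.abs(x) < tou, center, tf.where(x >= tou, upper, lower))

For example, piecewise_tanh(tf.constant([-3.0, 0.1, 3.0]), 1.0) picks the lower, center, and upper branch for the three elements respectively.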
Source: https://stackoverflow.com/questions/45769719/how-to-make-a-piecewise-activation-function-with-python-in-tensorflow