I've been profiling an application all day long and, having optimized a couple of bits of code, I'm left with this on my todo list. It's the activation function for a neural network.
I've seen that a lot of people around here are trying to use approximations to make Sigmoid faster. However, it's worth knowing that Sigmoid can also be expressed exactly in terms of tanh, not only exp: since tanh(x) = (e^x - e^-x) / (e^x + e^-x), it follows that 1 / (1 + e^-x) = (1 + tanh(x/2)) / 2. Calculating Sigmoid this way is around 5 times faster than with the exponential, and because this method involves no approximation, the original behaviour of Sigmoid is kept as-is.
public static double Sigmoid(double value)
{
    // 1 / (1 + e^-x) == (1 + tanh(x/2)) / 2, so this is an exact identity, not an approximation
    return 0.5d + 0.5d * Math.Tanh(value / 2.0d);
}
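If you want to verify the speedup on your own machine, here is a rough benchmark sketch; the iteration count, the input pattern, and the exp-based baseline are my own choices for illustration, not part of the original post:

using System;
using System.Diagnostics;

public static class SigmoidBenchmark
{
    // Baseline for comparison: the usual exp-based formulation.
    public static double SigmoidExp(double value)
    {
        return 1.0d / (1.0d + Math.Exp(-value));
    }

    public static double SigmoidTanh(double value)
    {
        return 0.5d + 0.5d * Math.Tanh(value / 2.0d);
    }

    public static void Main()
    {
        const int n = 10_000_000;
        double sink = 0.0d; // accumulate results so the JIT cannot elide the calls

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < n; i++)
            sink += SigmoidExp(i % 10 - 5);
        sw.Stop();
        Console.WriteLine($"exp:  {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        for (int i = 0; i < n; i++)
            sink += SigmoidTanh(i % 10 - 5);
        sw.Stop();
        Console.WriteLine($"tanh: {sw.ElapsedMilliseconds} ms (sink = {sink})");
    }
}

The exact ratio will depend on your runtime and CPU, so treat the "5 times" figure as something to measure rather than assume.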
Of course, parallelization would be the next step in improving performance, but as far as the raw per-call calculation is concerned, Math.Tanh is faster than Math.Exp.
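As a minimal sketch of what that parallelization could look like, assuming your inputs sit in a double[] (Parallel.For and the array layout are my assumptions, not something from the code above):

using System;
using System.Threading.Tasks;

public static class SigmoidParallel
{
    // Applies the tanh-based Sigmoid to every element of an array in parallel.
    // Parallel.For partitions the index range across the available cores.
    public static double[] SigmoidAll(double[] values)
    {
        var results = new double[values.Length];
        Parallel.For(0, values.Length, i =>
        {
            results[i] = 0.5d + 0.5d * Math.Tanh(values[i] / 2.0d);
        });
        return results;
    }
}

Whether this pays off depends on the array size: for small arrays the scheduling overhead can outweigh the per-element work, so it's worth benchmarking against a plain loop first.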