Activation function for output layer for regression models in Neural Networks

Submitted by 寵の児 on 2019-12-01 03:04:17

If you use, say, a sigmoid as the activation function in the output layer of your NN, you will never get any value less than 0 or greater than 1.

Basically, if the data you're trying to predict is distributed within that range, you might try a sigmoid output and test whether your predictions perform well on the training set.
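As an illustration (not part of the original answer), here is a minimal Keras sketch of that idea: the targets are first scaled into [0, 1] so a sigmoid output can be used for regression. The data, layer sizes, and the use of MinMaxScaler are assumptions made for the example.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical data: 200 samples, 10 features, targets outside [0, 1]
X = np.random.rand(200, 10).astype("float32")
y = (np.random.rand(200, 1).astype("float32") * 50.0)

# Scale the targets into [0, 1] so a sigmoid output can represent them
scaler = MinMaxScaler()
y_scaled = scaler.fit_transform(y)

model = keras.Sequential([
    keras.Input(shape=(10,)),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # output bounded to (0, 1)
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y_scaled, epochs=5, verbose=0)

# Map predictions back to the original target scale
y_pred = scaler.inverse_transform(model.predict(X))
```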

More generally, when predicting data you should choose an output function that represents your data in the most effective way.

Hence, if your real data does not fit a sigmoid well, you have to consider other functions (e.g. a polynomial function, a periodic function, or some combination of them), but you should also keep in mind how easily you can build the cost function and evaluate its derivatives.

For a linear-regression type of problem, you can simply create the output layer without any activation function, since we are interested in numerical values without any transformation; see the sketch below.
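A minimal Keras sketch of this setup (the input dimension and layer sizes are assumptions): the final Dense layer has no activation, so it is effectively linear and can output any real value.

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(13,)),            # e.g. 13 input features (assumption)
    layers.Dense(64, activation="relu"),
    layers.Dense(1),                     # no activation = linear output for regression
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.summary()
```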

More info:

https://machinelearningmastery.com/regression-tutorial-keras-deep-learning-library-python/

For classification, you can use sigmoid, tanh, softmax, etc.
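For illustration only, a sketch of two common classification output layers in Keras (the input dimension and class count are assumptions): a single sigmoid unit for binary classification, and a softmax layer for multi-class classification.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Binary classification: one sigmoid unit with binary cross-entropy
binary_model = keras.Sequential([
    keras.Input(shape=(20,)),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
binary_model.compile(optimizer="adam", loss="binary_crossentropy",
                     metrics=["accuracy"])

# Multi-class classification: one softmax unit per class (5 classes assumed)
multiclass_model = keras.Sequential([
    keras.Input(shape=(20,)),
    layers.Dense(32, activation="relu"),
    layers.Dense(5, activation="softmax"),
])
multiclass_model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                         metrics=["accuracy"])
```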
