Why is a bias neuron necessary for a backpropagating neural network that recognizes the XOR operator?

Kiril

It's possible to create a neural network without a bias neuron, and it would work just fine. But for more information, I recommend you see the answers to this question:

Role of Bias in Neural Networks

Update: the role of the bias neuron in a neural net that models XOR is to minimize the size of the net. Usually, for "primitive" (not sure if this is the correct term) logic functions such as AND, OR, NAND, etc., you would create a neural network with 2 input neurons, 2 hidden neurons and 1 output neuron. This can't be done for XOR, because the simplest way you can model an XOR is with NAND gates:

[diagram: XOR built from four NAND gates]

You can consider A and B as your input neurons; the gate in the middle is your "bias" neuron, the two gates following it are your "hidden" neurons, and finally you have the output neuron. You can solve XOR without a bias neuron, but then you would need a minimum of 3 hidden neurons; in that case, the 3rd neuron essentially acts as a bias neuron. Here is another question that discusses the bias neuron with regard to XOR: XOR problem solvable with 2x2x1 neural network without bias?
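To make the construction concrete, here is a minimal sketch of a 2-2-1 network with biases that computes XOR. The weights are hand-picked for illustration rather than learned by backpropagation, and I'm using a step activation and the equivalent decomposition XOR(a, b) = OR(a, b) AND NAND(a, b) instead of the four-NAND circuit above, since that decomposition fits the 2-2-1 topology directly; the function names and values are my own choices, not from the original answer:

```python
import numpy as np

def step(z):
    """Heaviside step activation: 1 where z > 0, else 0."""
    return (z > 0).astype(int)

def xor_net(x):
    """2-2-1 network with biases computing XOR on rows of x.

    hidden 1 computes OR(a, b)    = step(a + b - 0.5)
    hidden 2 computes NAND(a, b)  = step(1.5 - a - b)
    output   computes AND(h1, h2) = step(h1 + h2 - 1.5)
    """
    W1 = np.array([[ 1.0, -1.0],
                   [ 1.0, -1.0]])   # input -> hidden weights
    b1 = np.array([-0.5,  1.5])     # hidden biases (set each neuron's threshold)
    W2 = np.array([ 1.0,  1.0])     # hidden -> output weights
    b2 = -1.5                       # output bias
    h = step(x @ W1 + b1)
    return step(h @ W2 + b2)

inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
print(xor_net(inputs))  # [0 1 1 0]
```

Notice what the biases buy you: each neuron's decision boundary can sit anywhere in the input space. If you force all biases to zero, every hidden neuron's boundary must pass through the origin, which is exactly the limitation the extra third hidden neuron has to compensate for.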
