Why is a bias neuron necessary for a backpropagating neural network that recognizes the XOR operator?

北海茫月 2021-01-05 16:54

I posted a question yesterday regarding issues I was having with my backpropagating neural network for the XOR operator. I did a little more work and realized that it m…

1 Answer
  • 2021-01-05 17:25

    It's possible to create a neural network without a bias neuron, and it can still work. For more background, I would recommend the answers to this question:

    Role of Bias in Neural Networks

    Update: the role of the bias neuron in a neural net that models XOR is to minimize the size of the network. Usually, for "primitive" (not sure if this is the correct term) logic functions such as AND, OR, NAND, etc., you can build a neural network with 2 input neurons, 2 hidden neurons, and 1 output neuron. This can't be done for XOR without a bias, because the simplest way to model an XOR is with NAND gates:

    [Figure: XOR circuit built from NAND gates]

    You can consider A and B as your input neurons, the gate in the middle is your "bias" neuron, the two gates following it are your "hidden" neurons, and finally you have the output neuron. You can solve XOR without a bias neuron, but it would require increasing the number of hidden neurons to a minimum of 3, and in that case the 3rd neuron essentially acts as a bias neuron. Here is another question that discusses the bias neuron with regard to XOR: XOR problem solvable with 2x2x1 neural network without bias?
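    As a concrete sketch of the 2-2-1 network described above: the weights below are hand-picked for illustration (not the result of backprop training), with each hidden neuron mimicking one of the logic gates. The constant terms are the bias inputs; they shift each sigmoid's threshold away from the origin, which is exactly what a bias-free network cannot do.

    ```python
    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def xor_net(a, b):
        # Hand-picked weights; the constant terms are the biases.
        h1 = sigmoid(20 * a + 20 * b - 10)     # behaves like OR   (bias -10)
        h2 = sigmoid(-20 * a - 20 * b + 30)    # behaves like NAND (bias +30)
        return sigmoid(20 * h1 + 20 * h2 - 30) # behaves like AND  (bias -30)

    for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(a, b, round(xor_net(a, b)))  # prints the XOR truth table
    ```

    Dropping any of the three bias terms breaks the corresponding gate, which is why a bias-free 2-2-1 network cannot represent XOR.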
