I'm trying to learn about neural networks and coded a simple back-propagation neural network that uses sigmoid activation functions, random weight initialization, and learnin
My answer is not about Ruby, but about neural networks in general. First of all, you have to understand how to lay out your inputs and your network on paper. If you implement binary operators, your input space consists of four points on the XY-plane. Mark true and false on the X and Y axes and draw your four points. If you do it right, you will get something like this:
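As a sketch of those four points, here are the four inputs of any binary boolean operator together with the AND/OR/XOR value at each point (the names and layout are mine, not from the answer):

```ruby
# The four corners of the input space for a binary boolean operator.
POINTS = [[0, 0], [0, 1], [1, 0], [1, 1]]

# Truth-table row for one input point, using Ruby's bitwise operators.
def truth_row(x, y)
  { and: x & y, or: x | y, xor: x ^ y }
end

POINTS.each do |x, y|
  puts format('(%d, %d) => %s', x, y, truth_row(x, y))
end
```

Plot the points whose output is 1 against those whose output is 0 and you can see, for each operator, whether one straight line can separate them.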
Now (maybe you didn't know this interpretation of a neuron) try to draw the neuron as a line on the plane that separates your points as needed. For example, this is the line for AND: the line separates the correct answers from the incorrect ones. If you understand this, you can draw the line for OR. XOR will be trouble.
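To make the "neuron as a line" picture concrete: a single neuron fires when w1*x + w2*y + b > 0, so its decision boundary is the line w1*x + w2*y + b = 0. The specific weights below are just one choice that puts the line between (1,1) and the other three points (my pick, not from the answer), shown with a step activation for clarity:

```ruby
# A single neuron with a step activation: outputs 1 exactly when
# the point (x, y) lies on the positive side of the line
# w1*x + w2*y + b = 0.
def neuron(x, y, w1:, w2:, b:)
  w1 * x + w2 * y + b > 0 ? 1 : 0
end

# One possible AND neuron: the line x + y = 1.5 separates (1,1)
# from the three points where AND is false.
def and_gate(x, y)
  neuron(x, y, w1: 1.0, w2: 1.0, b: -1.5)
end

[[0, 0], [0, 1], [1, 0], [1, 1]].each do |x, y|
  puts "AND(#{x}, #{y}) = #{and_gate(x, y)}"
end
```

Sliding the line (changing b) to x + y = 0.5 turns the same neuron into OR; no single line works for XOR.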
And as the last step of this exercise, turn the line back into a neuron. Find some literature on it; I don't remember offhand how to build a neuron from an existing line, but it is really simple. Then build the weight vector for AND and implement it: realize AND as a single-neuron network, where the neuron is the AND you calculated on paper. If you do everything correctly, your network will compute the AND function. I wrote such a huge number of letters because you wrote the program before understanding the task. I don't want to be rough, but your mention of XOR showed it. If you try to build XOR with one neuron, you will get nothing: it is impossible to separate the correct answers from the incorrect ones with a single line. In books this is called "XOR is not linearly separable". So for XOR you need to build a two-layer network. For example, you can have OR and NAND (not-AND) as the first layer and AND as the second layer.
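The two-layer construction above can be hardcoded directly. This is a sketch with hand-picked weights (one of many workable choices, my assumption): OR and NAND neurons in the first layer, fed into an AND neuron in the second:

```ruby
# A step-activation neuron: 1 when w1*x + w2*y + b > 0.
def neuron(x, y, w1:, w2:, b:)
  w1 * x + w2 * y + b > 0 ? 1 : 0
end

# First layer: two lines, each separable on its own.
def or_gate(x, y)
  neuron(x, y, w1: 1.0, w2: 1.0, b: -0.5)   # line x + y = 0.5
end

def nand_gate(x, y)
  neuron(x, y, w1: -1.0, w2: -1.0, b: 1.5)  # line x + y = 1.5, flipped
end

# Second layer: AND of the two first-layer outputs.
def and_gate(x, y)
  neuron(x, y, w1: 1.0, w2: 1.0, b: -1.5)
end

# XOR = AND(OR, NAND): true exactly when the inputs differ.
def xor_gate(x, y)
  and_gate(or_gate(x, y), nand_gate(x, y))
end

[[0, 0], [0, 1], [1, 0], [1, 1]].each do |x, y|
  puts "XOR(#{x}, #{y}) = #{xor_gate(x, y)}"
end
```

The first layer maps the four points into a new space where the XOR-true points become linearly separable, which is exactly why one more layer fixes the problem.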
If you have read this far and understood what I wrote, you will have no trouble debugging the network. If your network fails to learn some function, build that function on paper, then hardcode it into your network and test it. If it still fails, you built it on paper incorrectly; re-read my lecture ;)
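The "hardcode it, then test it" step can itself be automated with a tiny checker: run any hand-built network over all four inputs and report where it disagrees with your paper truth table. The `check` helper and the sample networks are illustrative names of mine, not from the answer:

```ruby
# Compare a network (any callable taking x, y) against a truth table
# { [x, y] => expected }, printing each mismatch; returns true if all pass.
def check(net, truth)
  truth.map do |(x, y), expected|
    got = net.call(x, y)
    puts "(#{x}, #{y}) expected #{expected}, got #{got}" if got != expected
    got == expected
  end.all?
end

xor_truth = { [0, 0] => 0, [0, 1] => 1, [1, 0] => 1, [1, 1] => 0 }

good_net = ->(x, y) { x ^ y }  # stand-in for a correctly hardcoded network
bad_net  = ->(x, y) { x & y }  # an AND network, deliberately wrong for XOR

puts check(good_net, xor_truth) ? 'OK' : 'network disagrees with paper'
puts check(bad_net,  xor_truth) ? 'OK' : 'network disagrees with paper'
```

If the hardcoded weights pass this check but the learned weights never do, the bug is in the learning code, not in your paper construction.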