I was just thinking: how do machines interpret binary code? All I understand is that your code gets turned into 1s and 0s so the machine can understand it, but how does that actually work?
That's a huge subject you're asking about. I'd recommend the excellent book The Elements of Computing Systems for an overview of how computers and compilers are constructed in principle. It's pretty easy to follow and the exercises are fun to do. Most of it is available online at the link provided.
This question also has a few good links on the subject.
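To give a rough flavour of what "interpreting binary" means, here is a minimal sketch of the fetch-decode-execute idea in C. The instruction format and opcodes here are completely made up for the example (one byte per instruction, high nibble = opcode, low nibble = operand); real CPUs decode much richer formats directly in hardware, but the loop below is the same idea in miniature.

```c
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

/* Invented opcodes for this toy machine (not any real instruction set). */
enum { OP_HALT = 0x0, OP_LOAD = 0x1, OP_ADD = 0x2, OP_PRINT = 0x3 };

int main(void) {
    /* The "program" is just raw bytes -- the 1s and 0s. */
    uint8_t program[] = {
        0x15,   /* LOAD 5  -> acc = 5  */
        0x23,   /* ADD 3   -> acc += 3 */
        0x30,   /* PRINT   -> print acc */
        0x00,   /* HALT                 */
    };

    uint8_t acc = 0;   /* accumulator register */
    size_t  pc  = 0;   /* program counter      */

    for (;;) {
        uint8_t instr  = program[pc++];   /* fetch the next byte          */
        uint8_t opcode = instr >> 4;      /* decode: high nibble = opcode */
        uint8_t arg    = instr & 0x0F;    /* decode: low nibble = operand */

        switch (opcode) {                 /* execute */
        case OP_LOAD:  acc = arg;            break;
        case OP_ADD:   acc += arg;           break;
        case OP_PRINT: printf("%u\n", acc);  break;
        case OP_HALT:  return 0;
        }
    }
}
```

Running it prints 8: the bytes mean nothing by themselves, it's the decoding circuitry (here, the switch statement) that gives them meaning. The book walks you through building exactly this kind of machine from logic gates upward.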