Parsing, although heavily studied, is the least important part of compiling. (Exception: you're designing your own concrete syntax and continually refining and changing the language.)
Yacc, Bison, and friends were designed for an era of machines with 64K of memory, and they're great at running fast in limited memory. But the amount of human engineering required to force a grammar into LALR(1) form is ridiculous today. Ira Baxter is right that GLR is probably the best, most flexible parsing technology, but PEGs (Parsing Expression Grammars) are also good. In either case the human engineering required is light-years ahead of the older tools.
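To make the PEG point concrete, here is a minimal sketch in plain Haskell (Haskell comes up again below). The Parser type and every name in it are my own illustration, not any particular library's API; for real work you'd reach for an actual parser-combinator or PEG library. The key thing to notice is ordered choice: no tables, no shift/reduce conflicts to massage away.

    -- A minimal PEG-style parser sketch in plain Haskell (no libraries).
    -- The Parser type and all names are illustrative, not from any tool.
    newtype Parser a = Parser { runParser :: String -> Maybe (a, String) }

    instance Functor Parser where
      fmap f (Parser p) = Parser $ \s -> fmap (\(a, rest) -> (f a, rest)) (p s)

    instance Applicative Parser where
      pure x = Parser $ \s -> Just (x, s)
      Parser pf <*> Parser pa = Parser $ \s -> do
        (f, s')  <- pf s
        (a, s'') <- pa s'
        Just (f a, s'')

    -- PEG's ordered choice: try the first alternative; only if it
    -- fails, try the second. Nothing to coerce into LALR(1) form.
    (<|>) :: Parser a -> Parser a -> Parser a
    Parser p <|> Parser q = Parser $ \s -> maybe (q s) Just (p s)

    char :: Char -> Parser Char
    char c = Parser $ \s -> case s of
      (x:xs) | x == c -> Just (c, xs)
      _               -> Nothing

    digit :: Parser Char
    digit = foldr1 (<|>) (map char ['0'..'9'])

    -- expr <- digit '+' expr / digit
    -- (Right-recursive, because PEGs disallow left recursion.)
    expr :: Parser Int
    expr = plus <|> single
      where
        single = (\d -> read [d]) <$> digit
        plus   = (\d _ e -> read [d] + e) <$> digit <*> char '+' <*> expr

    -- runParser expr "1+2+3"  ==>  Just (6, "")

The grammar reads like the grammar you meant to write, which is exactly the property the old tools make you fight for.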
Having dismissed parsing, I will now start another technology food fight :-)
Compiling mostly consists of rewriting a program over and over from one form into another, until eventually you reach assembly code or machine code. For this kind of problem you don't really want to use C or C++:
Q: (Asked of Dave Hanson when he and Chris Fraser published their amazing book on lcc) "You and Chris have spent ten years building what may be one of the most carefully engineered compilers ever made. What did you learn from the experience?"
A: "Well, C is a lousy language to write a compiler in."
I urge you to try one of the popular functional languages, like Haskell or Standard ML. People who work in this field widely believe that compilers are the "killer app" for functional languages. Algebraic data types and pattern matching are tailor-made for rewriting abstract syntax into intermediate code and intermediate code into machine code. A good place to see the power of these techniques is Andrew Appel's book Compiling With Continuations. (Appel's compiler textbook is also a good read and a very elegant design, but he doesn't always explain why the design is the way it is.)
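To give a flavor of what that looks like, here is a toy Haskell sketch; the Expr and Instr types and all the names are my own invention for illustration, not taken from any real compiler. One pass folds constants, a second lowers the tree to instructions for an imagined stack machine:

    -- A sketch of the rewrite-pass style. The Expr and Instr types
    -- are a toy illustration, not from any particular compiler.
    data Expr
      = Lit Int
      | Var String
      | Add Expr Expr
      | Mul Expr Expr
      deriving Show

    -- One rewrite pass: constant folding. Each rewrite rule is a
    -- single, readable equation over the shape of the tree.
    fold :: Expr -> Expr
    fold (Add l r) = case (fold l, fold r) of
      (Lit a, Lit b) -> Lit (a + b)
      (l', r')       -> Add l' r'
    fold (Mul l r) = case (fold l, fold r) of
      (Lit a, Lit b) -> Lit (a * b)
      (l', r')       -> Mul l' r'
    fold e = e

    -- A second pass: lower the tree to a toy stack machine.
    data Instr = Push Int | Load String | IAdd | IMul
      deriving Show

    lower :: Expr -> [Instr]
    lower (Lit n)   = [Push n]
    lower (Var x)   = [Load x]
    lower (Add l r) = lower l ++ lower r ++ [IAdd]
    lower (Mul l r) = lower l ++ lower r ++ [IMul]

    -- (lower . fold) (Mul (Add (Lit 1) (Lit 2)) (Var "x"))
    --   ==>  [Push 3, Load "x", IMul]

Each pass is just a function from one representation to the next, and with warnings on the compiler tells you when a pattern match misses a constructor, which is exactly the safety net you want when you add a new node to the AST.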