I like this definition:
In recursion, a routine solves a small part of a problem itself, divides the problem into smaller pieces, and then calls itself to solve each of the smaller pieces.
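To make that definition concrete, here is a minimal sketch in Python (my own illustration, not part of the quoted definition): a directory-tree walk fits the pattern naturally, because the routine handles the trivial case itself, splits the rest into smaller pieces, and calls itself on each piece.

    import os

    def total_size(path):
        """Total size in bytes of a file or of a whole directory tree."""
        # Solve the small part of the problem directly: a plain file.
        if os.path.isfile(path):
            return os.path.getsize(path)
        # Divide the problem into smaller pieces (the directory's entries)
        # and call the same routine on each piece.
        return sum(total_size(os.path.join(path, entry))
                   for entry in os.listdir(path))

Unlike factorial, there is no obvious loop that replaces this cleanly, which is the kind of case where recursion earns its keep.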
I also like Steve McConnell's discussion of recursion in Code Complete, where he criticises the examples used in computer science books to teach recursion.
Don't use recursion for factorials or Fibonacci numbers
One problem with computer-science textbooks is that they present silly examples of recursion. The typical examples are computing a factorial or computing a Fibonacci sequence. Recursion is a powerful tool, and it's really dumb to use it in either of those cases. If a programmer who worked for me used recursion to compute a factorial, I'd hire someone else.
I thought this was a very interesting point to raise, and it may be a reason why recursion is often misunderstood.
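To make the contrast concrete (my sketch, not code from Code Complete), a plain loop computes a factorial just as clearly and with no call stack at all, which is presumably McConnell's point:

    def factorial(n):
        """Iterative factorial: a simple loop does the job, no recursion needed."""
        result = 1
        for i in range(2, n + 1):
            result *= i
        return result

The recursive version adds a stack frame per step and risks overflow for large n, while buying no clarity over the loop.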
EDIT:
This was not a dig at Dav's answer; I had not seen that reply when I posted this.