I don't think the problem is teaching. Using a modern graphical debugger is not rocket science (at least not for most user-mode programs running on a single computer). The problem is with the attitudes of some people. In order to use a debugger effectively, you should:
- Admit that the bug is your fault and that select() isn't broken.
- Have the perseverance to spend a couple of nights debugging, without forgetting the previous point.
- Accept that there's no specific algorithm to follow; you have to make educated guesses and reason from what you see.
Not many non-programmers have these attitudes. At college, I have seen many friends give up after a relatively short time, bring me their code, and tell me the computer is doing something wrong. I usually tell them I trust their computer more than I trust them (this hurts some feelings, but that's the way it is).