Venturing a hypothesis based on my experience as a TA and aspiring CS prof:
Teaching the debugger early on would actually confuse kids with little to no programming experience more than it would help them.
Now, first of all: I completely agree that teaching students how to use a debugger would be a great thing, but I think the barrier to doing so stems from the larger, systemic problem that software engineering and computer science are not separate majors. Most CS programs require 2-4 classes where learning to code is the focus; after those, coding ability is assumed but is no longer the topic of the class.
To my main point: it's very hard to teach something under the banner of "you don't get this now, but do it because it'll be useful later." You can try, but I don't think it really works. This is an extension of the idea that people only really learn by doing, and going through the motions without understanding why is not the same as doing.
I think kids learning to code for the first time won't understand why using a debugger is more effective than inserting print statements. Think about a small-to-medium-sized script you write: would you fire up the debugger unless you hit odd behavior or a bug you couldn't work out quickly? I wouldn't; it would just slow me down. But when it comes to maintaining the huge project I work on every day, the debugger is invaluable beyond a doubt. The trouble is that by the time students reach the portion of the curriculum that requires big projects, they're no longer in a class that focuses on general coding.
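To make the contrast concrete, here's a tiny Python sketch (the function and numbers are made up for illustration). In a script this size, a one-off print answers the question immediately; the debugger only starts paying for itself when there's too much state to print.

```python
# Hypothetical toy script: two ways to answer "why is the result wrong?"

def average(scores):
    total = 0
    for s in scores:
        total += s
        # 1) Print debugging: instant for a small script, but the output
        #    scrolls away and you have to re-run after every new question.
        print(f"running total after {s}: {total}")
    # 2) Debugger: uncomment the next line and run the script; Python
    #    drops into pdb, where you can inspect `total` and `scores` and
    #    step line by line without editing the code again.
    # breakpoint()
    return total / len(scores)

print(average([90, 85, 72]))
```

For three lines of arithmetic, the prints win. For a thousand-line codebase you didn't write, re-running with new prints stops being viable, and that's the experience intro students never get.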
And all of this brings me to my one awesome idea that I think every CS prof teaching students how to code should try: instead of exclusively assigning projects, every now and then hand them a big piece of complex, unfamiliar code and ask them to fix the bugs. That would teach them how to use a debugger in exactly the setting where it pays off.
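As a sketch of what such an assignment could look like (entirely hypothetical, not from the original post), here's the kind of seeded bug where stepping through in a debugger beats sprinkling prints: the output looks almost right, and the real problem is hidden state shared between calls.

```python
# Hypothetical "fix the bugs" exercise: students are told only that one
# student's grade report is bleeding into another's, and must find out why.

def build_report(student, grades, report=[]):  # BUG: mutable default argument
    # The default list is created once and shared across every call, so
    # entries from earlier calls silently accumulate in later reports.
    for g in grades:
        report.append(f"{student}: {g}")
    return report

print(build_report("Ada", [95, 88]))
print(build_report("Grace", [91]))  # unexpectedly includes Ada's grades
```

Printing the final output just shows the wrong answer; setting a breakpoint inside build_report and inspecting `id(report)` across the two calls makes the shared-list bug obvious, which is precisely the lesson the assignment is meant to teach.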