There are more than a few questions here, to my mind. Here are a few that are asked and a few that I'd infer:
I was taught in BASIC and Pascal initially, usually with an interpreter that made it easy to run the program until something blew up. We didn't have breakpoints or many of the fancy tracing tools there are now, but then this was 1983 to 1994, using a Commodore 64, Watcom BASIC, and Pascal on a Mac.
Even in my later university years, we didn't have a debugger. If our code didn't work, we used print statements or did manual tracing; time-wise, this would have been 1995-1997.
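For anyone who never had to do it, the technique is about as primitive as it sounds. Here is a minimal, made-up C sketch (my own illustration, obviously not my 1990s coursework) of debugging by printing state as the program runs:

```c
#include <stdio.h>

/* Illustrative only: print-statement debugging, no debugger involved.
   The trace lines show where the loop's state goes, iteration by iteration. */
static int sum_up_to(int n) {
    int total = 0;
    for (int i = 1; i <= n; i++) {
        total += i;
        /* the "poor man's breakpoint": dump the state every time through */
        fprintf(stderr, "TRACE sum_up_to: i=%d total=%d\n", i, total);
    }
    return total;
}

int main(void) {
    printf("sum_up_to(5) = %d\n", sum_up_to(5));  /* expect 15 */
    return 0;
}
```

It works, but every new question means adding another print, rebuilding, and rerunning, which is exactly the drudgery a debugger removes.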
One caveat with a debugger is that, for something like Visual Studio, do you have any idea how long it could take to go through every debugging feature it has? In some cases I think that could take years, and that's without getting into all the build options and other things it can do that one might use eventually. Another point is that, for all the good things a debugger gives you, there is something to be said for how complex things can get: e.g., when you hit a breakpoint in VS there are the call stack, local variables, watch windows, memory, disassembly, and other things one could want to examine while execution is halted.
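To make that concrete, here is a small hypothetical C program (my own example, not anything from the question): halt it at the marked line and each of those windows has something relevant to show, which is precisely why a newcomer can feel buried.

```c
#include <stdio.h>

/* Hypothetical illustration: break on the marked line while factorial(5)
   runs. In Visual Studio the Call Stack window shows main -> factorial ->
   factorial -> ..., one frame per recursive call; the Locals window shows
   the innermost frame's n; the Memory window can show the stack those
   frames occupy; and the Disassembly window shows the generated
   instructions behind each call. */
static unsigned long factorial(unsigned int n) {
    if (n <= 1)
        return 1;                    /* <-- set the breakpoint here */
    return n * factorial(n - 1);
}

int main(void) {
    printf("factorial(5) = %lu\n", factorial(5));
    return 0;
}
```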
The basics of using a debugger could be learned in a week or so, I think. However, mastering what a debugger does would take a lot longer: understanding how much goes on when code is executing, as well as where it is executing, since these days there are multiple places code can run, like GPUs alongside the CPU. I'd question how many people have that kind of drive, even in school.