gdb being a utility, I believe students are generally expected to learn it on their own if they like. Much like version control and other tools that make a coder's life easier but aren't actually part of "Computer Science".
Yes, but I don't even remember being told it existed. My intro-to-C professor made sure we knew that Vi and Emacs existed, and gave us enough information to be interested in learning more; why not gdb? My guess is he probably didn't use it himself.
Yeah, we didn't touch any debuggers; I think to an extent they were more worried about people learning some language basics. That's the downside of a first-year C/C++ course that wasn't just for comp sci students.
The other thing our uni was bad at: they encouraged us to use vim on the uni servers early on but never gave us a proper overview or the resources to really learn it. As a result, most saw it as a handicapped text editor that just made things harder.
A course on learning C is not a course in debugging or in any particular debugger. When you are just learning the language, printf is probably the best way to do it.
On the other hand, a follow-up course on actual debugging is probably warranted. Debugging might not be part of the Science bit of Computer Science, but it is on the practical end of the Computer bit, which you'll probably run into while doing the Science bit.
Our version of the course was adapted from the CMU course: http://csapp.cs.cmu.edu/ I thought it was an excellent course. It taught students both concepts and skills that I had picked up in a much more ad-hoc manner.