We don't want to teach "tools" (e.g. IDEs) because they are always changing. There is no point in knowing how to set up an IDE when it will have a different GUI before the student graduates. We teach "unix tools" a bit more because they have not changed much over the last decades.
My goal is to teach "concepts" (which is hard) and use tools as examples of those concepts. Dealing with the specifics of a particular IDE or tool is pointless. We are trying to give students general skills that will be useful for their whole lives, not skills that industry happens to need this year.
That said, I do my best to teach debugging (mainly using gdb and valgrind). The real issue with debugging is that it relies on a lot of experience, which students do not have, by definition.
I learned absolutely nothing from having to use horrid unix tools like gdb in college. The focus on unix tools, in my opinion, seriously degrades the ability of students to learn and unfortunately cements very very bad usability paradigms in their minds.
I learned a ton by debugging things on my Mac using Turbo Pascal, Think C, and even, when absolutely necessary, MacsBug. Those are all obsolete today, but it was well worth learning the skills regardless of the tool.
That said, we were also taught very little about debugging in class. I learned it all on my own (with help from friends). Most of my fellow graduates seriously didn't know how to set a breakpoint and step through code at the end of their college days.
I've heard versions of what you've said hundreds of times and honestly, you academics are missing the point!
Yes, the particular tool you learn will become obsolete, but debugging in Eclipse, gdb, Visual Studio, etc. is all basically the same. The knowledge you get is transferable, just as students have a head start if you teach them C++ and they end up in a Java shop.
Teach them tools that are going to be obsolete! Stop worrying about that.
You have no idea how much time your students are wasting hitting their heads against walls that shouldn't exist trying to debug their crappy code instead of actually working on understanding the underlying concepts. I'd say it's a 9 to 10 ratio of brain-numbing debugging to actual concept implementation.
I really strongly encourage you to set up your students with IDEs. Have a few seminars on how to set up a Visual Studio environment, and how to debug a large codebase. It abstracts a lot of the fluff that stands between them and the underlying concepts. The command line and its tools are great in the right context, but you're making the barrier to entry a lot larger than it needs to be. You can transition to command line tools in more advanced courses, but don't hamper their learning from the get-go.
> I really strongly encourage you to set up your students with IDEs. Have a few seminars on how to set up a Visual Studio environment, and how to debug a large codebase.
You can find lots of information about that using a search engine of your choice. Any student should be able to use Google, so there's no reason to waste teaching resources on it.
For really big programs you can't use valgrind: it causes a significant performance hit, and the system often starts to choke under the memory load. GDB also really alters the timing of things, so it is bad for timing/multithreading bugs.
Mostly what helps is working through the logs and traces, following along in the code as you go through them. Use a needle and a steady hand ;-)
I guess that's what they should practice in school as part of learning the trade. On the other hand, it's a trade-off: school needs to teach the general principles of programming, and with regard to debugging it is hard to distill universally applicable principles; it all depends on what you are doing.
Here is my take:
- unit testing is cool; it lets you catch problems in isolation, so it would be useful to teach as a discipline
- for most GUI programs and application logic, the debugger is really helpful
- traces and log analysis, as mentioned earlier, as the last resort
They should also teach how to formulate the problem, analyse its source, and weigh ways of fixing a defect. Here you have to understand the trade-offs: when to choose a local fix, when a local fix is no longer appropriate, and how to quickly test a fix so that it does not break the system in any way.
If I remember correctly, some of that is taught in software engineering courses.
Valgrind is fine for teaching students. I would probably use something faster and more sophisticated (and probably more expensive) in a professional setting but that doesn't mean there's nothing to learn from using it.
Adding logging also changes timing. Those aren't just NOPs you're throwing in. Also, changing timing isn't always a bad thing. Sometimes you can get a bug from a race condition to happen more often with the debugger.
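To make the "not just NOPs" point concrete, here is a common pattern (a sketch, not from any particular codebase): a logging macro that compiles away entirely when disabled. When enabled, every call formats and writes — real work that can easily reorder the timing of racing threads.

```c
#include <stdio.h>

/* When DEBUG_LOG is defined, each LOG call formats and writes to
 * stderr -- real work that perturbs timing. When it is not defined,
 * the macro expands to nothing and that cost (and the perturbation)
 * vanish from the build. */
#ifdef DEBUG_LOG
#define LOG(...) fprintf(stderr, __VA_ARGS__)
#else
#define LOG(...) ((void)0)
#endif

int tricky(int x) {
    LOG("tricky(%d)\n", x);  /* present only in debug builds */
    return x * 2;
}
```

Which is exactly why a race that reproduces with logging off can vanish the moment you turn it on — the same "heisenbug" effect people blame on the debugger.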
Valgrind can't be fast: it has to track and color each memory location, and it has to check each pointer dereference; its memory overhead for larger footprints can also be considerable.
Once upon a time it was really slow, but then they added just in time compilation.
The tools for debugging have changed, yes. But the high-level process is the same whether you are fixing a NAND gate, figuring out the cause of a segfault, or finding the location of a leak in a water pipe.
There is a problem. Many of the interns and recent CS grads I have worked with or hired have struggled with debugging. On a day to day basis, debugging is one of the most important skills. I would argue it is also an important skill in research settings, not just industry.
I don't know the answer, but maybe somehow integrating debugging into every CS class over the four years would help. Because you are right, it also requires a lot of practice/experience to get good at it so it is unlikely that just adding a class would solve the problem.
That is what the Purdue psychology program (I have a minor) does in its undergraduate program. The first 2-4 weeks of every class is spent on research methods often specific to the particular topic of the class. After taking the 4-5 classes to complete the minor, how to reduce bias in surveys, usability tests, etc was beaten into my brain.
Write lots of code that uses the most sketchy language features available. The chance is high that soon you'll get code that doesn't work. Now go through each line (source code or assembly) using a debugger of your choice and get it to work again.
Do this until there is nothing that will scare you anymore.
Also, do lots of maintenance programming. Finding bugs in code someone else wrote is generally much more difficult (and, in my opinion, more valuable for learning), because you aren't familiar with it.
We aren't asking for vocational training, but for teachers who teach how to think, how to design, how to run experiments, and so on. So many schools (even the 'top' ones) turn out people who cannot do this, and I find it appalling. These things are foundational, they are not vocational. Unfortunately that does in fact mean having to learn some tools which will become obsolete. So what? I did EE labs, and while resistors are the same, none of the digital components really are. I learned how to design NPN junctions, which are now largely replaced with CMOS, but so what? It was the experience that mattered. You can't hand-wave that away ('you' is global, not you mandor), teach the math of junction biases, and expect that anyone will be able to do anything with it once they graduate. But so many CS programs try to do exactly that. I walked out of undergrad knowing how to do recurrence relations, prove the complexity of graph algorithms, and so on, but with almost no clue about how to actually design an algorithm, how to structure and design software, and so on. It took grad school, and some good teaching professors, to change that.
I've rewritten this three times or so - it is hard to address without sounding like I'm attacking you, which I am not. But I am truly dismayed about the skill sets of people graduating from CS programs. I'm not asking for Java vocational training, but for a recognition that 99% of the graduates will be asked to function as "engineers", not "scientists". The people that come out and that function well seem to do so despite the schooling, not because of it.
Incidentally, the best EE teacher, or any teacher, that I ever had, had worked in industry for many years, decided he wanted to teach, and went into teaching. His classes were practical and pragmatic. Oh, you were failing if you didn't master the math and theory behind the material, but he taught you how to design, how to think, how to manipulate all of this book learned stuff to make real things that worked. We had to cost out our projects, write design reports, and so on. Extremely hard courses, but absolutely fantastic, because of, not despite, the focus on what might be called 'pointless' things (who cares what the cost of a transistor was in 1986, after all?!)
Unfortunately, you cannot gain experience, and develop the process, without using tools. It needs to be part of the education. Just look at the post by the poor person you are responding to. Think of how much better his entire education would have been if in some freshman level lab he had been taught some of these basics. I shudder to think how much he probably spent for that education vs what he got.