At 06:42 PM 9/30/2002 -0700, Eric Hattemer wrote:
A lot of other universities have taken Java and objects as the greatest
new thing ever.  They preach these, and consider C to be old and
obsolete.


This probably has a lot to do with the problem - Java is used almost exclusively outside of the OS courses at UH.

OOP and procedural programming are different paradigms, and when you're just getting started it's confusing to have to switch between them. Also, UH is typical of CS programs in that it really isn't set up to teach you practical skills, but to make sure you're familiar with a lot of theories and concepts. It's hard to learn to program at a university, at least in class, at least not the sort of programming that people will offer to pay you money to do. It's more likely that you'll learn on the job. Or you could take some summer time (or do an independent study?) working on a real open source project. Though even that is fairly limited if you don't get much F2F time with other programmers. But at least you'll be working on code that isn't all yours.

To be fair, I don't think anyone is saying C is completely obsolete, are they? I'd say they just want to foil the imperialistic tendencies of a language that was designed for, and is excellent for, OS development, but has no particular business being the standard implementation language for application development. Or are they really suggesting that C++ should be used for OS development too? Are they going to write Java VMs in Java and cross-compile them somehow? Or beg the hardware designers to put a VM on a chip for us?

Didactic Dave
