Friday, October 07, 2005

Did H.G. Wells predict 21st Century Computer Science Students?

Back in 1895 H. G. Wells wrote The Time Machine, a story about a man who invents a time machine, travels far into the future, and what he experiences there. There've been two movies: the 1960 George Pal version and the more recent 2002 version. Despite the fact that Guy Pearce is a relative (on my wife's side), I prefer the original with Rod Taylor; and any director who can bring something like The 7 Faces of Dr. Lao to the screen always has an edge anyway!

But I digress. When the Time Traveller reaches 802701 A.D. he finds that the human race has fragmented into two species: the Eloi and the Morlocks. They are markedly different in appearance, with the Eloi as smaller, more beautiful humans who live above ground, and the Morlocks as grotesque creatures who live in the perpetual darkness below ground. It turns out that the Eloi have evolved past the point where they can understand or use the advanced machinery of their ancestors and rely on the Morlocks to provide them with food; the Morlocks maintain much of the machinery that helps both species, and eat the Eloi. In essence, the Eloi spend all their time playing and appreciating nature and life without knowing why things work the way they do, whereas the Morlocks spend all their time ensuring things work and can't appreciate anything else.

Over the years, the School of Computing here at the University of Newcastle has had a world-class reputation for both its undergraduate and postgraduate degrees. But one of the things I've noticed is a gradual change in the subjects that undergraduates learn compared with when I did my degree back in the mid 1980s. Back then, we learnt programming languages such as Pascal, C, 6502 assembler and 68000 assembler, and did hardware design (e.g., VLSI) as well as operating systems (using Concurrent Euclid, strangely enough) and network programming. For a software course, it covered a lot of depth and breadth: we were taught why things work as well as how they work.

These days, with the advent of languages such as Java, GUIs and even the Web, students are taught at a much higher level, with little or no experience of hardware or operating system principles/architecture. (Note, I've reason to believe that this is not purely a local phenomenon.) That's because industry needs a new set of skills. However, if you ever get a chance to talk to successful graduates these days, there's a definite lack of understanding about why things work at any level below the virtual machine. Now I'm not saying that everything I was taught all those years ago is still useful to me today, but it gave me (and others) an appreciation of so many different aspects that something from left field often turns out to be surprisingly useful. I realise there's a trade-off to be made between time and subjects (there are a lot more topics in computer science today than there were 20 years ago), but I wonder: are we breeding a race of Eloi?

6 comments:

Thomas Rischbeck said...

Hi Mark,

Good to read your blog again! I've a comment on your latest entry:

I went through the same school as you did, it seems. In Germany they call this "technical foundations of computer science". We did micro-programming of a CPU -- which is a level below even assembler.
While I must say that it is satisfying to see how things work down to the electron level, given the huge number of abstraction levels we see nowadays, I do wonder how relevant this still is. (And even at this level we're dealing with abstractions!)

I think we've reached a stage where the low-level stuff has become very stable. Because of the huge stack of abstractions, it's also impracticable for anybody to have the total overview. People are more and more forced to become experts in narrower areas. And eventually, I do think that the people who take the decisions and know what's going on sit at the top of this stack.

Anonymous said...

Great post. I would just add that newly minted software folks don't know how the virtual machine works either, it being much closer to computer architecture and machine instructions...

I think Java is great, but I'd be much happier if undergrads spent four years with C++, system calls, and some assembler.

Greg

Mark Little said...

Hi Thomas. To a degree I agree: not everything at the lower levels of CS is important as you progress up the stack (e.g., do I really need to know that an integer is 4 bytes on this architecture if I'm building a GUI?). However, a basic understanding of the *what* and the *why* should form the building blocks for students IMO. Then, depending on where they want to specialise, different areas may be dropped or expanded as the student progresses. From what I've seen and heard, that doesn't happen, which is taking us to the other extreme.

Mark Little said...

Greg, you're right: I forgot about that! Just taking the VM for a second, one of the things that got me thinking about this again the other day was a student talking about throwing more threads at a problem to increase performance. When I asked about context switch times, or whether the app was intended for a multi-processor, it drew a blank stare. Now, that could have been the specific student, but ...
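To make that concrete, here's a rough sketch of the experiment I'd point a student at (nothing to do with their actual code; the names and numbers are made up for illustration): split a purely CPU-bound loop across a fixed number of POSIX threads and time it for different thread counts. On a single processor, anything beyond one thread buys nothing and just adds context-switch overhead.

/* Toy illustration only: NTHREADS and TOTAL_WORK are invented numbers.
 * Build with something like: cc -O2 threads.c -o threads -lpthread
 * and run it under time(1) with different NTHREADS values. */
#include <pthread.h>
#include <stdio.h>

#define NTHREADS   4            /* try 1, 2, 8, 64, ... */
#define TOTAL_WORK 400000000L   /* total iterations, shared across threads */

static void *worker(void *arg)
{
    long iters = *(long *)arg;
    volatile long sum = 0;      /* volatile so the loop isn't optimised away */
    long i;
    for (i = 0; i < iters; i++)
        sum += i;
    return NULL;
}

int main(void)
{
    pthread_t tid[NTHREADS];
    long share = TOTAL_WORK / NTHREADS;
    int i;

    for (i = 0; i < NTHREADS; i++)
        pthread_create(&tid[i], NULL, worker, &share);
    for (i = 0; i < NTHREADS; i++)
        pthread_join(tid[i], NULL);

    printf("%d threads, %ld iterations each\n", NTHREADS, share);
    return 0;
}

Comparing the wall-clock times for 1 thread versus 8 on a uniprocessor box makes the point far better than I could in words.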

Mark Little said...

BTW, not that I'd advocate this to everyone, but one of the best things I ever did that's stuck with me over the years was implementing a thread package in C back in the early '90s. Having to use setjmp and longjmp really hammers home some of the basics of what's going on these days in threaded environments and always makes me think twice before "throwing threads at the problem" ;-)
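For anyone curious, the primitive involved is just non-local transfer of control. This isn't my old package (that's long gone) -- just a toy sketch of setjmp saving an execution context and longjmp jumping back to it. A real user-level thread package also has to give each thread its own stack, which needs platform-specific trickery well beyond what's shown here.

/* Toy sketch only: setjmp records a context, longjmp transfers control
 * back to it (always "upward" here, which is well-defined C). A real
 * thread package built on these calls also needs a separate stack per
 * thread; none of that is shown. */
#include <setjmp.h>
#include <stdio.h>

static jmp_buf scheduler_ctx;           /* the "scheduler" context */

static void task(int id)
{
    printf("task %d: running, now yielding\n", id);
    longjmp(scheduler_ctx, id + 1);     /* "yield": jump back into main's setjmp */
}

int main(void)
{
    volatile int next = 0;              /* volatile: value survives the longjmp */
    volatile int last = -1;             /* which task last yielded */

    if (setjmp(scheduler_ctx) != 0)     /* non-zero means a task just yielded */
        printf("scheduler: task %d yielded control\n", (int)last);

    if (next < 3) {
        last = next;
        next = next + 1;
        task(last);                     /* "dispatch" the next task; it longjmps back */
    }

    printf("scheduler: all tasks done\n");
    return 0;
}

Even in this stripped-down form you can see why volatile locals, saved contexts and the cost of switching between them matter -- which is exactly the stuff that gets hidden once you only ever see threads through a VM.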

Anonymous said...

Mark,

I agree with most of what you are saying and believe part of the problem lies with courses not having a clear direction.

What I mean by this is that many courses (mentioning no names...) are not clear on the type of graduates they are trying to produce. For instance, are they aiming at producing people suitable for research, people suitable for large organisations, or just people with a good grounding in the basics of CS? I don't think that these need be mutually exclusive, but I suspect there needs to be an emphasis on one of them.

When I did my undergraduate course it was still pretty technical, low-level and theoretical (C++, assembler, algorithms, ...) but the course has become higher-level since then. This is, at least in part, in response to industry. I know that people from large organisations have claimed that CS graduates do not have the skills that they require. So, as courses are measured on graduate recruitment, the course is changed to suit the employers. This has pros and cons of course, but I feel that it benefits the employers most and the students least. Sure, they learn about Ant, how to use an IDE, etc., but they have less of a solid grounding in the basics, which are more important. It's a long-term loss for a short-term gain.