On Aug 30, 2007, at 9:39 PM, Dave McGuire wrote:

> On Aug 30, 2007, at 9:26 PM, Timothy Normand Miller wrote:
>> I'm no longer obsessed with maximizing performance of the machine.
>> Now, I want to maximize my performance as a programmer.
>
> Be very, very careful with that attitude. Back in the 1970s, some
> blithering idiot came up with the idea that programmer time is more
> important than processor time.
But often it is. The idea goes back to the very earliest days of computing, when, despite the objections of purists like von Neumann, programmers began to automate the job of programming. Remember also that in the early days, many codes were "one shot": run the code, publish the result, move on.

I've been in a situation where a crude five-line script got me a critical answer in 10 minutes. I could have spent all day writing a program that would have run in 100 milliseconds, but then a bunch of people would have been wasting their time waiting. I've also been in a situation where spending several hours analyzing data flow in an inner loop allowed me to eliminate a single cycle of pipeline stall, making the difference between a program that could keep up with the data and one that couldn't.

As in most engineering, there isn't a simple answer here: you have to know your requirements. If you *really* need efficiency in a digital process, you shouldn't be using a von Neumann machine at all: custom VLSI is far more efficient. Processors exist for the convenience of their human users.

John Doty
Noqsi Aerospace, Ltd.
http://www.noqsi.com/
[EMAIL PROTECTED]

_______________________________________________
geda-user mailing list
geda-user@moria.seul.org
http://www.seul.org/cgi-bin/mailman/listinfo/geda-user