On 3/10/2010 8:21 PM, Russel Winder wrote:
On Sat, 2010-10-02 at 21:21 +0200, Don wrote:
retard wrote:
[ . . . ]
I meant that computers become more efficient. I've upgraded my system twice
since this discussion last appeared here. If you wait 18 months, the
20 seconds becomes 10 seconds; in 36 months, 5 seconds. That's Moore's
law, you know.

Sadly, software seems to be bloating at a rate which is faster than
Moore's law. Then again, part of my original post noted that it is still
faster than the time my old 1MHz Commodore 64 took to load my development
environment from a cassette tape! So I still take it as a good sign that
the rate of bloating is slower than Moore's law.

Faster processor speeds in the period 1950--2005 actually had nothing to
do with Moore's Law per se -- Moore's Law is about the number of
transistors per chip, not the speed at which those transistors operate.

Since around 2005, processor speeds have stopped increasing because of the
inability to deal with the heat generated.  Instead Moore's Law (which
for the moment still applies) is leading to more and more cores per chip,
all running at about the same speed as before -- around 2GHz.

So the ability to improve the performance of code just by waiting and buying
new kit is over -- at least for now.  If you do not turn your serial
code into parallel code, there will be no mechanism for improving the
performance of that code.  A bit sad for inherently serial algorithms.
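As a rough illustration (a minimal Python sketch -- the workload, sizes and
function names here are made up, not taken from anyone's post), the parallel
version below only pays off because each piece of work is independent of the
others:

    import math
    import time
    from concurrent.futures import ProcessPoolExecutor

    def cost(n):
        # A deliberately heavy, self-contained piece of work (made-up workload).
        return sum(math.sqrt(i) for i in range(n))

    def run_serial(jobs):
        # One core does everything; the other cores on the chip sit idle.
        return [cost(n) for n in jobs]

    def run_parallel(jobs):
        # The same work spread over the available cores.  This only works
        # because each call to cost() needs nothing from the other calls.
        with ProcessPoolExecutor() as pool:
            return list(pool.map(cost, jobs))

    if __name__ == "__main__":
        jobs = [2_000_000] * 8
        for name, fn in (("serial", run_serial), ("parallel", run_parallel)):
            start = time.perf_counter()
            fn(jobs)
            print(f"{name}: {time.perf_counter() - start:.2f}s")

An algorithm where each step needs the result of the previous one cannot be
split up like this, which is why inherently serial code sees no benefit from
the extra cores.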


And yes, my observation is that it is not often possible to buy new
kit (aka a PC) with more MIPS and more chips (CPUs) that runs the O/S and
applications faster than the veteran unit, perhaps apart from graphics
acceleration.

Why is it that Moore's Law does not seem to make for a better user
experience as time goes by?

Conspiracy theory: Seems to me that there is a middle man on the take
all the time. :-)

Cheers
Justin Johansson

