On 09/02/2018 07:17 PM, Gambler wrote:

> But in general, I believe the statement about the comparative reliability
> of tech from the 1970s is true. What I'm perpetually impressed with is
> all the mainframe software that often runs mission-critical operations in
> places you would least expect.

I suspect it may be because, up until around the '90s, in order to get any code running on a computer at all, you pretty much had to know at least a thing or two about how the machine worked and how to use it. And performance/efficiency issues were REALLY obvious. Not to mention the institutional barriers to entry: not everyone had a computer in their pocket, or even in their den at home.

(Plus the machines themselves tended to be simpler: It's easier to write good code when a single programmer can fully understand every byte of the machine and their code is all that's running.)

In the '50s and '60s in particular, I imagine a much larger percentage of programmers had either a formal engineering background or something equally rigorous.

But now, pretty much anyone can (and often will) cobble together something that more or less runs. That is, there used to be a stronger barrier to entry, and the machines/tools tended to be less tolerant of problems.
