I wonder whether Alan Kay is the author of an article I read during the late 
'70s in PC Computing; the name sounds familiar.  I made copies and kept them 
for a while, but I've lost track of them now.  In that article the writer spoke 
of teaching students who came knowing little about computer programming but 
strong and certain on one point: that computer people were rigorously 
logical.  Then he had to disillusion them.

He mentioned their dismay at learning of the existence of TWO character-encoding 
schemes, ASCII and EBCDIC, which sort the same data into different orders.  Another 
of his points that sticks in 
my memory is a description of a chip that used 5 bits for instructions and had 
32 different instructions.  "Maybe it isn't immediately obvious, but that's an 
error in design.  It means that every possible bit combination is a valid 
instruction, so when your program has an error, the probability of it ending 
anywhere near the error point is nil."
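
For anyone curious, the first point is easy to demonstrate.  Here's a quick 
Python sketch (mine, not from the article), using the cp037 codec as a 
stand-in for EBCDIC, that sorts the same strings by their raw byte values 
under each encoding:

    words = ["Baker", "able", "123", "CHARLIE"]

    # Sort by the raw byte values of each string in the given encoding.
    ascii_order  = sorted(words, key=lambda s: s.encode("ascii"))
    ebcdic_order = sorted(words, key=lambda s: s.encode("cp037"))  # cp037 = EBCDIC (US/Canada)

    print(ascii_order)   # ['123', 'Baker', 'CHARLIE', 'able']  (digits, then upper, then lower)
    print(ebcdic_order)  # ['able', 'Baker', 'CHARLIE', '123']  (lower, then upper, then digits)

As for the 5-bit chip: 2**5 is exactly 32, so if all 32 encodings are defined 
there is no "illegal instruction" left over to trap on, and a runaway program 
just keeps executing garbage instead of stopping anywhere near the bug.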

I've often bewailed the loss of that article; many of its observations are 
still relevant today.

---
Bob Bridges, robhbrid...@gmail.com, cell 336 382-7313

/* I know everyone thinks Republicans aren't funny. But if you get a bunch of 
us together, we can be a real riot.  -Nancy Mace at Washington Press Club */

________________________________________
From: IBM Mainframe Discussion List <IBM-MAIN@LISTSERV.UA.EDU> on behalf of 
Rupert Reynolds <rreyno...@cix.co.uk>
Sent: Saturday, March 16, 2024 2:39 PM

Alan Kay described computing as a 'pop culture' and I see his point--we don't 
often learn from history and we often re-invent things and 'solve' problems 
that have already been solved before :-)

