Ed Gould wrote:

>Just remember in the olden days that bytes cost money (either in storage or
>in memory).

>I actually had to work on a software set of programs (system) that stripped
>the sign off of all dates and money (they were assumed to be positive).
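For anyone who never met packed decimal: on S/360-family machines a decimal number carries its sign in the low nibble of the last byte (0xC for +, 0xD for -). Dropping that nibble is exactly the kind of byte-pinching Ed describes, and it makes every value read back as positive. A minimal sketch (my own illustration, not Ed's actual code):

```python
# Sketch of S/360-style packed decimal, where the trailing nibble holds
# the sign (0xC = +, 0xD = -). Reading the digits while ignoring that
# nibble "strips the sign": everything comes back positive.

def pack_signed(n):
    """Pack an integer as packed-decimal bytes with a trailing sign nibble."""
    digits = str(abs(n))
    sign = 0xD if n < 0 else 0xC
    nibbles = [int(d) for d in digits] + [sign]
    if len(nibbles) % 2:
        nibbles.insert(0, 0)  # pad to a whole number of bytes
    return bytes((nibbles[i] << 4) | nibbles[i + 1]
                 for i in range(0, len(nibbles), 2))

def unpack_unsigned(b):
    """Unpack digit nibbles only, discarding the sign nibble."""
    digits = [d for byte in b for d in (byte >> 4, byte & 0xF) if d <= 9]
    return int("".join(map(str, digits)))

assert pack_signed(-125) == bytes([0x12, 0x5D])
assert unpack_unsigned(pack_signed(-125)) == 125  # the sign is lost
```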

 

Right, S/360 memory was a buck a byte (at some point in its life). Much
bigger bucks than today, too! So that led to the mode of thinking that said
"We can use 2-byte years" (and even single-byte, in some cases--yuck). I
expect that every Real Programmer who ever coded such a thing felt a twinge,
however; most probably said, "I'll be retired by then".
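The two-byte-year scheme and its usual band-aid, a sliding window, can be sketched in a few lines. The pivot value 50 below is a hypothetical choice for illustration, not anything from the post:

```python
# Sketch of the two-digit ("2-byte") year scheme and a windowing fix.
# PIVOT = 50 is a hypothetical cutoff: stored years below it are read
# as 20xx, the rest as 19xx.

PIVOT = 50

def pack_year(year):
    """Store only the last two digits, as the old programs did."""
    return year % 100

def window_year(yy):
    """Interpret a stored two-digit year via a fixed 1950-2049 window."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

# Packing loses the century: 1999 and 2099 both become 99.
assert pack_year(1999) == pack_year(2099) == 99

# Windowing recovers anything in 1950-2049, but fails outside it:
assert window_year(pack_year(1999)) == 1999
assert window_year(pack_year(2001)) == 2001
assert window_year(pack_year(2075)) == 1975  # wrong: outside the window
```

Which is why windowing was only ever a reprieve, not a fix.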

 

I was somewhat surprised that it took so long for Y2K to rise to the top,
with 30-year mortgages and such forward-looking things being far from
uncommon. But I guess those systems all got fixed one at a time as their
windows passed 2000, without enough collective awareness for anyone to say
"Hey, we need to fix this globally, now that storage (both memory and DASD)
is cheap!"

 

Another argument suggests that leaving the problem to the end was good
management: as the "crisis" neared, tools and skills were developed, plus
some applications had gone by the wayside that might have been "fixed" for
naught. So maybe it wasn't ignorance or willful neglect, but rather
deliberate. 

 

(Of course, how many companies have management that smart? [Oops, did I type
that out loud?])

-- 

...phsiii

 


----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
