On Wed, 14 Aug 2019 17:21:06 +0000, Seymour J Metz wrote:

>There were other options to reduce the storage requirement of a date, e.g., 
>store them in binary.
> 
In some cases, years have been stored as two-byte signed packed decimal,
biased by -1900 (so 1999 is stored as 099), supporting years through 2899
with minimal code change.
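A minimal Python sketch of that scheme, for illustration only: three BCD
digits plus a sign nibble in two bytes, holding year - 1900. The function
names and the choice of 0xC as the positive-sign nibble (the common EBCDIC
convention) are my assumptions, not anything from the original product.

```python
def encode_year(year: int) -> bytes:
    """Encode a year as 2-byte signed packed decimal, biased by 1900.

    Layout (nibbles): d1 d2 | d3 sign, where d1d2d3 = year - 1900
    and sign 0xC marks a positive value (assumed EBCDIC convention).
    """
    n = year - 1900
    if not 0 <= n <= 999:
        raise ValueError("year outside 1900..2899")
    d = f"{n:03d}"  # three decimal digits, zero-padded
    return bytes([(int(d[0]) << 4) | int(d[1]),
                  (int(d[2]) << 4) | 0xC])

def decode_year(b: bytes) -> int:
    """Reverse of encode_year: unpack three BCD digits, add the bias."""
    n = (b[0] >> 4) * 100 + (b[0] & 0xF) * 10 + (b[1] >> 4)
    return 1900 + n
```

With this layout 1999 packs as 0x09 0x9C, and a program that formerly
held only the low two digits needs little more than the wider range check.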

>________________________________________
>From: Jesse 1 Robinson 
>Sent: Wednesday, August 14, 2019 12:10 PM
>
>Can't cite attribution, but I remember the calculation that despite our
>late 1990s poignant misery, the ancient choice to represent dates with two 
>digits was actually economically correct. The burdensome cost of both media 
>and memory storage in, say, 1970, outweighed on balance the eventual cost of 
>remediation. It's easy to ask what difference two bytes would have made, but 
>the hard-money cost of billions and billions of 'extra' bytes would have been 
>substantial.
> 
Me, too.  However, I doubt that many organizations invested those savings
in reliable securities to be redeemed against the eventual cost of remediation.

In the mid-80s, I suggested that a product we were designing store 4-digit
years in anticipation of Y2K.  Overruled by my manager:
o No then-available OS interface reported 4-digit years.
o No need; who cares?!

-- gil

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
