[Default] On 14 Aug 2019 15:50:58 -0700, in bit.listserv.ibm-main
jesse1.robin...@sce.com (Jesse 1 Robinson) wrote:

>It's a modern-day cottage industry--or hobby, maybe--to excoriate our foremoms 
>and foredads for the reckless choice they made decades ago to store dates in 
>two-digit format, making our lives miserable in the process. OTOH I remember 
>reading some diary excerpts from US Civil War soldiers who routinely recorded 
>dates as 61, 63, and so on. 
>
>Suppose you were an influencer in, say, 1975. Would you walk into an 
>application design meeting and propose *any* change to date representation? 
>There were already countless date fields stored in other intersecting 
>applications. Those would have to change also or be interfaced only with 
>conversion routines. 
>
>I think the only point in IT history where a radical change actually made 
>sense was just when it happened: right ahead of the wrecking ball. And rather 
>than saturate the landscape with a gajillion *useless* year digits, many 
>companies were content to implement sliding windows that permanently solved 
>the problem with minimal extra storage space.  
>
As someone involved in a Year 2000 project, where I both validated the
windows used by one software vendor and wrote the date subroutine used
for 4-digit-year date handling, I can guarantee that sliding or fixed
windows do not necessarily solve the problem for all time.  A sliding
window may well work going forward, depending on the data, and if it is
keyed against the current 4-digit year it may even handle year 00.
However, a window only works where the range of dates in the active
data is 100 years or less; it would fail on the birth dates of all of
the centenarians among us, including a retired pastor of my church.
Sliding windows can also be dicey when dealing with historic data.  In
general, each application should be reviewed to see whether long-term
use of 2-digit years can lead to wrong results regardless of the
windowing scheme chosen.
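
To make the failure concrete, here is a rough sketch in Python (not the
actual mainframe code, and the 60-years-back pivot is only an
illustration) of how a sliding window mis-assigns a centenarian's birth
year:

  from datetime import date

  def expand_two_digit_year(yy, window_back=60):
      # Expand a 2-digit year using a sliding window pivoted on the
      # current year.  The window spans the 100 years from
      # (current - window_back) through (current - window_back + 99);
      # any date outside that span silently gets the wrong century.
      current = date.today().year
      low = current - window_back       # earliest year the window can express
      century = low - (low % 100)       # century of the window's low edge
      year = century + yy
      if year < low:                    # fell below the window: bump a century
          year += 100
      return year

  # In 2019 the window covers 1959-2058, so a pastor born in 1917
  # (recorded as yy = 17) comes back in the wrong century:
  print(expand_two_digit_year(17))      # prints 2017, not 1917

No choice of pivot fixes this; once the active data spans more than 100
years, some dates must land in the wrong century.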

Clark Morris 
>.
>.
>J.O.Skip Robinson
>Southern California Edison Company
>Electric Dragon Team Paddler 
>SHARE MVS Program Co-Manager
>323-715-0595 Mobile
>626-543-6132 Office <=== NEW
>robin...@sce.com
>
>-----Original Message-----
>From: IBM Mainframe Discussion List <IBM-MAIN@LISTSERV.UA.EDU> On Behalf Of 
>Clark Morris
>Sent: Wednesday, August 14, 2019 3:29 PM
>To: IBM-MAIN@LISTSERV.UA.EDU
>Subject: (External):Reason for 2 digit years was Re: Instruction speeds
>
>[Default] On 14 Aug 2019 10:21:17 -0700, in bit.listserv.ibm-main 
>sme...@gmu.edu (Seymour J Metz) wrote:
>
>>There were other options to reduce the storage requirement of a date, e.g., 
>>storing it in binary.
>>
>The conversion to and from binary would have been costly in CPU time, and for 
>dates stored as packed decimal 0yymmdds, using the high-order nibble would have 
>worked at the cost of some complexity.  I suspect that the real saving was in 
>data entry and in the desire to fit as much information as possible onto one 
>80-byte punch card as well as onto a 132-character print line.  I note that my 
>credit cards still use 2-digit years.
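>
>Roughly, the nibble trick would have looked like this (a sketch only, in
>Python for readability; the 0 -> 19yy, 1 -> 20yy convention is one
>plausible choice, not a standard, and actual layouts varied by shop):
>
>  def unpack_date(nibbles):
>      # nibbles of a packed 0yymmdds field, high-order nibble first;
>      # the normally-zero lead nibble doubles as a century flag
>      century_flag, y1, y2, m1, m2, d1, d2, sign = nibbles
>      return (1900 + 100 * century_flag + y1 * 10 + y2,
>              m1 * 10 + m2, d1 * 10 + d2)
>
>  print(unpack_date([0, 7, 5, 0, 3, 1, 4, 0xC]))  # (1975, 3, 14)
>  print(unpack_date([1, 0, 5, 0, 3, 1, 4, 0xC]))  # (2005, 3, 14)
>
>The cost in complexity is that every program touching the field must
>agree on what the flag means.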
>
>Clark Morris  
>>
>>--
>>Shmuel (Seymour J.) Metz
>>http://mason.gmu.edu/~smetz3
>>
>>________________________________________
>>From: IBM Mainframe Discussion List <IBM-MAIN@LISTSERV.UA.EDU> on 
>>behalf of Jesse 1 Robinson <jesse1.robin...@sce.com>
>>Sent: Wednesday, August 14, 2019 12:10 PM
>>To: IBM-MAIN@LISTSERV.UA.EDU
>>Subject: Re: Instruction speeds
>>
>>A couple of observations on Y2K accommodation.
>>
>>-- As my shop was slogging through remediation required for year 2000, 
>>insurance companies apparently coasted along because they had ALWAYS needed 
>>to handle four-digit years from the inception of IT. For them it was business 
>>as usual.
>>
>>-- I can't cite a source, but I remember the calculation that despite our 
>>poignant late-1990s misery, the ancient choice to represent dates with two 
>>digits was actually economically correct. The burdensome cost of both media 
>>and memory storage in, say, 1970 meant that the savings outweighed, on 
>>balance, the eventual cost of remediation. It's easy to ask what difference 
>>two bytes would have made, but the hard-money cost of billions and billions 
>>of 'extra' bytes would have been substantial.
>>
>>.
>>.
>>J.O.Skip Robinson
>>Southern California Edison Company
>>Electric Dragon Team Paddler
>>SHARE MVS Program Co-Manager
>>323-715-0595 Mobile
>>626-543-6132 Office <=== NEW
>>robin...@sce.com
>>
>>-----Original Message-----
>>From: IBM Mainframe Discussion List <IBM-MAIN@LISTSERV.UA.EDU> On 
>>Behalf Of Seymour J Metz
>>Sent: Wednesday, August 14, 2019 7:49 AM
>>To: IBM-MAIN@LISTSERV.UA.EDU
>>Subject: (External):Re: Instruction speeds
>>
>>> That assumes that you know what is unnecessary. The smart money says that 
>>> the unnecessary code will turn out to be necessary, at the least convenient 
>>> time.
>>
>>> A nice example is how to determine leap years: for as long as I have 
>>> programmed, the flow is:
>>>- divisible by 4?
>>>- divisible by 100?
>>>- divisible by 400?
>>The last 2 are completely unnecessary until the year 2100.
>>
>>And in the year 2100 people will curse you for deciding that it's unnecessary.
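>>
>>For the record, the full test is cheap; in rough Python form (a sketch,
>>not anyone's production code):
>>
>>  def is_leap(year):
>>      # divisible by 4, except century years, except every 4th century
>>      return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
>>
>>  # Dropping the last two tests goes wrong precisely at 2100, 2200, and
>>  # 2300 (not leap), while 2000 and 2400 (leap) come out right by luck:
>>  print(is_leap(2000), is_leap(2019), is_leap(2100))  # True False False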
>>
>>Après moi le déluge ("after me, the flood"; après nous le déluge for purists).
>

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN
