Faramir wrote: > Well... just an example: some time ago, the Open Document Format
The ODF-OOXML debate really has very little to do with date and time standards. If there were an obviously correct way of doing things, both document formats would support it.

The problem tends to be this: how do you define "time", and how ought it to be incremented? Ask a person in the street how long a year is and they'll say 365 days. If they're bright, they'll say 365 and a quarter. But the reality is that leap years occur in years evenly divisible by four, _except_ for century years (those divisible by 100), which are leap years only if they're also divisible by 400. (No, I'm not kidding. This is why 2000 was a leap year, but 1900 wasn't.) And then we get into the question of leap seconds. Where should they be placed? How should they be accounted for?

That's not even addressing questions like how to design a date standard that caters to our Gregorian calendar but can also handle the Jewish and Islamic calendars, which are defined not in terms of absolute units of time but in terms of astronomical events. In the Gregorian calendar, for example, it's pretty easy to tell whether a date falls on the weekend. In the Jewish calendar, the Sabbath begins at sundown on what the Gregorian calendar would call Friday and continues until the appearance of three stars in the sky on Saturday night (!). Hence, day boundaries in the Jewish calendar depend not only on your latitude and the season, but also on local weather conditions and light pollution.

(Anyone who says "... well, yeah, but that's an obviously crazy calendar standard, so we shouldn't care about it" will be roundly thwacked. Given how crazy the Gregorian calendar has occasionally been, including downright _missing a couple of weeks_ once, it does not exactly have much room to criticize.)

=====

On top of that, there are technical issues. If you're just tracking seconds since an arbitrary point in time, how do you increment this clock to adjust for leap seconds?
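For the curious, the century rule is short enough to write down directly. A quick Python sketch (my illustration, not anything from a standard library):

```python
# Sketch of the Gregorian leap-year rule: every fourth year is a leap
# year, except century years (divisible by 100), which are leap years
# only when also divisible by 400.

def is_leap_year(year: int) -> bool:
    """Return True if `year` is a leap year in the Gregorian calendar."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_year(2000))  # True  -- century year divisible by 400
print(is_leap_year(1900))  # False -- century year not divisible by 400
print(is_leap_year(2024))  # True  -- ordinary fourth year
```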
Do you actually increment the clock, or do you make a note somewhere saying "the actual time is now offset by a leap second; the amount of time since the Epoch hasn't really changed, though"?

And what range of values can the since-Epoch value hold? Most UNIXes store it as a 32-bit signed integer, which overflows on January 19, 2038, at which point we're going to see a lot of legacy applications crash. We could switch it to a 64-bit value, but this is kind of contentious for various reasons (mostly, IMO, personal prejudice masquerading as technical objections).

And what about applications that need to keep rigorous track of time? The UNIX seconds-since-Epoch date/time format is pretty poorly suited to our modern environment, in which GPS satellites need nanosecond accuracy and relativistic effects have to be considered for essentially all satellite communications.

=====

Seconds since Epoch is just a bad date/time format; there's no two ways about it. But then again, _all_ the date/time formats are bad. What seconds-since-Epoch has going for it is that it's dead simple and everyone understands it; those two strengths mean it's not going away anytime soon.

... And on this note, I'm going to stop rambling on this increasingly off-topic subject. Hopefully this is a good overview of why programmers hate all the date/time formats out there, and just how tough it is to do it right. :)

_______________________________________________
Gnupg-users mailing list
Gnupg-users@gnupg.org
http://lists.gnupg.org/mailman/listinfo/gnupg-users