Gavin Sherry <[EMAIL PROTECTED]> writes:
> What to do? Well, as far as I can tell, there are no work arounds
This was discussed a few months ago and set aside because no one had a really decent solution at the time.

The behavior is not really all that different from the discontinuities that occur around a daylight-saving transition, but people are used to those because (a) they happen every year, and (b) the bizarreness only lasts an hour and doesn't (with most DST rules) affect local midnight. The discontinuities in apparent local time at the ends of the 32-bit time_t interval are larger and harder to miss, especially for those of you half a world away from Greenwich.

The best thing I have been able to think of is to eliminate these discontinuities by changing our existing definition that says "all times outside the time_t interval (1901 to 2038 at present) are taken as GMT". We could instead define times before the interval as having the same local time offset as prevailed at the start of the interval, and likewise define times after the interval as having the latest time offset we can determine within the interval. Then there is no DST-like discontinuity in local time at either end of the interval.

This might be too big a change in behavior, though. Also, if there's anyone whose local timezone database starts in DST mode, it might seem odd for all times before 1901 to look like DST rather than local standard time. Thoughts?

Note that this isn't directly connected to the idea of eliminating our dependence on the standard libc timezone routines. If we rolled our own, we'd still need to define what the behavior is outside the range of dates for which we have timezone database entries. But I suspect we'd settle on something more nearly like the above than like the existing behavior...

			regards, tom lane
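For concreteness, here is a minimal C sketch of the clamping rule described above. It is not the actual PostgreSQL code: the PG_TIME_T_MIN/PG_TIME_T_MAX constants and the clamped_gmtoff() helper are made up for illustration, it assumes a 32-bit signed time_t range, and it relies on the BSD/glibc tm_gmtoff extension to struct tm.

/*
 * Illustrative sketch only, not actual PostgreSQL code: clamp the
 * timezone lookup to the nearest end of the 32-bit time_t interval,
 * so that out-of-range times inherit that boundary's GMT offset
 * instead of being treated as GMT.
 */
#include <limits.h>
#include <time.h>

#define PG_TIME_T_MIN	((time_t) INT_MIN)	/* ~ Dec 1901 */
#define PG_TIME_T_MAX	((time_t) INT_MAX)	/* ~ Jan 2038 */

/*
 * Return the GMT offset (seconds east of Greenwich) to use for "t",
 * a seconds-since-epoch value that may lie outside the time_t range.
 */
static long
clamped_gmtoff(long long t)
{
	time_t		probe;
	struct tm	tm;

	if (t < (long long) PG_TIME_T_MIN)
		probe = PG_TIME_T_MIN;		/* earliest offset we can look up */
	else if (t > (long long) PG_TIME_T_MAX)
		probe = PG_TIME_T_MAX;		/* latest offset we can look up */
	else
		probe = (time_t) t;

	if (localtime_r(&probe, &tm) == NULL)
		return 0;					/* fall back to GMT on failure */
	return tm.tm_gmtoff;			/* BSD/glibc extension */
}

Under this rule a timestamp far before 1901 or far after 2038 gets the earliest or latest offset we can actually look up, including DST if that happens to be in effect at the boundary, which is the caveat mentioned above.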