Re: [Python-ideas] High Precision datetime

2018-05-17 Thread Nathaniel Smith
On Thu, May 17, 2018 at 9:49 AM, Chris Barker via Python-ideas
 wrote:
> On Tue, May 15, 2018 at 11:21 AM, Rob Speer  wrote:
>>
>>
>> I'm sure that the issue of "what do you call the leap second itself" is
>> not the problem that Chris Barker is referring to. The problem with leap
>> seconds is that they create unpredictable differences between UTC and real
>> elapsed time.
>>
>> You can represent a timedelta of exactly 10^8 seconds, but if you add it
>> to the current time, what should you get? What UTC time will it be in 10^8
>> real-time seconds? You don't know, and neither does anybody else, because
>> you don't know how many leap seconds will occur in that time.
>
>
> indeed -- even if you only care about the past, where you *could* know the
> leap seconds -- they are, by their very nature, of second precision -- which
> means right before a leap second occurs, your "time" could be off by up to a
> second (or a half second?)

Not really. There are multiple time standards in use. Atomic clocks
count the duration of time – from their point of view, every second is
the same (modulo relativistic effects). TAI is the international
standard based on using atomic clocks to count seconds since a fixed
starting point, at mean sea level on Earth.

Another approach is to declare that each day (defined as "the time
between the sun passing directly overhead the Greenwich Observatory
twice") is 24 * 60 * 60 seconds long. This is what UT1 does. The
downside is that since the earth's rotation varies over time, this
means that the duration of a UT1 second varies from day to day in ways
that are hard to estimate precisely.

UTC is defined as a hybrid of these two approaches: it uses the same
seconds as TAI, but every once in a while we add or remove a leap
second to keep it roughly aligned with UT1. This is the time standard
that computers use the vast majority of the time. Importantly, since
we only ever add or remove an integer number of seconds, and only at
the boundary in between seconds, UTC is defined just as precisely as
TAI.

So if you're trying to measure time using UT1 then yeah, your computer
clock is wrong all the time by up to 0.9 seconds, and we don't even
know what UT1 is more precisely than ~milliseconds. Generally it gets
slightly more accurate just after a leap second, but it's not very
precise either before or after. Which is why no-one does this.

But if you're trying to measure time using UTC, then computers with
the appropriate setup (e.g. at CERN, or in HFT data centers) routinely
have clocks accurate to <1 microsecond, and leap seconds don't affect
that at all.

The datetime module still isn't appropriate for doing precise
calculations over periods long enough to include a leap second though,
e.g. Python simply doesn't know how many seconds passed between two
arbitrary UTC timestamps, even if they were in the past.
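
For concreteness, that failure is easy to see -- datetime arithmetic treats
every UTC day as exactly 86400 seconds, so the 2016-12-31 leap second simply
isn't counted:

>>> from datetime import datetime, timezone
>>> t1 = datetime(2016, 12, 31, 23, 59, 59, tzinfo=timezone.utc)
>>> t2 = datetime(2017, 1, 1, tzinfo=timezone.utc)
>>> (t2 - t1).total_seconds()   # 2 SI seconds really elapsed (23:59:60 existed)
1.0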

-n

-- 
Nathaniel J. Smith -- https://vorpus.org
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-17 Thread Alexander Belopolsky
On Thu, May 17, 2018 at 7:12 PM Wes Turner  wrote:

> AstroPy solves for leap seconds [1][2] according to the IAU ERFA (SOFA)
> library [3] and the IERS-B and IERS-A tables [4]. IERS-B tables ship with
> AstroPy. The latest IERS-A tables ("from 1973 though one year into the
> future") auto-download on first use [5].
>

I've just tried it.  Unfortunately, it does not seem to be compatible with
PEP 495 datetime yet:

>>> t = astropy.time.Time('2016-12-31T23:59:60')
>>> t.to_datetime()
Traceback (most recent call last):
 ...
ValueError: Time (array(2016, dtype=int32), array(12, dtype=int32),
array(31, dtype=int32), array(23, dtype=int32), array(59, dtype=int32),
array(60, dtype=int32), array(0, dtype=int32)) is within a leap second but
datetime does not support leap seconds

Maybe someone can propose a feature for astropy to return
datetime(2016,12,31,23,59,59,fold=1) in this case.
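
A rough sketch of what such a fallback could look like on the datetime side
(the helper name is made up -- this is not an existing astropy API):

>>> from datetime import datetime
>>> def leap_aware_datetime(y, mo, d, h, mi, s, us=0):
...     # map second 60 onto a PEP 495 fold=1 repeat of second 59
...     if s == 60:
...         return datetime(y, mo, d, h, mi, 59, us, fold=1)
...     return datetime(y, mo, d, h, mi, s, us)
...
>>> leap_aware_datetime(2016, 12, 31, 23, 59, 60)
datetime.datetime(2016, 12, 31, 23, 59, 59, fold=1)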


>
> [1] http://docs.astropy.org/en/stable/time/#time-scales-for-time-deltas
> [2] http://docs.astropy.org/en/stable/time/#writing-a-custom-format
> [3] "Leap second day utc2tai interpolation"
> https://github.com/astropy/astropy/issues/5369
> [4] https://github.com/astropy/astropy/pull/4436
> [5] http://docs.astropy.org/en/stable/utils/iers.html
>
>>
>>
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-17 Thread Wes Turner
https://en.wikipedia.org/wiki/Leap_second :

> Insertion of each UTC leap second is usually decided about six months in
> advance by the International Earth Rotation and Reference Systems Service
> (IERS), when needed to ensure that the difference between the UTC and UT1
> readings will never exceed 0.9 seconds

On Thursday, May 17, 2018, Wes Turner  wrote:

> AstroPy solves for leap seconds [1][2] according to the IAU ERFA (SOFA)
> library [3] and the IERS-B and IERS-A tables [4]. IERS-B tables ship with
> AstroPy. The latest IERS-A tables ("from 1973 though one year into the
> future") auto-download on first use [5].
>
> [1] http://docs.astropy.org/en/stable/time/#time-scales-for-time-deltas
> [2] http://docs.astropy.org/en/stable/time/#writing-a-custom-format
> [3] "Leap second day utc2tai interpolation"
> https://github.com/astropy/astropy/issues/5369
> [4] https://github.com/astropy/astropy/pull/4436
> [5] http://docs.astropy.org/en/stable/utils/iers.html
>
>
> On Thursday, May 17, 2018, Alexander Belopolsky <
> alexander.belopol...@gmail.com> wrote:
>
>>
>>
>> On Thu, May 17, 2018 at 3:13 PM Tim Peters  wrote:
>>
>>> [Chris Barker]
>>> > Does that support the other way -- or do we never lose a leap second
>>> anyway?
>>> > (showing ignorance here)
>>>
>>> Alexander covered the Python part of this,  ...
>>>
>>
>> No, I did not.  I did not realize that the question was about skipping a
>> second instead of inserting it.  Yes, regardless of whether it is possible
>> given the physics of Earth rotation, negative leap seconds can be
>> supported.  They simply become "gaps" in PEP 495 terminology.  Check out
>> PEP 495 and read "second" whenever you see "hour". :-)
>>
>
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-17 Thread Wes Turner
AstroPy solves for leap seconds [1][2] according to the IAU ERFA (SOFA)
library [3] and the IERS-B and IERS-A tables [4]. IERS-B tables ship with
AstroPy. The latest IERS-A tables ("from 1973 though one year into the
future") auto-download on first use [5].

[1] http://docs.astropy.org/en/stable/time/#time-scales-for-time-deltas
[2] http://docs.astropy.org/en/stable/time/#writing-a-custom-format
[3] "Leap second day utc2tai interpolation"
https://github.com/astropy/astropy/issues/5369
[4] https://github.com/astropy/astropy/pull/4436
[5] http://docs.astropy.org/en/stable/utils/iers.html
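
For concreteness, a minimal sketch of the leap-second-aware arithmetic this
enables (assuming a recent astropy with its bundled leap-second table):

from astropy.time import Time

t1 = Time('2016-12-31T23:59:59', scale='utc')
t2 = Time('2017-01-01T00:00:00', scale='utc')
print((t2 - t1).sec)   # ~2.0 -- the inserted leap second is counted; naive
                       # datetime arithmetic would report 1 second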


On Thursday, May 17, 2018, Alexander Belopolsky <
alexander.belopol...@gmail.com> wrote:

>
>
> On Thu, May 17, 2018 at 3:13 PM Tim Peters  wrote:
>
>> [Chris Barker]
>> > Does that support the other way -- or do we never lose a leap second
>> anyway?
>> > (showing ignorance here)
>>
>> Alexander covered the Python part of this,  ...
>>
>
> No, I did not.  I did not realize that the question was about skipping a
> second instead of inserting it.  Yes, regardless of whether it is possible
> given the physics of Earth rotation, negative leap seconds can be
> supported.  They simply become "gaps" in PEP 495 terminology.  Check out
> PEP 495 and read "second" whenever you see "hour". :-)
>
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-17 Thread Alexander Belopolsky
On Thu, May 17, 2018 at 3:13 PM Tim Peters  wrote:

> [Chris Barker]
> > Does that support the other way -- or do we never lose a leap second
> anyway?
> > (showing ignorance here)
>
> Alexander covered the Python part of this,  ...
>

No, I did not.  I did not realize that the question was about skipping a
second instead of inserting it.  Yes, regardless of whether it is possible
given the physics of Earth rotation, negative leap seconds can be
supported.  They simply become "gaps" in PEP 495 terminology.  Check out
PEP 495 and read "second" whenever you see "hour". :-)
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-17 Thread Chris Barker via Python-ideas
now we really have gotten OT...

But thanks! that was my question!

-CHB


> Alexander covered the Python part of this, so I'll answer the possible
> higher-level question:  we haven't yet needed a "negative" leap
> second, and it's considered unlikely (but not impossible) that we ever
> will.  That's because the Earth's rotation is inexorably slowing, so
> the mean solar day inexorably lengthens when measured by SI seconds.
>
> Other things can cause the Earth's rotation to speed up temporarily
> (like some major geological events), but they've only been able to
> overcome factors acting to slow rotation for brief periods, and never
> yet got near to overcoming them by a full second.
>



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R   (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-17 Thread Tim Peters
[Chris Barker]
> Does that support the other way -- or do we never lose a leap second anyway?
> (showing ignorance here)

Alexander covered the Python part of this, so I'll answer the possible
higher-level question:  we haven't yet needed a "negative" leap
second, and it's considered unlikely (but not impossible) that we ever
will.  That's because the Earth's rotation is inexorably slowing, so
the mean solar day inexorably lengthens when measured by SI seconds.

Other things can cause the Earth's rotation to speed up temporarily
(like some major geological events), but they've only been able to
overcome factors acting to slow rotation for brief periods, and never
yet got near to overcoming them by a full second.
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-17 Thread Alexander Belopolsky
On Thu, May 17, 2018 at 2:51 PM Alexander Belopolsky <
alexander.belopol...@gmail.com> wrote:

>
> TAI                  | UTC
> ---------------------+--------------------
> 2016-12-31T23:59:35  | 2016-12-31T23:59:59
> 2016-12-31T23:59:36  | 2016-12-31T23:59:60
> 2016-12-31T23:59:37  | 2017-01-01T00:00:00
>
> this correspondence can be implemented in Python using the following
> datetime objects:
>
> TAI                            | UTC
> -------------------------------+------------------------------
> datetime(2016,12,31,23,59,35)  | datetime(2016,12,31,23,59,59)
> datetime(2016,12,31,23,59,36)  | datetime(2016,12,31,23,59,59,fold=1)
> datetime(2016,12,31,23,59,37)  | datetime(2017,1,1,0,0,0)
>
>
>
Correction: 2016-01-01 in the tables I presented before should be read as
2017-01-01 and similarly for the datetime fields.
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-17 Thread Alexander Belopolsky
On Thu, May 17, 2018 at 1:33 PM Chris Barker  wrote:
>
> On Thu, May 17, 2018 at 10:14 AM, Alexander Belopolsky <
alexander.belopol...@gmail.com> wrote:
>>  [...] Since the implementation of PEP 495, it is
>> possible to represent the 23:59:60 as 23:59:59 with the "fold" bit set.
>> Of course, the repeated 23:59:59 will be displayed and behave exactly the
>> same as the first 23:59:59, but a 3rd party library can be written to take
>> the "fold" bit into account in temporal operations.
>
>
> Does that support the other way -- or do we never lose a leap second
> anyway? (showing ignorance here)
>

I am not sure I understand your question.  All I said was that since PEP
495, it became possible to write a pair of functions to convert between TAI
and UTC timestamps without any loss of information.

For example, around the insertion  of the last leap second at the end of
2016, we had the following sequence of seconds:

TAI                  | UTC
---------------------+--------------------
2016-12-31T23:59:35  | 2016-12-31T23:59:59
2016-12-31T23:59:36  | 2016-12-31T23:59:60
2016-12-31T23:59:37  | 2016-01-01T00:00:00

this correspondence can be implemented in Python using the following
datetime objects:

TAI                            | UTC
-------------------------------+------------------------------
datetime(2016,12,31,23,59,35)  | datetime(2016,12,31,23,59,59)
datetime(2016,12,31,23,59,36)  | datetime(2016,12,31,23,59,59,fold=1)
datetime(2016,12,31,23,59,37)  | datetime(2016,1,1,0,0,0)


Of course, Python will treat datetime(2016,12,31,23,59,59) and
datetime(2016,12,31,23,59,59,fold=1) as equal, but you should be able to use
your utc_to_tai(t) function to translate to TAI, do the arithmetic there and
translate back with the tai_to_utc(t) function.  Wherever tai_to_utc(t)
returns a datetime instance with fold=1, you should add that to the seconds
field before displaying.
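
Here's a minimal sketch of such a pair, hard-coded to just the 2016-12-31 leap
second and assuming the published TAI-UTC offsets (36 s going into it, 37 s
after); a real library would of course drive this from the IERS table:

from datetime import datetime, timedelta

LEAP = datetime(2016, 12, 31, 23, 59, 59)   # the repeated UTC second; fold=1 means :60

def utc_to_tai(t):
    # fold=1 on the repeated second (and anything later) gets the post-leap offset
    after = t > LEAP or (t == LEAP and t.fold == 1)
    return t.replace(fold=0) + timedelta(seconds=37 if after else 36)

def tai_to_utc(t):
    leap_start = LEAP + timedelta(seconds=37)   # TAI instant where 23:59:60 begins
    if leap_start <= t < leap_start + timedelta(seconds=1):
        return (t - timedelta(seconds=37)).replace(fold=1)
    offset = 37 if t >= leap_start + timedelta(seconds=1) else 36
    return t - timedelta(seconds=offset)

# Arithmetic done "in TAI" now counts the inserted second:
d = utc_to_tai(datetime(2017, 1, 1)) - utc_to_tai(datetime(2016, 12, 31, 23, 59, 59))
print(d.total_seconds())   # 2.0, not 1.0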

> But still, now datetime *could* support leap seconds (which is nice,
> because before, 23:59:60 was illegal, so it couldn't even be done at all),
> but that doesn't mean that it DOES support leap seconds

By the same logic the standard library datetime does not support any local
time because it does not include the timezone database.  This is where the
3rd party developers should fill the gap.
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-17 Thread Chris Barker via Python-ideas
On Thu, May 17, 2018 at 10:14 AM, Alexander Belopolsky <
alexander.belopol...@gmail.com> wrote:

> > The other issue with leap-seconds is that python's datetime doesn't
> > support them :-)
>
> That's not entirely true.  Since the implementation of PEP 495, it is
> possible to represent the 23:59:60 as 23:59:59 with the "fold" bit set.  Of
> course, the repeated 23:59:59 will be displayed and behave exactly the same
> as the first 23:59:59, but a 3rd party library can be written to take the
> "fold" bit into account in temporal operations.
>

Does that support the other way -- or do we never lose a leap second
anyway? (showing ignorance here)

But still, now datetime *could* support leap seconds (which is nice,
because before, 23:59:60 was illegal, so it couldn't even be done at all),
but that doesn't mean that it DOES support leap seconds

-CHB

-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R   (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-17 Thread Alexander Belopolsky
On Thu, May 17, 2018 at 12:56 PM Chris Barker via Python-ideas <
python-ideas@python.org> wrote:

> The other issue with leap-seconds is that python's datetime doesn't
> support them :-)

That's not entirely true.  Since the implementation of PEP 495, it is
possible to represent the 23:59:60 as 23:59:59 with the "fold" bit set.  Of
course, the repeated 23:59:59 will be displayed and behave exactly the same
as the first 23:59:59, but a 3rd party library can be written to take the
"fold" bit into account in temporal operations.
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-17 Thread Chris Barker via Python-ideas
On Tue, May 15, 2018 at 11:21 AM, Rob Speer  wrote:

>
> I'm sure that the issue of "what do you call the leap second itself" is
> not the problem that Chris Barker is referring to. The problem with leap
> seconds is that they create unpredictable differences between UTC and real
> elapsed time.
>
> You can represent a timedelta of exactly 10^8 seconds, but if you add it
> to the current time, what should you get? What UTC time will it be in 10^8
> real-time seconds? You don't know, and neither does anybody else, because
> you don't know how many leap seconds will occur in that time.
>

indeed -- even if you only care about the past, where you *could* know the
leap seconds -- they are, by their very nature, of second precision --
which means right before a leap second occurs, your "time" could be off by up
to a second (or a half second?)

It's kind of like using a carpenter's tape measure to locate points from
an electron microscope scan :-)

The other issue with leap-seconds is that python's datetime doesn't support
them :-)

And neither do most date-time libraries.

-CHB


-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R   (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-15 Thread Rob Speer
On Mon, 14 May 2018 at 12:17 Chris Angelico  wrote:

> On Tue, May 15, 2018 at 2:05 AM, Chris Barker via Python-ideas
>  wrote:
> > But my question is whether high precision timedeltas belongs with
> "calendar
> > time" at all.
> >
> > What with UTC and leap seconds, and all that, it gets pretty ugly, when
> down
> > to the second or sub-second, what a given datetime really means.
>
> UTC and leap seconds aren't a problem. When there's a leap second, you
> have 23:59:60 (or you repeat 23:59:59, if you can't handle second
> #60). That's pretty straight-forward, perfectly well-defined.
>

I'm sure that the issue of "what do you call the leap second itself" is not
the problem that Chris Barker is referring to. The problem with leap
seconds is that they create unpredictable differences between UTC and real
elapsed time.

You can represent a timedelta of exactly 10^8 seconds, but if you add it to
the current time, what should you get? What UTC time will it be in 10^8
real-time seconds? You don't know, and neither does anybody else, because
you don't know how many leap seconds will occur in that time.

The ways to resolve this problem are:
(1) fudge the definition of "exactly 10^8 seconds" to disregard any leap
seconds that occur in that time interval in the real world, making it not
so exact anymore
(2) use TAI instead of UTC, as GPS systems do
(3) leave the relationship between time deltas and calendar time undefined,
as some in this thread are suggesting
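
For what it's worth, (1) is exactly what datetime arithmetic already does
today:

>>> from datetime import datetime, timedelta
>>> datetime(2015, 1, 1) + timedelta(seconds=10**8)   # two leap seconds fall in this span
datetime.datetime(2018, 3, 3, 9, 46, 40)

and the result would be the same whether those leap seconds had happened or
not.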
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-14 Thread Wes Turner
>From "[Python-Dev] PEP 564: Add new time functions with nanosecond
resolution" (2017-10-16 hh:mm ss[...] -Z)
https://groups.google.com/forum/m/#!topic/dev-python/lLJuW_asYa0 :

> Maybe that's why we haven't found any CTCs (closed timelike curves) yet.
>
> Aligning simulation data in context to other events may be enlightening:
> is there a good library for handling high precision time units in Python
> (and/or CFFI)?

There's not yet an ISO8601-like standard for this level of time/date
precision.

Correlating particle events between experiments does require date+time.

On Monday, May 14, 2018, David Mertz  wrote:

> Chris is certainly right. A program that deals with femtosecond intervals
> should almost surely start by defining a "start of experiment" epoch where
> microseconds are fine. Then within that epoch, events should be monotonic
> integers for when measured or calculated times are marked.
>
> I can easily see reasons why a specialized wrapped int for
> FemtosecondsFromStart could be useful. But that's still a specialized need
> for a third party library. One possible use of this class might be to
> interoperate with datetimes or timedeltas. Conceivably such
> interoperability could be dealing with leap seconds when needed. But
> "experiment time" should be a simple monotonic and uniform counter.
>
> On Mon, May 14, 2018, 6:35 PM Chris Barker - NOAA Federal via Python-ideas
>  wrote:
>
>> >
>> > UTC and leap seconds aren't a problem.
>>
>> Of course they are a problem -- why else would they not be implemented
>> in datetime?
>>
>> But my point is that a given datetime stamp or calculation could be off
>> by a second or so depending on whether and how leap seconds are
>> implemented.
>>
>> It just doesn’t seem like a good idea to be handling months and
>> femtoseconds with the same “encoding”
>>
>> -CHB
>> ___
>> Python-ideas mailing list
>> Python-ideas@python.org
>> https://mail.python.org/mailman/listinfo/python-ideas
>> Code of Conduct: http://python.org/psf/codeofconduct/
>>
>
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-14 Thread David Mertz
Chris is certainly right. A program that deals with femtosecond intervals
should almost surely start by defining a "start of experiment" epoch where
microseconds are fine. Then within that epoch, events should be monotonic
integers for when measured or calculated times are marked.

I can easily see reasons why a specialized wrapped int for
FemtosecondsFromStart could be useful. But that's still a specialized need
for a third party library. One possible use of this class might be to
interoperate with datetimes or timedeltas. Conceivably such
interoperability could be dealing with leap seconds when needed. But
"experiment time" should be a simple monotonic and uniform counter.

On Mon, May 14, 2018, 6:35 PM Chris Barker - NOAA Federal via Python-ideas <
python-ideas@python.org> wrote:

> >
> > UTC and leap seconds aren't a problem.
>
> Of course they are a problem -- why else would they not be implemented
> in datetime?
>
> But my point is that a given datetime stamp or calculation could be off
> by a second or so depending on whether and how leap seconds are
> implemented.
>
> It just doesn’t seem like a good idea to be handling months and
> femtoseconds with the same “encoding”
>
> -CHB
> ___
> Python-ideas mailing list
> Python-ideas@python.org
> https://mail.python.org/mailman/listinfo/python-ideas
> Code of Conduct: http://python.org/psf/codeofconduct/
>
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-14 Thread Chris Barker - NOAA Federal via Python-ideas
>
> UTC and leap seconds aren't a problem.

Of course they are a problem -- why else would they not be implemented
in datetime?

But my point is that a given datetime stamp or calculation could be off
by a second or so depending on whether and how leap seconds are
implemented.

It just doesn’t seem like a good idea to be handling months and
femtoseconds with the same “encoding”

-CHB
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-14 Thread Chris Angelico
On Tue, May 15, 2018 at 2:05 AM, Chris Barker via Python-ideas
 wrote:
> But my question is whether high precision timedeltas belongs with "calendar
> time" at all.
>
> What with UTC and leap seconds, and all that, it gets pretty ugly, when down
> to the second or sub-second, what a given datetime really means.

UTC and leap seconds aren't a problem. When there's a leap second, you
have 23:59:60 (or you repeat 23:59:59, if you can't handle second
#60). That's pretty straight-forward, perfectly well-defined.

No, the REAL problems come from relativity.

> If I were to work with high precision measurements, experiments, etc, I'd
> use a "nanoseconds since" representation, where the "epoch" would likely be
> the beginning of the experiment, of something relevant.

That's an unrelated form of time calculation. For that kind of thing,
you probably want to ignore calendars and use some form of monotonic
time; but also, if you want to go to (or below) nanosecond resolution,
you'll need your clock to actually be that accurate, which most likely
means you're not using a computer's clock. Femtosecond timestamping
would basically be just taking numbers given to you by an external
device and using them as sequence points - clocks and calendars become
irrelevant. The numbers might as well be frame numbers in a
super-high-speed filming of the event.
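
For example, the PEP 564 additions in 3.7 already cover the calendar-free part
entirely in the stdlib (the workload here is just a stand-in):

import time

t0 = time.perf_counter_ns()                  # monotonic integer nanoseconds (3.7+, PEP 564)
work = sum(range(10**6))                     # stand-in for whatever is being timed
elapsed_ns = time.perf_counter_ns() - t0     # a calendar-free duration relative to t0
print(elapsed_ns, "ns")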

ChrisA
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-14 Thread Chris Barker via Python-ideas
On Thu, May 10, 2018 at 6:13 PM, Alexander Belopolsky <
alexander.belopol...@gmail.com> wrote:

> > Is there interest in a PEP for extending time, datetime / timedelta for
> > arbitrary or extended precision fractional seconds?
>
> Having seen the utter disaster that similar ideas brought to numpy, I would
> say: no.
>

I'm not sure the "disaster" was due to this idea nor, frankly, is
datetime64 a disaster at all, though certainly far from perfect.

But my question is whether high precision timedeltas belongs with "calendar
time" at all.

What with UTC and leap seconds, and all that, it gets pretty ugly, when
down to the second or sub-second, what a given datetime really means.

If I were to work with high precision measurements, experiments, etc, I'd
use a "nanoseconds since" representation, where the "epoch" would likely be
the beginning of the experiment, of something relevant.

Note that this is used in netcdf CF formats, where datetimes are expressed in
things like:

"hours since 1970-01-01:00:00"

granted, it's mostly so that the values can be stored as an array of
simple scalars, but it does allow precision and an epoch that are suited to
the data at hand.

NOTE: One source of the "disaster" of numpy's datetime64 is that you can set
the precision, but NOT the epoch -- which is kind of problematic if you really
want femtosecond precision for something not in 1970 :-)
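
A sketch of that "nanoseconds since an experiment epoch" pattern -- keep the
raw counts as int64 and only convert for display (the epoch and tick values
here are made up):

import numpy as np

epoch = np.datetime64('2018-05-14T09:00:00', 'ns')       # start-of-experiment epoch
ticks = np.array([0, 1_500, 2_750_000], dtype='int64')   # raw counts: ns since epoch
stamps = epoch + ticks.astype('timedelta64[ns]')
print(stamps[-1])   # 2018-05-14T09:00:00.002750000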

-CHB

> On the other hand, nanoseconds are slowly making their way to the stdlib
> and to add nanoseconds to datetime we only need a fully backward compatible
> implementation, not even a PEP.
>
> See .
> ___
> Python-ideas mailing list
> Python-ideas@python.org
> https://mail.python.org/mailman/listinfo/python-ideas
> Code of Conduct: http://python.org/psf/codeofconduct/
>



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R   (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115   (206) 526-6317   main reception

chris.bar...@noaa.gov
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-10 Thread Nathaniel Smith
You don't mention the option of allowing time.microseconds to be a
float, and I was curious about that since if it did work, then that
might be a relatively smooth extension of the current API. The highest
value you'd store in the microseconds field is 1e6, and at values
around 1e6, double-precision floating point has precision of about
1e-10:

In [8]: 1e6 - np.nextafter(1e6, 0)
Out[8]: 1.1641532182693481e-10

So that could represent values to precision of ~0.116 femtoseconds, or
116 attoseconds. Too bad. Femtosecond precision would cover a lot of
cases, if you really need attoseconds then it won't work.
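
Spelling out the unit conversion (the values in the comments are approximate,
not exact reprs):

import numpy as np

ulp_us = 1e6 - np.nextafter(1e6, 0)   # ~1.16e-10 microseconds of precision near 1e6
print(ulp_us * 1e-6)                  # ~1.16e-16 seconds
print(ulp_us * 1e12)                  # ~116 attoseconds (1 microsecond = 1e12 attoseconds)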

-n


On Thu, May 10, 2018 at 1:30 PM, Ed Page  wrote:
> Greetings,
>
> Is there interest in a PEP for extending time, datetime / timedelta for 
> arbitrary or extended precision fractional seconds?
>
> My company designs and manufactures scientific hardware that typically 
> operate with nanoseconds -- sometimes even attoseconds -- levels of 
> precision.  We’re in the process of providing Python APIs for some of these 
> products and need  to expose the full accuracy of the data to our customers.  
> Doing so would allow developers to do things like timestamp analog 
> measurements for correlating with other events in their system, or precisely 
> schedule a future time event for correctly interoperating  with other 
> high-speed devices.
>
> The API we’ve been toying with is adding two new fields to time, datetime and 
> timedelta
> - frac_seconds (int)
> - frac_seconds_exponent (int or new SITimeUnit enum)
>
> time.microseconds would be turned into a property that wraps frac_seconds for 
> compatibility
>
> Challenges
> - Defining the new `max` or `resolution`
> - strftime / strptime.  I propose that we do nothing, just leave formatting / 
> parsing to use `microseconds` at best.  On the other hand, __str__ could just 
> specify the fractional seconds using scientific or engineering notation.
>
> Alternatives
> - My company create our own datetime library
>   - Continued fracturing of time ... ecosystem (datetime, arrow, pendulum, 
> delorean, datetime64, pandas.Timestamp – all of which offer varying degrees 
> of compatibility)
> - Add an `attosecond` field and have `microsecond` wrap this.
>   - Effectively same except hard code `frac_seconds_exponent` to lowest value
>   - The most common cases (milliseconds, microseconds) will always pay the 
> cost of using a bigint as compared to the proposal which is a "pay for what 
> you use" approach
>   - How do we define what is "good enough" precision?
> - Continue to subdivide time by adding `nanosecond` that is "nanoseconds
> since last microsecond", `picosecond` that is "picoseconds since last
> nanosecond", and `attosecond` field that is "attoseconds since last
> picosecond"
>   - Possibly surprising API; people might expect `picosecond` to be an offset 
> since last second
>   - Messy base 10 / base 2 conversions
> - Have `frac_seconds` be a float
>   - This has precision issues.
>
> If anyone wants to have an impromptu BoF on the subject, I'm available at 
> PyCon.
>
> Thanks
> Ed Page
> ___
> Python-ideas mailing list
> Python-ideas@python.org
> https://mail.python.org/mailman/listinfo/python-ideas
> Code of Conduct: http://python.org/psf/codeofconduct/



-- 
Nathaniel J. Smith -- https://vorpus.org
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-10 Thread David Mertz
In fairness, Pandas, datetime64, and Arrow are really the same thing. I
don't know about Pendulum or Delorean. A common standard would be great, or
at least strong interoperability. I'm sure the authors of those projects
would want that... Arrow is entirely about interoperability, after all.

On Thu, May 10, 2018, 7:11 PM Ethan Furman  wrote:

> On 05/10/2018 10:30 AM, Ed Page wrote:
>
> > Alternatives
> > - My company create our own datetime library
> >- Continued fracturing of time ... ecosystem (datetime, arrow,
> pendulum, delorean, datetime64, pandas.Timestamp
>
> Or, team up with one of those (if you can).
>
> --
> ~Ethan~
>
>
>
> ___
> Python-ideas mailing list
> Python-ideas@python.org
> https://mail.python.org/mailman/listinfo/python-ideas
> Code of Conduct: http://python.org/psf/codeofconduct/
>
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-10 Thread Ethan Furman

On 05/10/2018 10:30 AM, Ed Page wrote:


Alternatives
- My company create our own datetime library
   - Continued fracturing of time ... ecosystem (datetime, arrow, pendulum, 
delorean, datetime64, pandas.Timestamp


Or, team up with one of those (if you can).

--
~Ethan~



___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-10 Thread Alexander Belopolsky
> Is there interest in a PEP for extending time, datetime / timedelta for
> arbitrary or extended precision fractional seconds?

Having seen the utter disaster that similar ideas brought to numpy, I would
say: no.

On the other hand, nanoseconds are slowly making their way to the stdlib
and to add nanoseconds to datetime we only need a fully backward compatible
implementation, not even a PEP.

See .
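
E.g. PEP 564 already added time.time_ns() and friends in 3.7:

import time

ns = time.time_ns()   # integer nanoseconds since the epoch (Python 3.7+, PEP 564)
secs = time.time()    # float seconds can't carry full ns resolution at current epoch values
print(ns, secs)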
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-10 Thread Guido van Rossum
I have to agree with David that this seems too specialized to make room for
in the stdlib.

On Thu, May 10, 2018, 15:16 David Mertz  wrote:

> This feels specialized enough to belong in a third party library. If that
> library can behave as transparently as possible interacting with Python
> datetime, so much the better. But the need is niche enough I don't think it
> belongs in standard library.
>
> ... this as someone who actually worked in a lab that measured MD
> simulations in attoseconds. I do understand the purpose.
>
> On Thu, May 10, 2018, 2:00 PM Ed Page  wrote:
>
>> Greetings,
>>
>> Is there interest in a PEP for extending time, datetime / timedelta for
>> arbitrary or extended precision fractional seconds?
>>
>> My company designs and manufactures scientific hardware that typically
>> operate with nanoseconds -- sometimes even attoseconds -- levels of
>> precision.  We’re in the process of providing Python APIs for some of these
>> products and need  to expose the full accuracy of the data to our
>> customers.  Doing so would allow developers to do things like timestamp
>> analog measurements for correlating with other events in their system, or
>> precisely schedule a future time event for correctly interoperating  with
>> other high-speed devices.
>>
>> The API we’ve been toying with is adding two new fields to time, datetime
>> and timedelta
>> - frac_seconds (int)
>> - frac_seconds_exponent (int or new SITimeUnit enum)
>>
>> time.microseconds would be turned into a property that wraps frac_seconds
>> for compatibility
>>
>> Challenges
>> - Defining the new `max` or `resolution`
>> - strftime / strptime.  I propose that we do nothing, just leave
>> formatting / parsing to use `microseconds` at best.  On the other hand,
>> __str__ could just specify the fractional seconds using scientific or
>> engineering notation.
>>
>> Alternatives
>> - My company create our own datetime library
>>   - Continued fracturing of time ... ecosystem (datetime, arrow,
>> pendulum, delorean, datetime64, pandas.Timestamp – all of which offer
>> varying degrees of compatibility)
>> - Add an `attosecond` field and have `microsecond` wrap this.
>>   - Effectively same except hard code `frac_seconds_exponent` to lowest
>> value
>>   - The most common cases (milliseconds, microseconds) will always pay
>> the cost of using a bigint as compared to the proposal which is a "pay for
>> what you use" approach
>>   - How do we define what is "good enough" precision?
>> - Continue to subdivide time by adding `nanosecond` that is "nanoseconds
>> since last microsecond", `picosecond` that is "picoseconds since last
>> nanosecond", and `attosecond` field that is "attoseconds since last
>> picosecond"
>>   - Possibly surprising API; people might expect `picosecond` to be an
>> offset since last second
>>   - Messy base 10 / base 2 conversions
>> - Have `frac_seconds` be a float
>>   - This has precision issues.
>>
>> If anyone wants to have an impromptu BoF on the subject, I'm available at
>> PyCon.
>>
>> Thanks
>> Ed Page
>> ___
>> Python-ideas mailing list
>> Python-ideas@python.org
>> https://mail.python.org/mailman/listinfo/python-ideas
>> Code of Conduct: http://python.org/psf/codeofconduct/
>>
> ___
> Python-ideas mailing list
> Python-ideas@python.org
> https://mail.python.org/mailman/listinfo/python-ideas
> Code of Conduct: http://python.org/psf/codeofconduct/
>
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


Re: [Python-ideas] High Precision datetime

2018-05-10 Thread David Mertz
This feels specialized enough to belong in a third party library. If that
library can behave as transparently as possible interacting with Python
datetime, so much the better. But the need is niche enough I don't think it
belongs in standard library.

... this as someone who actually worked in a lab that measured MD
simulations in attoseconds. I do understand the purpose.

On Thu, May 10, 2018, 2:00 PM Ed Page  wrote:

> Greetings,
>
> Is there interest in a PEP for extending time, datetime / timedelta for
> arbitrary or extended precision fractional seconds?
>
> My company designs and manufactures scientific hardware that typically
> operate with nanoseconds -- sometimes even attoseconds -- levels of
> precision.  We’re in the process of providing Python APIs for some of these
> products and need  to expose the full accuracy of the data to our
> customers.  Doing so would allow developers to do things like timestamp
> analog measurements for correlating with other events in their system, or
> precisely schedule a future time event for correctly interoperating  with
> other high-speed devices.
>
> The API we’ve been toying with is adding two new fields to time, datetime
> and timedelta
> - frac_seconds (int)
> - frac_seconds_exponent (int or new SITimeUnit enum)
>
> time.microseconds would be turned into a property that wraps frac_seconds
> for compatibility
>
> Challenges
> - Defining the new `max` or `resolution`
> - strftime / strptime.  I propose that we do nothing, just leave
> formatting / parsing to use `microseconds` at best.  On the other hand,
> __str__ could just specify the fractional seconds using scientific or
> engineering notation.
>
> Alternatives
> - My company create our own datetime library
>   - Continued fracturing of time ... ecosystem (datetime, arrow, pendulum,
> delorean, datetime64, pandas.Timestamp – all of which offer varying degrees
> of compatibility)
> - Add an `attosecond` field and have `microsecond` wrap this.
>   - Effectively same except hard code `frac_seconds_exponent` to lowest
> value
>   - The most common cases (milliseconds, microseconds) will always pay the
> cost of using a bigint as compared to the proposal which is a "pay for what
> you use" approach
>   - How do we define what is "good enough" precision?
> - Continue to subdivide time by adding `nanosecond` that is "nanoseconds
> since last microsecond", `picosecond` that is "picoseconds since last
> nanosecond", and `attosecond` field that is "attoseconds since last
> picosecond"
>   - Possibly surprising API; people might expect `picosecond` to be an
> offset since last second
>   - Messy base 10 / base 2 conversions
> - Have `frac_seconds` be a float
>   - This has precision issues.
>
> If anyone wants to have an impromptu BoF on the subject, I'm available at
> PyCon.
>
> Thanks
> Ed Page
> ___
> Python-ideas mailing list
> Python-ideas@python.org
> https://mail.python.org/mailman/listinfo/python-ideas
> Code of Conduct: http://python.org/psf/codeofconduct/
>
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-ideas] High Precision datetime

2018-05-10 Thread Ed Page
Greetings,
 
Is there interest in a PEP for extending time, datetime / timedelta for 
arbitrary or extended precision fractional seconds?
 
My company designs and manufactures scientific hardware that typically operates
with nanosecond -- sometimes even attosecond -- levels of precision.  We’re
in the process of providing Python APIs for some of these products and need to
expose the full accuracy of the data to our customers.  Doing so would allow 
developers to do things like timestamp analog measurements for correlating with 
other events in their system, or precisely schedule a future time event for 
correctly interoperating  with other high-speed devices. 
 
The API we’ve been toying with is adding two new fields to time, datetime and 
timedelta
- frac_seconds (int)
- frac_seconds_exponent (int or new SITimeUnit enum)
 
time.microseconds would be turned into a property that wraps frac_seconds for 
compatibility
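
To make the shape of that concrete, an illustrative stand-in (not an
implementation -- just the two proposed fields plus the compatibility
property):

from dataclasses import dataclass

@dataclass
class HighResTime:
    # Illustrative only; not a real datetime.time subclass.
    hour: int
    minute: int
    second: int
    frac_seconds: int = 0             # integer count of fractional-second units
    frac_seconds_exponent: int = -6   # each unit is 10**exponent seconds (-6 = microseconds)

    @property
    def microsecond(self):
        # Compatibility shim: express frac_seconds in whole microseconds.
        return round(self.frac_seconds * 10 ** (self.frac_seconds_exponent + 6))

t = HighResTime(23, 59, 59, frac_seconds=250, frac_seconds_exponent=-15)  # 250 femtoseconds
print(t.microsecond)   # 0 -- below microsecond resolution, but nothing was discarded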
 
Challenges
- Defining the new `max` or `resolution`
- strftime / strptime.  I propose that we do nothing, just leave formatting / 
parsing to use `microseconds` at best.  On the other hand, __str__ could just 
specify the fractional seconds using scientific or engineering notation.
 
Alternatives
- My company create our own datetime library
  - Continued fracturing of time ... ecosystem (datetime, arrow, pendulum, 
delorean, datetime64, pandas.Timestamp – all of which offer varying degrees of 
compatibility)
- Add an `attosecond` field and have `microsecond` wrap this.
  - Effectively same except hard code `frac_seconds_exponent` to lowest value
  - The most common cases (milliseconds, microseconds) will always pay the cost 
of using a bigint as compared to the proposal which is a "pay for what you use" 
approach
  - How do we define what is "good enough" precision?
- Continue to subdivide time by adding `nanosecond` that is "nanoseconds since
last microsecond", `picosecond` that is "picoseconds since last nanosecond",
and `attosecond` field that is "attoseconds since last picosecond"
  - Possibly surprising API; people might expect `picosecond` to be an offset 
since last second
  - Messy base 10 / base 2 conversions
- Have `frac_seconds` be a float
  - This has precision issues.
 
If anyone wants to have an impromptu BoF on the subject, I'm available at PyCon.

Thanks
Ed Page
___
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/