Re: [time-nuts] seeking a time/clock software architecture

2011-09-24 Thread Magnus Danielson

On 24/09/11 00:13, Jim Lux wrote:

On 9/23/11 2:00 PM, Chris Howard wrote:


Seems like a lot of unknowns. You would have to
have sensors monitoring the sensors.


I think the clock model (insofar as variations in the oscillator are
concerned) is outside the scope, as long as the effect of that variation
can be represented cleanly.

For example, with a simple 2 term linear model t = clock/rate + offset,
you can describe the *effect* of a rate, and if the rate changes, the
model changes. As long as you keep track of the rates and offsets you've
used in the past, you can reconstruct what clock was for any t or vice
versa.


Which is more or less what calibration records are about.

In fact, how all these measurements are intended to be used to provide a 
corrected measurement with uncertainty bounds is not very well covered in 
the papers; it's scattered around.


As for the model at hand... the optimum 2-term model and its update log 
might not perform best with parameters taken directly from the optimum 
3-term model. That is, the 3-term model's phase error, frequency and 
drift do not provide a good source for the 2-term model's phase error 
and frequency, and vice versa.


You might consider standardising the models in order to preserve quality 
when interchanging measurements. You might need to support several models 
and essentially provide a framework standard for transporting model data. 
Interconnecting models might need additional thought.


I need to think about that one.


A clock model predictor might use all those factors to better estimate
the rate. Having a high order polynomial model might let you not need to
update the model parameters as often. That's a tradeoff the user could
make: Do I use a 2 or 3 term clock to time transformation, and update it
once a minute, or do I use a 20 term transformation, and update it once
a month.


A 20-term model requires fairly high precision and good rate 
measurements to become meaningful. Irregular updates as such are not a 
problem, as long as you can introduce precision into the system when needed.



OK, so if you wanted an output from your Time API that gave you an
estimated uncertainty of time (think of the accuracy estimates from
GPS receivers), what would that look like?


Estimated parameters:
timeoffset, frequencyoffset, driftoffset

Uncertainty matrix

Just look in the manual of a better GPS and you essentially see the 
Kalman filter model and its parameters popping out.
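
To make that concrete, a minimal sketch of what such a report could look 
like in C; the field names are invented for illustration, not taken from 
any GPS manual or standard:

    /* Hypothetical clock-state report (field names invented here):
     * the three estimated parameters plus their covariance. */
    struct clock_state {
        double t_ref;     /* epoch of validity, seconds in the chosen scale */
        double x[3];      /* x[0] = time offset (s), x[1] = frequency offset
                             (s/s), x[2] = drift (s/s^2)                    */
        double P[3][3];   /* covariance of x; sqrt(P[0][0]) is the one-sigma
                             time uncertainty at t_ref                      */
    };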



Do you give a 1 sigma number? What about bounding values? (e.g. the API
returns the time is 12:21:03.6315, standard deviation of 1.5
millisecond, but guaranteed in the range 12:21:03 to 12:21:04)


You do not want bounding values; the noise types make them hard to 
guarantee. One-sigma values help. In all this, I keep thinking Kalman 
filter (or the like).


If you want a standard way of presenting numbers, you will have to build 
that on top of standard models.


It would be interesting to see what a combination of mutual 
synchronisation within a constellation and central synchronisation would 
yield. Your constellation would maintain contact with each other and 
pull each other to some form of average time (according to an arbitrary 
time-scale) and then use the earth link to provide long term 
corrections. A good mutual synchronisation strategy would allow the 
constellation to shrink and grow without falling completely apart.


If you provide ranging mechanisms within the constellation, path delays 
can naturally be compensated out of the time transfer.



I would expect that a fancy implementation might return different
uncertainties for different times in the future (e.g. I might say that I
can schedule something with an accuracy of 1 millisecond in the next 10
minutes, but only within 30 milliseconds when it's 24 hours away)


This is true, but if you need higher certainty at a particular time you 
can schedule a synchronisation event or two where uncertainties can be 
reduced. If you have the Kalman state and state-vector, you can run the 
predictor into the future.
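
For what it's worth, a sketch of what "running the predictor into the 
future" then amounts to, in C, using the report structure sketched above 
and a single crude white-frequency-noise term q (that noise handling is 
an assumption for illustration, not anything from a standard): propagate 
the state with the transition matrix, propagate the covariance as 
F P F^T + Q, and read off the predicted one-sigma.

    #include <math.h>
    #include <string.h>

    /* Predict the clock_state sketched above 'dt' seconds ahead.  q is a
     * single crude process-noise term (illustration only); a real filter
     * would carry a full Q matrix matched to the clock's noise types. */
    static double predict_sigma(struct clock_state *s, double dt, double q)
    {
        double F[3][3] = {
            { 1.0, dt,  0.5 * dt * dt },
            { 0.0, 1.0, dt            },
            { 0.0, 0.0, 1.0           }
        };
        double x[3], FP[3][3], P[3][3];
        int i, j, k;

        for (i = 0; i < 3; i++) {               /* x <- F x          */
            x[i] = 0.0;
            for (j = 0; j < 3; j++)
                x[i] += F[i][j] * s->x[j];
        }
        for (i = 0; i < 3; i++)                 /* FP <- F P         */
            for (j = 0; j < 3; j++) {
                FP[i][j] = 0.0;
                for (k = 0; k < 3; k++)
                    FP[i][j] += F[i][k] * s->P[k][j];
            }
        for (i = 0; i < 3; i++)                 /* P <- FP F^T       */
            for (j = 0; j < 3; j++) {
                P[i][j] = 0.0;
                for (k = 0; k < 3; k++)
                    P[i][j] += FP[i][k] * F[j][k];
            }
        P[0][0] += q * dt;                      /* crude process noise */

        memcpy(s->x, x, sizeof x);
        memcpy(s->P, P, sizeof P);
        s->t_ref += dt;
        return sqrt(P[0][0]);   /* predicted one-sigma time uncertainty, s */
    }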



The mechanics of how one might come up with this uncertainty estimate
are out of scope, but the semantics and format of how one reports it are
in scope for the architecture.


I think you will need to look at the clock models being used. It may be 
that the models are all Kalman types, just with different model 
sizes... but then someone needs a particle filter model... and what if 
an IMU-style model is used, covering both time and position?


At least a survey of feasible models needs to be done to see what can 
be done.


Cheers,
Magnus



Re: [time-nuts] seeking a time/clock software architecture

2011-09-24 Thread Magnus Danielson

On 24/09/11 00:21, Jim Lux wrote:

On 9/23/11 2:24 PM, Chris Albertson wrote:

On Fri, Sep 23, 2011 at 12:32 PM, Jim Luxjim...@earthlink.net wrote:

On 9/23/11 10:50 AM, Chris Albertson wrote:



Yes, in the general case, but in the spacecraft case, I think we're more
concerned about smoothness and such over time spans of days, maybe
weeks and
months.

More about establishing time correlation between multiple
radios/spacecraft
in a constellation, for instance.


I think it is better to have your system be usable in the real world
and then have the spacecraft use real-world standards when it can. If it
can handle the full general case then it will work in the spacecraft
too. So Chinese lunar calendars are a good mental exercise. At any
rate, a piecewise 2nd-order polynomial will work in all cases I can
think of, because you can always make the pieces really small if need
be, to the point where it becomes a table lookup.


hmm.. 2nd order for time, or 2nd order for rate (3rd order for time). I
keep thinking it would be nice to have the derivative of rate be
continuous (although I confess I can't think of anything beyond gut feel
for that). Maybe for all the common cases that's sufficient for a
prediction into the future over a reasonable time






Spacecraft spend a fair amount of time on the ground in testing.
People swap out parts. I work in telemetry and you should see the
database of tens of thousands of polynomial functions that must be
used to process data from, say, a Delta IV. It's not only clocks but
dozens of sensors that get changed out in the months preceding launch.



Yes.. And there's no standard form that I've been able to discern for
how those polynomials are specified. It's
vehicle/spacecraft/instrument/software tool specific.

So if you're writing a program to handle it automatically, you need to
code up something special each time. These days, we get telemetry
calibration in forms like .pdf files generated from a word document,
plots from Matlab, Excel spreadsheets in some unique form, various and
sundry import/export files from whatever program they're using to
process telemetry, and so forth. There was an effort a few years back to
try and standardize mission data systems but I don't know that it ever
really worked. The cost to write those custom ingest routines is small
in the context of a $150M mission every couple or three years.

(maybe there is a standard for this.. I know there is an IEEE standard
for sensor calibration data.. I should take a look at it again.)


Once you have pulled the data out of telemetry or whatever, putting it 
in XML form according to a DTD fitting your needs should not be too 
hard, and further processing to extract the data should not take too 
much effort.


Essentially what RINEX does, but without the XML wrapper.

Cheers,
Magnus



Re: [time-nuts] seeking a time/clock software architecture

2011-09-24 Thread Jim Lux

On 9/24/11 1:58 AM, Magnus Danielson wrote:



It would be interesting to see what a combination of mutual
synchronisation within a constellation and central synchronisation would
yield. Your constellation would maintain contact with each other and
pull each other to some form of average time (according to an arbitrary
time-scale) and then use the earth link to provide long term
corrections. A good mutual synchronisation strategy would allow the
constellation to shrink and grow without falling completely apart.

If you provide ranging mechanisms within the constellation, path delays
can naturally be compensated out of the time transfer.



Precisely so.  I figure the whole synchronization/syntonization of an 
ensemble of clocks of varying quality with aperiodic updates has 
probably been addressed in the literature in some way.






I would expect that a fancy implementation might return different
uncertainties for different times in the future (e.g. I might say that I
can schedule something with an accuracy of 1 millisecond in the next 10
minutes, but only within 30 milliseconds when it's 24 hours away)


This is true, but if you need higher certainty at a particular time you
can schedule a synchronisation event or two where uncertainties can be
reduced. If you have the Kalman state and state-vector, you can run the
predictor into the future.


That is what I was thinking.







Re: [time-nuts] seeking a time/clock software architecture

2011-09-24 Thread Magnus Danielson

On 24/09/11 15:12, Jim Lux wrote:

On 9/24/11 1:58 AM, Magnus Danielson wrote:



It would be interesting to see what a combination of mutual
synchronisation within a constellation and central synchronisation would
yield. Your constellation would maintain contact with each other and
pull each other to some form of average time (according to an arbitrary
time-scale) and then use the earth link to provide long term
corrections. A good mutual synchronisation strategy would allow the
constellation to shrink and grow without falling completely apart.

If you provide ranging mechanisms within the constellation, path delays
can naturally be compensated out of the time transfer.



Precisely so. I figure the whole synchronization/syntonization of an
ensemble of clocks of varying quality with aperiodic updates has
probably been addressed in the literature in some way.


It has. I can dig up some references for you.

The trivial approach essentially pulls the clocks towards each other. 
The more complex ones include weighting. This problem has been beaten 
to death.


NIST has a line of publications on their A1 atomic time scale and the 
algorithms for it, actually including code.


Mutual synchronisation has also been investigated in telecom settings. 
Essentially, if two clocks steer against each other they will lock 
halfway between each other. Common-mode changes will still affect them, 
but differential-mode changes (including noise) will be somewhat reduced.
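
A toy illustration of that halfway locking (purely illustrative numbers, 
nothing from the telecom literature): each clock steers a fraction of the 
way toward the pair average at every exchange, so the differential error 
decays geometrically while the common-mode error is untouched, which is 
exactly why the earth link is still needed.

    #include <stdio.h>

    int main(void)
    {
        double a = 10.0e-6, b = -4.0e-6;    /* initial time errors, seconds */
        const double gain = 0.25;           /* steering gain per exchange   */
        int i;

        for (i = 0; i < 20; i++) {
            double avg = 0.5 * (a + b);     /* the mutual "average time"    */
            a += gain * (avg - a);          /* each clock pulls toward it   */
            b += gain * (avg - b);
            printf("step %2d: differential %+.3e s  common mode %+.3e s\n",
                   i, a - b, 0.5 * (a + b));
        }
        /* The differential term goes to zero; the common mode stays at
         * +3e-6 s forever: mutual sync reduces differential error but
         * cannot remove a common offset. */
        return 0;
    }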





I would expect that a fancy implementation might return different
uncertainties for different times in the future (e.g. I might say that I
can schedule something with an accuracy of 1 millisecond in the next 10
minutes, but only within 30 milliseconds when it's 24 hours away)


This is true, but if you need higher certainty at a particular time you
can schedule a synchronisation event or two where uncertainties can be
reduced. If you have the Kalman state and state-vector, you can run the
predictor into the future.


That is what I was thinking.


That would be fairly trivial.

We should discuss filter models then.

Cheers,
Magnus



Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Poul-Henning Kamp
In message 4e7c9fa6.1000...@earthlink.net, Jim Lux writes:


The standard currently defines a time API with some simple features to 
set and get time, nominally defined in terms of a transformation from 
some base clock (i.e. there's a default transformation of the form 
reported time = k1 * raw clock + k2).  In the current standard, time 
is carried around as 32-bit seconds + 32-bit nanoseconds (which is at 
least familiar to most people, being similar to the POSIX seconds + 
microseconds from gettimeofday())

Take a look at FreeBSD's timecounters; what you are asking for
sounds pretty much like what I did 15 years ago:

http://phk.freebsd.dk/pubs/timecounter.pdf

I used a 32.64 internal format, to avoid rounding errors, particularly
in your k1 term.
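
For readers who have not seen it, a simplified sketch of the idea, in the 
spirit of FreeBSD's bintime but with invented names, so treat it as an 
illustration rather than the actual kernel code:

    #include <stdint.h>

    /* 32.64 fixed point: 32-bit seconds plus a 64-bit binary fraction of
     * a second.  One ulp of the fraction is 2^-64 s, so repeatedly
     * applying the k1 scaling does not accumulate visible rounding error. */
    struct time_32_64 {
        uint32_t sec;
        uint64_t frac;
    };

    /* Advance by 'delta' ticks of a counter running at freq Hz, where the
     * caller precomputes scale ~= 2^64 / freq (e.g. (uint64_t)-1 / freq).
     * Assumes delta covers less than one second, as with frequent updates. */
    static void advance(struct time_32_64 *t, uint32_t delta, uint64_t scale)
    {
        uint64_t before = t->frac;

        t->frac += (uint64_t)delta * scale;   /* wraps modulo 2^64         */
        if (t->frac < before)                 /* carry out of the fraction */
            t->sec++;
    }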

I'm headed for the US east coast next week; if we get anywhere
near each other, I'd be happy to talk.  Shoot me an email: p...@freebsd.org

-- 
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
p...@freebsd.org | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.



Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Jim Lux

On 9/23/11 8:35 AM, Poul-Henning Kamp wrote:


Take a look at FreeBSD's timecounters; what you are asking for
sounds pretty much like what I did 15 years ago:

http://phk.freebsd.dk/pubs/timecounter.pdf

I used a 32.64 internal format, to avoid rounding errors, particularly
in your k1 term.

I'm headed for the US east-coast the next week, if we get anywhere
near each other, I'd be happy to talk, shoot me an email: p...@freebsd.org



Ah yes... I forgot to mention that phk's timecounter stuff has already 
been incorporated in our implementation (thanks Poul-Henning!), and we 
truncate to match the API requirement of uint32 for the nanoseconds. 
For what it's worth, my implementation uses RTEMS as the base OS, but 
at this stage I'm trying to define an architecture standard for others 
to use, with selected implementation standards as well (e.g. API or 
message formats)


It's all the more complex stuff: how does one represent a more 
sophisticated transformation?  How does one represent changes in the 
transformation (either in a log file, or in a schedule for the future) 
so that one can reconstruct a time in the past, for instance?


There are plenty of standards for how to represent time (in the space 
biz, we use CCSDS unsegmented time a lot), but the more abstract time 
management is sort of left up in the air.  For instance, there's a 
standard/recommendation that says something along the lines of "consider 
the time of the first bit in the message as the tone", in the "at the 
tone, the time is..." sense.  And there are plenty of descriptions of 
various time scales (TAI, UTC, UT1, etc.)


What I'd like to do is take the next step beyond what you promulgated 
with a representation of time and the conversion between count and time 
with a linear equation.


I'd like to propose a standard description of a higher order model of 
time and the transformation between raw clock and time (in some agreed 
upon time scale).


And I'd like to describe an architecture for manipulating this: e.g. 
when you set the time, at a simple level it means measuring the 
difference between what you have now and what you want, and adjusting 
your transformation to match.
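
A minimal sketch of what that could mean against a simple linear 
transformation, in C; the structure and names are invented for 
illustration, and a real implementation would also log the old and new 
segments (and restore the nominal rate once a slew completes):

    #include <stdint.h>

    /* Toy linear transformation, as in the default
     * "reported time = k1 * raw clock + k2". */
    struct xform {
        double rate;     /* counts per second            */
        double offset;   /* seconds added after scaling  */
    };

    /* "Set time" at raw count 'now': either step the offset, or slew by
     * trimming the rate so the error is absorbed over slew_interval
     * seconds while the reported time stays continuous. */
    static void set_time(struct xform *x, uint64_t now, double t_true,
                         double slew_interval)
    {
        double t_reported = (double)now / x->rate + x->offset;
        double err = t_true - t_reported;

        if (slew_interval <= 0.0) {
            x->offset += err;                 /* step: time jumps, rate kept */
        } else {
            /* new rate chosen so the error is gone after slew_interval */
            double new_rate = x->rate / (1.0 + err / slew_interval);
            /* keep the reported time continuous at the switch point */
            x->offset = t_reported - (double)now / new_rate;
            x->rate = new_rate;
        }
    }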





Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Poul-Henning Kamp
In message 4e7cbca1.9010...@earthlink.net, Jim Lux writes:

What I'd like to do is take the next step beyond what you promulgated 
with a representation of time and the conversion between count and time 
with a linear equation.

I'd like to propose a standard description of a higher order model of 
time and the transformation between raw clock and time (in some agreed 
upon time scale).

Ouch...

That's one tough nut to generalize...

Are you even sure it makes sense to generalize it ?

3.  The only thing worse than generalizing from one example
is generalizing from no examples at all.

(From Gettys rules for X11)


-- 
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
p...@freebsd.org | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.



Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Jim Lux

On 9/23/11 10:13 AM, Poul-Henning Kamp wrote:

In message4e7cbca1.9010...@earthlink.net, Jim Lux writes:


What I'd like to do is take the next step beyond what you promulgated
with a representation of time and the conversion between count and time
with a linear equation.

I'd like to propose a standard description of a higher order model of
time and the transformation between raw clock and time (in some agreed
upon time scale).


Ouch...

That's one tough nut to generalize...

Are you even sure it makes sense to generalize it ?

3.  The only thing worse than generalizing from one example
is generalizing from no examples at all.

(From Gettys rules for X11)




Well, that *is* why I asked the assembled multitude... you might be 
right, but I'd hate to say it's not worth it and then have someone pop 
up and say "but why don't we use XYZ standard?"  And, if we don't want 
to standardize, it's always nice to explicitly say "we are deliberately 
not specifying this, and ANY implementation conforms to the standard" 
(which means, for interoperability, you can't assume that the other side 
is doing it a particular way, so you'd have to explicitly define an 
interface description).




One aspect of why at least a standardized second-order model would be 
nice is that it allows you to make smooth, non-discontinuous changes in 
rate.  The transformation from count to time would be discontinuous in 
rate-of-rate (i.e. it would go from zero, to something, to zero), but 
continuous in terms of rate.


Even just promulgating a standard way of changing the transformation 
might be useful. For instance, that it occurs at a time defined in terms 
of the old transformation, and from that time on, we use the new 
transformation.  (This is like the daylight saving time sort of thing: 
at 2 AM old time, it is instantly 1 AM new time.)
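
As an illustration of that handover rule (names invented, nothing 
standardized here): a single quadratic segment can carry the reported 
time from the old rate to the new one over a chosen span of raw counts, 
keeping time and rate continuous at both ends so only the rate-of-rate 
steps.

    #include <stdint.h>

    /* Quadratic handover segment, valid for c0 <= c <= c0 + span.
     * dt/dc ramps linearly from s0 to s1 (seconds per count), so the
     * reported time and rate are continuous at both ends; only the
     * rate-of-rate steps, once at c0 and once at c0 + span. */
    struct blend_seg {
        uint64_t c0;       /* raw count where the blend starts       */
        double   t0;       /* reported time at c0, seconds           */
        double   s0, s1;   /* seconds per count before and after     */
        double   span;     /* blend length, raw counts               */
    };

    static double blend_time(const struct blend_seg *b, uint64_t c)
    {
        double dc = (double)(c - b->c0);
        return b->t0 + b->s0 * dc
                     + (b->s1 - b->s0) * dc * dc / (2.0 * b->span);
    }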




Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Jim Lux

On 9/23/11 10:25 AM, Jim Lux wrote:



One aspect of why at least a standardized second order model would be
nice is that it allows you to make smooth non-discontinuous changes in
rate. the transformation from count to time would be discontinuous in
rate of rate (i.e. it would go from zero, to something, to zero), but
continuous in terms of rate.



The other thing that crops up all the time in the spacecraft world is 
that you're always taking into account the light-time propagation delay 
between the two ends of a link, which varies fairly quickly.  Since the 
problem of doing something like "I want my signal to arrive at a 
particular time over there" crops up a lot, as does the general "I want 
to measure my oscillator against the one on that other spacecraft", 
something that provides a consistent computational framework (as opposed 
to one specifically designed for the application) might be useful.


For instance, GPS receivers have to do this calculation already, so the 
whole range/range rate estimation process is built in, in order to do 
the nav solution.  Each implementation probably does it a different way, 
but at least the observables are reported in a standard way as RINEX 
(tailored to the needs of GPS, e.g carrier phase is measured in cycles)




Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Chris Albertson
I'd like to propose a standard description of a higher order model of
time and the transformation between raw clock and time (in some agreed
upon time scale).


A good time transform will let you transform between time scales at
points in the far future and far past.   For example, what was the
date on the Chinese calendar for Jan 11th in 1500 BC?  My point is
that you may want to apply your transform to times not near the
present.

Two timescales can have different phase and rate.   At any instant in
time two real numbers are enough to transform the time from one system
to another.  A linear equation is enough but the rate might change
over time. I think this means a second order polynomial.

Next, I think you must always define the range where the polynomial is
valid.  For example, adding a leap second to one time scale invalidates
the polynomial and makes you use another one.

So a general purpose API would need to look at the epoch to be
transformed then select the correct polynomial.

This really amounts to a table lookup.  But you need that to handle
things like conversion from UTC to a computer's internal time.  A
computer's time can depend on silly things like the air conditioner in
the room cycling.
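
A sketch of that table lookup in C (the record layout is made up for 
illustration; a real standard would define its own): each entry carries 
its validity window in the source time scale plus the polynomial 
coefficients, and the converter refuses to extrapolate outside every 
known segment.

    #include <stddef.h>

    /* One segment of a time-scale conversion table:
     * t_out = c0 + c1*dt + c2*dt*dt with dt = t_in - valid_from,
     * valid only for valid_from <= t_in < valid_to. */
    struct scale_seg {
        double valid_from, valid_to;   /* validity window, source scale */
        double c0, c1, c2;             /* polynomial coefficients       */
    };

    /* Returns 0 on success, -1 if t_in falls outside every segment
     * (e.g. beyond the last known leap second boundary). */
    static int convert(const struct scale_seg *tab, size_t n,
                       double t_in, double *t_out)
    {
        size_t i;

        for (i = 0; i < n; i++) {
            if (t_in >= tab[i].valid_from && t_in < tab[i].valid_to) {
                double dt = t_in - tab[i].valid_from;
                *t_out = tab[i].c0 + tab[i].c1 * dt + tab[i].c2 * dt * dt;
                return 0;
            }
        }
        return -1;   /* no valid polynomial: refuse rather than extrapolate */
    }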

Chris Albertson
Redondo Beach, California



Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Jim Lux

On 9/23/11 10:50 AM, Chris Albertson wrote:

I'd like to propose a standard description of a higher order model of
time and the transformation between raw clock and time (in some agreed
upon time scale).



A good time transform will let you transform between time scales at
points in the far future and far past.   For example what was the
date on the Chinese calendar for Jan 11th in 1,500BC  My point is
that you may want to apply your transform on times not near the
present.



Yes, in the general case, but in the spacecraft case, I think we're more 
concerned about smoothness and such over time spans of days, maybe weeks 
and months.


More about establishing time correlation between multiple 
radios/spacecraft in a constellation, for instance.






Two timescales can have different phase and rate.   At any instant in
time two real numbers are enough to transform the time from one system
to another.  A linear equation is enough but the rate might change
over time. I think this means a second order polynomial.


Almost certainly, if you want rate to be continuous.



Next I think you must always define the range where the polynomial is
valid.  For example adding a leap second to one time scale invalidates
the polynomial and makes you use another one


So, how would one define that range?  I'm thinking it has to be in 
terms of the output of the transformation (i.e. in the target timescale).





So a general purpose API would need to look at the epoch to be
transformed then select the correct polynomial.


Exactly



This amounts really to a table look up.  But you need that to handle
things like conversion from UTC to a computer's internal time.  A
computer's time can depend in silly things like the air conditioner in
the room cycling



Exactly.. or the slow and majestic movement of the heavenly bodies. For 
instance, things in low earth orbit have fluctuations in temperature 
every revolution (say, around 90-100 minutes) on top of a roughly weekly 
cycle (depending on the inclination) on top of an annual cycle.  One 
doesn't necessarily need to model such a thing directly, but whatever 
scheme there is should accommodate this kind of change smoothly.



Actually, the really annoying one is where I have a good clock that's 
stable, but I need to keep adjusting time to match someone else's 
terrible clock.  Most clock disciplining/time propagation models assume 
your bad clock is following a better clock.





Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Poul-Henning Kamp
In message 4e7cdeb0.8070...@earthlink.net, Jim Lux writes:

Actually, the really annoying one is where I have a good clock that's 
stable, but I need to keep adjusting time to match someone else's 
terrible clock.  Most clock disciplining/time propagation models assume 
your bad clock is following a better clock.

That is exactly what happens when you put an OCXO or Rb in a computer
and run NTPD against a server across the internet :-)


I still have a hard time drawing a boundary about this next level up,
and maybe I'm misunderstanding you, so let me think out loud for
a moment:


It's pretty obvious that you can build a suitably general mathematical
model that will cover anything you can expect to encounter:

A polynomial of a dozen degrees will catch any PLL-like regulation
pretty well; add a Fourier series for periodic terms like your
temperature variations, and finally chop it into segments to
correctly deal with discontinuities from power failures or
upsets.

But isn't that so general that it becomes meaningless?

Determining two or three dozen Finagle constants doesn't sound like
anything close to real-time to me, and it all hinges crucially
on the mathematical model being expressive enough.

Something like the SOHO unthaw would be a really tough
challenge to model I think.

The opposite approach is to accept that clock-modelling is not the
standardized operation, but that representing the data to feed into the
clock-modelling software should be a standard format, to facilitate
model reuse.

Some of that data is pretty obvious:
    Time series of clock offset estimates:
        When
        Which other clock
        Uncertainty of other clock
        Measured Offset
        Uncertainty of Measured Offset
    Craft orbital params
        XYZT, model gets to figure out what is nearby?
        or
        Parametric model (in orbit about, ascending node...)

And then it gets nasty:
    Vehicle Thermal balance model
        a function of:
            Vehicle configuration
            Vehicle orientation
            Nearby objects (sun, planets, moon)
            Wavelength

    Clock model:
        a function of:
            vehicle temperature,
            bus voltage
            gravity
            magnetic fields from craft
            vibration (micrometeorites, think: Hipparcos)
            clock age
            random clock internal events

And the list probably goes on and on, until we come to individual
component failure effects.
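
Just to make the "obvious" part concrete, one record of that offset time
series could look something like the following C struct; the field names
are illustrative only, a real standard would define its own:

    #include <stdint.h>

    /* One clock-comparison observation, roughly the minimum the list
     * above asks for. */
    struct clock_obs {
        uint64_t when;            /* local raw count at the measurement     */
        uint32_t other_clock_id;  /* which clock we compared against        */
        double   other_sigma;     /* one-sigma uncertainty of that clock, s */
        double   offset;          /* measured offset, local minus other, s  */
        double   offset_sigma;    /* one-sigma uncertainty of the offset, s */
    };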

Missing in this picture are the organizational boundaries:
The mission data comes from one place, and the clock model
or clock parameters are probably delivered by the manufacturer
of the specific device?

How many of these parameters you need to include will of course
depend on the exact vehicle and mission requirements.  There is a
heck of a difference between a commercial geo-stationary comms
satellite and Gravity Probe B and Gaia.

One can always say "put it in XML and hope for the best" but
that's not much of a standard, is it?


-- 
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
p...@freebsd.org | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.



Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Chris Howard


Seems like a lot of unknowns.  You would have to
have sensors monitoring the sensors.

Do you lose too much by just maintaining a lifetime worst-case number, or
maybe some kind of probability function?



On 9/23/2011 3:45 PM, Poul-Henning Kamp wrote:


And then it gets nasty:
Vehicle Thermal balance model
a function of:
Vehicle configuration
Vehicle orientation
Nearby objects (sun, planets, moon)
Wavelength

Clock model:
a function of:
vehicle temperature,
bus voltage
gravity
magnetic fields from craft
vibration (micrometeorites, think: Hipparcos)
clock age
random clock internal events







Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Chris Albertson
On Fri, Sep 23, 2011 at 12:32 PM, Jim Lux jim...@earthlink.net wrote:
 On 9/23/11 10:50 AM, Chris Albertson wrote:

 Yes, in the general case, but in the spacecraft case, I think we're more
 concerned about smoothness and such over time spans of days, maybe weeks and
 months.

 More about establishing time correlation between multiple radios/spacecraft
 in a constellation, for instance.

I think it is better to have your system be usable in the real world
and then have the spacecraft use real-world standards when it can.   If
it can handle the full general case then it will work in the spacecraft
too.   So Chinese lunar calendars are a good mental exercise.  At any
rate, a piecewise 2nd-order polynomial will work in all cases I can
think of, because you can always make the pieces really small if need
be, to the point where it becomes a table lookup.

Spacecraft spend a fair amount of time on the ground in testing.
People swap out parts. I work in telemetry and you should see the
database of tens of thousands of polynomial functions that must be
used to process data from, say, a Delta IV.  It's not only clocks but
dozens of sensors that get changed out in the months preceding launch.



Chris Albertson
Redondo Beach, California



Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Jim Lux

On 9/23/11 1:45 PM, Poul-Henning Kamp wrote:

In message4e7cdeb0.8070...@earthlink.net, Jim Lux writes:


Actually, the really annoying one is where I have a good clock that's
stable, but I need to keep adjusting time to match someone else's
terrible clock.  Most clock disciplining/time propagation models assume
your bad clock is following a better clock.


That is exactly what happens when you put an OCXO or Rb in a computer
and run NTPD against a server across the internet :-)


I still have a hard time drawing a boundary about this next level up,
and maybe I'm misunderstanding you, so let me think out loud for
a moment:


It's pretty obvious that you can build a suitably general mathematical
model that will cover anything you can expect to encounter:

A polynomial of a dozen degrees will catch any PLL-like regulation
pretty well; add a Fourier series for periodic terms like your
temperature variations, and finally chop it into segments to
correctly deal with discontinuities from power failures or
upsets.

But isn't that so general that it becomes meaningless?



Maybe, but not necessarily, and if you were to establish such a general 
form for converting timecount (clock) into time what would be a 
reasonable number of terms to limit it to?


Maybe I can find my way through by considering the discontinuity 
problem.  At some level, one likes to have time be continuous (i.e. 
continuous up to some order of derivative).  You'd also like to be able 
to compare two sets of data (derived from different clocks, but 
converted to a common time scale), so the clock-to-time transformation 
should make that possible at some level of granularity and continuity.


Likewise, you'd like to be able to schedule an event to occur at two 
places (with different underlying clocks) at some time in the future, 
so the transformation from time to "clock value when X needs to happen" 
should be possible.  Again, discontinuities would raise problems (the 
daylight saving time problem of having two 1:45 AMs or no 2:30 AM).


So, it's not necessarily that one needs an arbitrary number of 
polynomial terms, but maybe a way to seamlessly blend segments with 
smaller numbers of terms (the cubic spline idea), and then some 
consistent method for describing it.
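
A sketch of that blending idea (invented names, standard cubic Hermite 
basis): a cubic segment between two knots can match both the reported 
time and the rate of the neighbouring segments at each end, so time and 
rate stay continuous with only four pieces of information per knot pair.

    /* Cubic Hermite segment between two knots: matches reported time
     * (t0, t1) and rate (s0, s1 in seconds per count) at both ends, so
     * the blend is continuous in time and in rate. */
    static double hermite_time(double c,              /* raw count          */
                               double c0, double c1,  /* knot counts        */
                               double t0, double t1,  /* times at the knots */
                               double s0, double s1)  /* dt/dc at the knots */
    {
        double h = c1 - c0;
        double u = (c - c0) / h;                /* 0..1 across the segment */
        double h00 = (1.0 + 2.0*u) * (1.0-u) * (1.0-u);
        double h10 = u * (1.0-u) * (1.0-u);
        double h01 = u * u * (3.0 - 2.0*u);
        double h11 = u * u * (u - 1.0);

        return h00*t0 + h10*h*s0 + h01*t1 + h11*h*s1;
    }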







Determining two or three dozen Finagle constants doesn't sound like
anything close to real-time to me, and it all hinges crucially
on the mathematical model being expressive enough.


Exactly.. I think the uncertainty in those high order terms might be 
meaningless.


But maybe one could think in terms of a hierarchical scheme..

A high level measurer/predictor that cranks out the current low order 
polynomial model based on whatever it decides to use (e.g. temperature, 
phase of the moon, rainfall)


The scope of the time problem is in defining how one converts from raw 
clock (counts of an oscillator) to time (with a scale and epoch), but 
not how one might come up with the parameters for that conversion. 
(that's in the clock modeling domain)


Likewise.. a synchronization scheme (e.g. NTP) is really an estimation 
problem, based on measurements and observations, and producing the 
transformation.  The mechanics of how one comes up with the parameters 
is out of scope for the architecture, just that such a function can exist.







Something like the SOHO unthaw would be a really tough
challenge to model I think.

The opposite approach is to accept that clock-modelling is not the
standardized operation, but that representing the data to feed into the
clock-modelling software should be a standard format, to facilitate
model reuse.


Exactly.  The data feeding into the clock modeling process should be 
raw clock and time (e.g. if you get time hacks from an outside source, 
to match them against your clock, you either need to convert clock into 
the external time scale, or convert the external time scale into your 
internal clock scale).


And (as you indicated below) a whole raft of other speculative inputs to 
the clock modeling (out of scope for the architecture..)


The output would be some revised description of how to convert clock 
into time






Some of that data is pretty obvious:
Time series of clock offset estimates:
When
Which other clock
Uncertainty of other clock
Measured Offset
Uncertainty of Measured Offset
Craft orbital params
XYZT, model gets to figure out what is nearby ?
or
Parametric model (in orbit about, ascending node...)

And then it gets nasty:
Vehicle Thermal balance model
a function of:
Vehicle configuration
Vehicle orientation
Nearby objects (sun, planets, moon)
Wavelength

Clock model:
a function of:
vehicle temperature,
bus voltage
gravity
magnetic 

Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Jim Lux

On 9/23/11 2:00 PM, Chris Howard wrote:


Seems like a lot of unknowns. You would have to
have sensors monitoring the sensors.


I think the clock model (insofar as variations in the oscillator are 
concerned) is outside the scope, as long as the effect of that variation 
can be represented cleanly.


For example, with a simple 2 term linear model t = clock/rate + offset, 
you can describe the *effect* of a rate, and if the rate changes, the 
model changes.  As long as you keep track of the rates and offsets 
you've used in the past, you can reconstruct what clock was for any t 
or vice versa.
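
A minimal sketch of that bookkeeping in C (the record layout is invented 
for illustration): each log entry records the rate and offset in force 
from a given raw count onward, so both directions of the transformation 
can be reconstructed for any past epoch.

    #include <stdint.h>
    #include <stddef.h>

    /* One entry in the transformation log: from raw count start_clock
     * onward (until the next entry), time = clock / rate + offset. */
    struct xform_seg {
        uint64_t start_clock;   /* raw count where this segment begins */
        double   rate;          /* counts per second                   */
        double   offset;        /* seconds added after scaling         */
    };

    /* Forward: raw clock count -> time, using the segment in force then. */
    static double clock_to_time(const struct xform_seg *log, size_t n,
                                uint64_t clock)
    {
        size_t i = 0;

        while (i + 1 < n && log[i + 1].start_clock <= clock)
            i++;
        return (double)clock / log[i].rate + log[i].offset;
    }

    /* Inverse: time -> raw clock count, by inverting the same segment. */
    static uint64_t time_to_clock(const struct xform_seg *log, size_t n,
                                  double t)
    {
        size_t i = 0;

        while (i + 1 < n &&
               (double)log[i + 1].start_clock / log[i + 1].rate
                   + log[i + 1].offset <= t)
            i++;
        return (uint64_t)((t - log[i].offset) * log[i].rate);
    }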



A clock model predictor might use all those factors to better estimate 
the rate.  Having a high order polynomial model might let you not need 
to update the model parameters as often.  That's a tradeoff the user 
could make: Do I use a 2 or 3 term clock to time transformation, and 
update it once a minute, or do I use a 20 term transformation, and 
update it once a month.








Do you lose too much by just maintaining a lifetime worst-case number, or
maybe some kind of probability function?



Certainly one cannot do a worst-case number.  Consider that you have two 
endpoints that need to be synchronized within 1 millisecond.  This 
requires that the clocks at each end have known rate/offset to an 
accuracy of around 1 ppm over a 1000-second time span (1 ms / 1000 s = 
1e-6).  Assuming that you have some magic means to measure this, you'd 
like to have a standard way to describe the rate and offset (so that you 
don't have as many formats as you do endpoints).



OK, so if you wanted an output from your Time API that gave you an 
estimated uncertainty of time (think of the accuracy estimates from 
GPS receivers), what would that look like?


Do you give a 1 sigma number?  What about bounding values?  (e.g. the 
API returns the time is 12:21:03.6315, standard deviation of 1.5 
millisecond, but guaranteed in the range 12:21:03 to 12:21:04)


I would expect that a fancy implementation might return different 
uncertainties for different times in the future (e.g. I might say that I 
can schedule something with an accuracy of 1 millisecond in the next 10 
minutes, but only within 30 milliseconds when it's 24 hours away)


The mechanics of how one might come up with this uncertainty estimate 
are out of scope, but the semantics and format of how one reports it are 
in scope for the architecture.




Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Jim Lux

On 9/23/11 2:24 PM, Chris Albertson wrote:

On Fri, Sep 23, 2011 at 12:32 PM, Jim Luxjim...@earthlink.net  wrote:

On 9/23/11 10:50 AM, Chris Albertson wrote:



Yes, in the general case, but in the spacecraft case, I think we're more
concerned about smoothness and such over time spans of days, maybe weeks and
months.

More about establishing time correlation between multiple radios/spacecraft
in a constellation, for instance.


I think it is better to have your system be usable in the real world
and then have the spacecraft use real-world standards when it can.   If
it can handle the full general case then it will work in the spacecraft
too.   So Chinese lunar calendars are a good mental exercise.  At any
rate, a piecewise 2nd-order polynomial will work in all cases I can
think of, because you can always make the pieces really small if need
be, to the point where it becomes a table lookup.


hmm.. 2nd order for time, or 2nd order for rate (3rd order for time). 
I keep thinking it would be nice to have the derivative of rate be 
continuous (although I confess I can't think of anything beyond gut feel 
for that).  Maybe for all the common cases that's sufficient for a 
prediction into the future over a reasonable time







Spacecraft spend a fair amount of time on the ground in testing.
People swap out parts. I work in telemetry and you should see the
database of tens of thousands of polynomial functions that must be
used to process data from, say, a Delta IV.  It's not only clocks but
dozens of sensors that get changed out in the months preceding launch.



Yes.. And there's no standard form that I've been able to discern for 
how those polynomials are specified.  It's 
vehicle/spacecraft/instrument/software tool specific.


So if you're writing a program to handle it automatically, you need to 
code up something special each time.  These days, we get telemetry 
calibration in forms like .pdf files generated from a word document, 
plots from Matlab, Excel spreadsheets in some unique form, various and 
sundry import/export files from whatever program they're using to 
process telemetry, and so forth.  There was an effort a few years back 
to try and standardize mission data systems but I don't know that it 
ever really worked.  The cost to write those custom ingest routines is 
small in the context of a $150M mission every couple or three years.


(maybe there is a standard for this.. I know there is an IEEE standard 
for sensor calibration data.. I should take a look at it again.)






Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Chris Albertson
 Yes.. And there's no standard form that I've been able to discern for how
 those polynomials are specified.  It's
 vehicle/spacecraft/instrument/software tool specific.

 So if you're writing a program to handle it automatically, you need to code
 up something special each time.  These days, we get telemetry calibration in
 forms like .pdf files generated from a word document, plots from Matlab,
 Excel spreadsheets in some unique form, various and sundry import/export
 files from whatever program they're using to process telemetry, and so


There is a standard for this, but it's horrible and no one implements
it completely or perfectly:
http://www.wsmr.army.mil/RCCsite/Documents/124-11_TMATS%20Handbook/124-11%20TMATS%20Handbook.pdf

Some of us have a better system than passing spreadsheets or PDFs.
I'm seeing consolidation toward a type of XML-based system.

Chris Albertson
Redondo Beach, California



Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Poul-Henning Kamp
In message 4e7d0353.2040...@earthlink.net, Jim Lux writes:

But as we move towards constellations of spacecraft with LONG light time 
to earth, that whole time correlation process needs to be done 
autonomously.  So the process of converting local count to time in 
some universally agreed scale and back has to be done locally.

Doesn't GR sort of make "universally agreed scale" a pretty interesting
concept?

But more importantly, have you done any estimates of the precision/
required input ratio for this ?

I would seriously look into broadcasting a usable time signal to
the constellation of vehicles, to use as common reference, rather
than have each of them attempt dead reckoning of their own clock
to a paper timescale, which quickly runs into sensor input limitations.

By broadcast I don't mean you have to build an antenna tower, there
are plenty of suitable signals out there already.

Presumably they are going to point an antenna back at Earth; adding
a small Newtonian telescope with a long-IR sensor next to it should
give you a signal with an interestingly complex but mostly periodic
waveform, which the vehicles in the constellation can use as a
conductor's baton.  Other candidates are Jupiter's moons (always
a favourite), pulsars (probably needs too-big antennae?), GRBs, etc.

A pox on "put it in XML"..  As far as I'm concerned that's no better 
than saying "put it in an ASCII text file".

Well, it's easier to deal with newlines in strings in XML, but otherwise
I fully agree :-)

-- 
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
p...@freebsd.org | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.



Re: [time-nuts] seeking a time/clock software architecture

2011-09-23 Thread Jim Lux

On 9/23/11 4:01 PM, Poul-Henning Kamp wrote:

In message4e7d0353.2040...@earthlink.net, Jim Lux writes:


But as we move towards constellations of spacecraft with LONG light time
to earth, that whole time correlation process needs to be done
autonomously.  So the process of converting local count to time in
some universally agreed scale and back has to be done locally.


Doesn't GR sort of make "universally agreed scale" a pretty interesting
concept?

But more importantly, have you done any estimates of the precision/
required input ratio for this ?


It would be nice to be able to synchronize events between spacecraft to 
a few milliseconds over a time span of a day, without needing a special 
time signal during that time.  Existing clocks, with very simple clock 
models can get this level of precision without too much trouble.  The 
trick is smoothly adjusting when we DO get a fix.


Within a given spacecraft, where there's no explicit need for tight time 
sync driven by the instrument (e.g. if you're building an 
interferometer, some casual time scheme probably isn't going to get you 
there), microseconds over a time scale of seconds is probably good 
enough, i.e. distributing a 1 PPS to everybody.  This is comparable to 
conventional laboratory practice with IRIG time code or 1 PPS.






I would seriously look into broadcasting a usable time signal to
the constellation of vehicles, to use as common reference, rather
than have each of them attempt dead reckoning of their own clock
to a paper timescale, which quickly runs into sensor input limitations.


Yeah but that's the way we've been doing it for the last 50 years, so 
everyone is familiar with it.  This newfangled thing of navigation 
constellations and broadcasting time references is just hard when you've 
got dedicated stovepipes to each spacecraft, each with their own message 
formats and time scales. Interoperate? Why should I spend my precious 
budget on helping YOU out? Buy your own darned USO in your own budget if 
you need good timing.

grin





By broadcast I don't mean you have to build an antenna tower, there
are plenty of suitable signals out there already.


Actually not. Consider the back side of the moon, or Mars, or Jupiter. 
In earth orbit, lots of sources (GPS, which is actually usable at the 
moon too, after a fashion)





Presumably they are going to point an antenna back at Earth; adding
a small Newtonian telescope with a long-IR sensor next to it should
give you a signal with an interestingly complex but mostly periodic
waveform, which the vehicles in the constellation can use as a
conductor's baton.  Other candidates are Jupiter's moons (always
a favourite), pulsars (probably needs too-big antennae?), GRBs, etc.


Adding an optical anything is a tough sell (mass, power, complexity, 
alien to people used to RF, etc.).  And in any case, if you're earth 
pointed for your comm link, then the signal from Earth can provide sync 
(it's how we do it now, granted in an ad hoc way).


X-ray Pulsars are the distant future approach (X-NAV) but we're waiting 
for someone to make a suitable sensor.


Jupiter's moons.. I heard a story at work that you can use an iPhone 
camera to see the moons a'la Galileo and hence can do time transfer by 
Newton's methods. (haven't tried it myself.. Jupiter isn't visible 
because of weather (night and morning low clouds and fog, which will be 
familiar to anyone in Southern California))



The general time transfer problem is to have a good clock on an orbiter 
(e.g. a relay orbiter around Mars like MRO or future s/c) and then be 
able to transfer time using that clock to a lander (e.g. a Mars rover) 
over a UHF link.  There's no direct path from Earth to the rover, and, 
in fact, it doesn't have an antenna big enough.  You might only be 
communicating with the orbiter once a week (or every few days).


So you get the time on the orbiter lined up with Earth time (TAI, 
typically) and then use the stable clock on the orbiter to transfer that 
time to the lander.


A practical application for this is something like Mars Sample Return, 
where you want to launch a rocket with your 500 g of precious Martian 
rocks and regolith into Mars orbit, where an orbiter can rendezvous with 
it and transfer the samples to a spacecraft that can send them back to 
Earth.   You need to know where the orbiter is (fairly straightforward), 
where the rocket launch site is (not quite as straightforward), and 
figure out when to push the button for the rocket.  The orbiter might 
not be in view of the rocket at launch time, and neither might be in 
view of Earth, so you can't do straight remote control; it's all done 
by canned sequence.


Since the orbiter is zooming along at a few km/second, a one second 
difference in launch time is a few km miss distance (which, in truth, 
isn't a huge deal.. we've already got to account for the tens of km 
uncertainty in the rocket's trajectory)


Since mass on Martian surface is very