Hi Tom,
On 10/07/2016 06:02 PM, Tom Van Baak wrote:
To expand on the replies by Bob and Magnus...
Many years ago, after pForth was discovered inside the entire HP 585xx and
Z38xx series of "SmartClock" GPSDOs, Magnus and I worked on the mystery of how
the 58503A GPSDO worked so well. HP appears to use a 64-entry circular buffer
to record hourly EFC history. Given 64 hours (2.7 days) of data, a GPSDO can
make a reasonable prediction of GPS reception or OCXO frequency stability (via
ADEV-like statistics) and frequency drift (via linear or quadratic LSQ fits).
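As a concrete handle on the "ADEV-like statistics" part, a plain Allan
deviation at tau equal to one sample interval is only a few lines of C. How
the firmware actually maps the hourly EFC history into such a statistic is a
guess on my part; this is just the textbook estimator applied to a series of
(fractional-frequency-like) samples y[i]:

#include <math.h>
#include <stddef.h>

/* Allan deviation at tau = one sample interval:
 * sigma_y(tau)^2 = 1/(2*(m-1)) * sum over i of (y[i+1] - y[i])^2  */
double adev_tau0(const double *y, size_t m)
{
    double sum = 0.0;
    size_t i;

    if (m < 2)
        return 0.0;
    for (i = 0; i + 1 < m; i++) {
        double d = y[i + 1] - y[i];
        sum += d * d;
    }
    return sqrt(sum / (2.0 * (double)(m - 1)));
}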
I found the big binary dump of the flash on Tom's page, and saw that he had
already found a bunch of strings.
I've spent a lot of time disassembling and decompiling the code, identifying
libc routines, decoding the pForth, finding variables, etc.
It's a large piece of code to decipher. The decompiler tool has bugs and
crashes, and it turns out that only the older version is stable enough
to do any useful work, and of course there is no source code. Part of the
problem in deciphering the whole thing is figuring out the pSOS
routines. These hurdles aside, it's nice to see the code slowly become
clear, working it out routine by routine, variable after variable...
It would be interesting to complete it some day, but ah well.
Why 64 hours? Well, C programmers working with circular buffers like powers of
2. And from personal experience working with GPSDOs I know that high sampling
rates mostly capture noise and aren't useful, so hourly EFC data makes sense
to me. Also from experience I know that less than one day of EFC data can be
misleading. Similarly, more than a week of stale past data can be irrelevant
to a prediction one day into the future. So for all these reasons, 64 hours
seems like an adequate choice.
Now, any other number would also work, and it would not take much code to do
properly, but whenever a power of 2 is achievable it is very handy in a
binary system.
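To make the power-of-2 point concrete, here is a minimal C sketch of a
64-entry buffer for hourly EFC readings; the names (efc_hist, push_efc) are
mine and not anything recovered from the firmware. With a power-of-2 size the
wrap-around is a single AND instead of a modulo or a compare-and-reset:

/* 64-entry circular buffer for hourly EFC readings (illustrative only). */
#define EFC_HIST_LEN  64                 /* must be a power of 2 */
#define EFC_HIST_MASK (EFC_HIST_LEN - 1)

static double   efc_hist[EFC_HIST_LEN];
static unsigned efc_head;                /* total samples pushed so far */

void push_efc(double efc)
{
    efc_hist[efc_head & EFC_HIST_MASK] = efc;   /* wrap via mask */
    efc_head++;
}

/* Number of valid samples currently held (at most 64). */
unsigned efc_count(void)
{
    return efc_head < EFC_HIST_LEN ? efc_head : EFC_HIST_LEN;
}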
For least-squares estimation, a higher number of samples gives steeper
filtering of the estimated parameters. A simple estimate of the degrees of
freedom is the number of samples minus the number of estimated parameters.
Still, 64 samples should be enough to get a fair idea with a fairly good
confidence interval, so it works well enough, which is the
purpose here anyway.
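For the drift prediction itself, a quadratic fit y = a + b*t + c*t^2 over the
64 (hour, EFC) pairs only needs a handful of running sums and a 3x3 solve.
Here is a rough C sketch; whether the a, b, c that efc_rep prints in Tom's
dump below correspond to exactly this parameterization is an assumption on my
part:

#include <stddef.h>

/* Fit y = a + b*t + c*t^2 by solving the 3x3 normal equations with
 * Cramer's rule.  Returns 0 on success, -1 if the fit is degenerate. */
int fit_quadratic(const double *t, const double *y, size_t n,
                  double *a, double *b, double *c)
{
    double s0 = (double)n, s1 = 0, s2 = 0, s3 = 0, s4 = 0;
    double r0 = 0, r1 = 0, r2 = 0;
    size_t i;

    for (i = 0; i < n; i++) {
        double x = t[i], x2 = x * x;
        s1 += x;       s2 += x2;
        s3 += x2 * x;  s4 += x2 * x2;
        r0 += y[i];    r1 += x * y[i];  r2 += x2 * y[i];
    }

    /* Normal equations:
     *   | s0 s1 s2 | |a|   |r0|
     *   | s1 s2 s3 | |b| = |r1|
     *   | s2 s3 s4 | |c|   |r2|   */
    double det = s0 * (s2 * s4 - s3 * s3)
               - s1 * (s1 * s4 - s3 * s2)
               + s2 * (s1 * s3 - s2 * s2);
    if (n < 3 || det == 0.0)
        return -1;

    *a = (r0 * (s2 * s4 - s3 * s3)
        - s1 * (r1 * s4 - s3 * r2)
        + s2 * (r1 * s3 - s2 * r2)) / det;
    *b = (s0 * (r1 * s4 - r2 * s3)
        - r0 * (s1 * s4 - s3 * s2)
        + s2 * (s1 * r2 - r1 * s2)) / det;
    *c = (s0 * (s2 * r2 - s3 * r1)
        - s1 * (s1 * r2 - s2 * r1)
        + r0 * (s1 * s3 - s2 * s2)) / det;
    return 0;
}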
It would be nice to recreate more of these algorithms.
Cheers,
Magnus
Note that HP and other high-end GPSDOs provide both an FFOM (frequency figure
of merit) and a TFOM (time figure of merit) value via the SCPI interface.
There is lots more info on all these subjects scattered through the time-nuts
archives. Here's an example 58503 dump (log1348.txt):
p4th D > pr_efc
efc = 280607.843750
p4th D > pll_rep
start ptr = 7 stop_ptr = 6
max loop time = -1412584448
ffom = 0
tfom = 1.0e-06 secs
p4th D > efc_rep
65.698517 282457.3 3
66.698517 282468.8 3
67.698517 282473.8 3
68.698517 282485.2 3
69.698517 282490.1 3
70.698517 282496.9 3
7.698519 280841.3 3
8.698519 280943.2 3
9.698519 281063.8 3
10.698519 281126.8 3
11.698519 281185.4 3
12.698519 281259.0 3
13.698519 281316.7 3
14.698519 281353.4 3
15.698519 281413.1 3
16.698519 281464.9 3
17.698519 281511.9 3
18.698519 281567.6 3
19.698519 281622.8 3
20.698519 281634.8 3
21.698519 281671.7 3
22.698519 281705.8 3
23.698519 281736.4 3
24.698519 281768.2 3
25.698519 281813.6 3
26.698519 281847.9 3
27.698519 281872.4 3
28.698519 281899.0 3
29.698519 281919.0 3
30.698519 281950.0 3
31.698519 281974.3 3
32.698517 282001.1 3
33.698517 282043.5 3
34.698517 282054.2 3
35.698517 282056.2 3
36.698517 282060.2 3
37.698517 282081.5 3
38.698517 282092.2 3
39.698517 282093.2 3
40.698517 282094.1 3
41.698517 282100.7 3
42.698517 282127.8 3
43.698517 282126.1 3
44.698517 282143.3 3
45.698517 282150.0 3
46.698517 282162.9 3
47.698517 282188.4 3
48.698517 282213.4 3
49.698517 282244.7 3
50.698517 282255.4 3
51.698517 282260.3 3
52.698517 282280.5 3
53.698517 282286.6 3
54.698517 282307.0 3
55.698517 282319.3 3
56.698517 282336.2 3
57.698517 282350.4 3
58.698517 282367.3 3
59.698517 282367.2 3
60.698517 282395.8 3
61.698517 282411.4 3
62.698517 282430.3 3
63.698517 282441.6 3
64.698517 282450.1 3
a= 2.793488e+05 b= -2.535462e+00 c= 7.822419e+02
p4th D >
/tvb
----- Original Message -----
From: "Magnus Danielson" <mag...@rubidium.dyndns.org>
To: <time-nuts@febo.com>
Cc: <mag...@rubidium.se>
Sent: Thursday, October 06, 2016 3:40 PM
Subject: Re: [time-nuts] Measure GPSDO stability with minimum resources?
Hi,
On 10/06/2016 08:38 PM, Bob Camp wrote:
Hi
One very simple experiment:
Take an HP unit that has been off power for a year or so. Fire it up and watch
its predictions of holdover accuracy. Many of them will go through a “zero”
time estimate at one or two days. At three or four days they are struggling to
hit spec (10 µs). The reason is pretty simple: the OCXO warmed up and went
through an inflection (reversal in direction). They estimated across the
inflection, got zero, and passed that on ….
Indeed. The Z3801A does a least-square fit and then tries to maintain
that. If done at the wrong time it will be wildly off. I don't remember
the details, but I think I recall that you can trigger the
re-calibration routine which is what you want to do to drive it in the
right direction.
Least-square fitting isn't all that magic and doesn't really require
lots of memory, if you do it properly. You just need the oscillator to
heat up and settle before you attempt to do anything involving long
time constants. Usually it's not the core algorithms but the heuristics
that need to work well.
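To illustrate how little memory it takes if done properly, here is a sketch
(my construction, not anything taken from the Z3801A code) of a linear fit
y = y0 + d*t kept purely as running sums, so the memory footprint stays
constant no matter how many samples are folded in:

/* Accumulator for a constant-memory linear least-squares fit. */
struct lsq_acc {
    double n, st, stt, sy, sty;    /* count and running sums */
};

void lsq_add(struct lsq_acc *a, double t, double y)
{
    a->n  += 1.0;
    a->st += t;   a->stt += t * t;
    a->sy += y;   a->sty += t * y;
}

/* Returns 0 and fills intercept/slope, or -1 if the fit is degenerate. */
int lsq_solve(const struct lsq_acc *a, double *intercept, double *slope)
{
    double det = a->n * a->stt - a->st * a->st;

    if (a->n < 2.0 || det == 0.0)
        return -1;
    *slope     = (a->n * a->sty - a->st * a->sy) / det;
    *intercept = (a->sy - *slope * a->st) / a->n;
    return 0;
}

Start from a zeroed accumulator (struct lsq_acc acc = {0};), feed it one
(time, EFC) pair per hour with lsq_add(), and read the prediction out of
lsq_solve() whenever needed.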
Cheers,
Magnus
_______________________________________________
time-nuts mailing list -- time-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.