[time-nuts] A tale of two NTP servers

2021-04-25 Thread djl
A while back, I decided to try putting together a small, GPS-based local NTP 
server. I have several computers on a LAN and wanted to serve them all 
with a locally controlled device. A bit of research ("raspberry pi gps 
ntp server" in my search engine) came up with several instructive 
sites. Seemed easy. Is there a kit, so my current theory of buying the 
biggest piece possible can be carried out? Further consultation of YouTube 
came up with:


https://www.youtube.com/watch?v=VrDRAVy_bg4_channel=JohnMiller

I saw something from John on the time-nuts forum in the past, so I 
contacted him about the hat he had for sale. The YouTube presentation was 
so clear and complete that I bought one of his hats with the 
accompanying SD card. So I bought in. I put the hat on a Pi B I had on hand, 
put the antenna outside my window, and inserted the SD card as 
instructed. Before the video was completely finished, the server was 
running and connected to a Linux machine I had running nearby. As 
advertised. In detail. No fuss, no muss, and it's still going strong 
after two months and two power outages. WELL DONE JOHN!!!


But what about a ready-to-run little server? Well, there are several 
around. BUT, they're kinda expensive compared to the RPi setups in 
various forms. I'm sort of a sucker for the cheap (er, inexpensive) 
Chinese modules for various uses available on eBay or Alibaba or other 
sources. Apparently, a factory somewhere in Guangdong "designs" and 
makes a bunch of little modules to do something like measure voltages, 
generate frequencies, etc. These modules are then bought by the 
relatives of the maker and sold through various outlets for various prices. The 
problem with these modules is that there is NEVER any documentation. 
You're on your own to use all your wiles, including reverse engineering, 
perusal of ham radio sources, eval info from manufacturers of the ripped 
chip designs, etc. Despite that, for me at least, the modules have been 
cost effective. Always buy at least two.
With that in mind, I hit eBay and came up with 363361419214 as an 
example. Seemed a bit expensive, but supposed to be plug and play, with 
antenna and power. So, I ordered one up and hit the documentation trail. 
Nothing! So I contacted the seller via eBay, and, after a brief hiatus 
for Chinese New Year, got what there is for documentation. Mostly in 
Chinese. But I was able to find the address of the device using the Angry 
IP Scanner (you do use that, no? A great piece of software!) 
(192.168.0.100, btw). And that worked too. It's still working. Removing 
its pants revealed a u-blox and a 32-bit processor, with some glue 
chips. Real simple.
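
For anyone wanting to sanity-check such a box before trusting it, a minimal
SNTP query in C is enough. The sketch below is illustrative only: the
192.168.0.100 address comes from the note above, and the rest is a plain
RFC 4330-style client, nothing specific to this particular module.

/* Minimal SNTP sanity check for a LAN time server (illustrative sketch). */
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <time.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <sys/socket.h>
#include <sys/time.h>

#define NTP_UNIX_OFFSET 2208988800UL   /* seconds between 1900 and 1970 */

int main(void)
{
    unsigned char pkt[48] = {0};
    pkt[0] = 0x1B;                     /* LI=0, VN=3, Mode=3 (client)   */

    int s = socket(AF_INET, SOCK_DGRAM, 0);
    if (s < 0) { perror("socket"); return 1; }

    struct sockaddr_in srv = {0};
    srv.sin_family = AF_INET;
    srv.sin_port   = htons(123);
    inet_pton(AF_INET, "192.168.0.100", &srv.sin_addr);  /* the eBay box */

    struct timeval tmo = { .tv_sec = 2 };                /* 2 s timeout  */
    setsockopt(s, SOL_SOCKET, SO_RCVTIMEO, &tmo, sizeof tmo);

    if (sendto(s, pkt, sizeof pkt, 0, (struct sockaddr *)&srv, sizeof srv) < 0 ||
        recv(s, pkt, sizeof pkt, 0) < 48) {
        perror("ntp query");
        close(s);
        return 1;
    }

    /* Transmit timestamp: 32-bit big-endian seconds at byte offset 40. */
    uint32_t secs;
    memcpy(&secs, pkt + 40, sizeof secs);
    time_t t = (time_t)(ntohl(secs) - NTP_UNIX_OFFSET);
    printf("server says: %s", ctime(&t));
    close(s);
    return 0;
}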


One further note: I'm using one antenna for the two devices, courtesy of 
SV1AFN (https://www.sv1afn.com/en/gnss-gps/-7.html), who makes a bunch of 
good stuff.


So, it can be done. As for me, I would get the Pi hat and go that way, 
simply because it is not a black box, and other things are possible 
using that setup. On the other hand, for less cost, the eBay module, 
once its secrets have been pried out, is totally simple to implement.


I have not tried to compare these units in any way, as there are those 
of you who are far more experienced than I am. All I know is, these 
approaches worked.


As usual, YMMV
73, Don

--

The whole world is a straight man.
--
Dr. Don Latham  AJ7LL
PO Box 404, Frenchtown, MT, 59834
VOX: 406-626-4304


[time-nuts] Re: 20210423: Introduction and Request for Spirent GSS4200 User Manual / Help

2021-04-25 Thread Andrew Kalman
(Egg on face -- I meant to say exaHz-class processor, not attoHz :-)  )

--Andrew


Andrew E. Kalman, Ph.D.



[time-nuts] Re: 20210423: Introduction and Request for Spirent GSS4200 User Manual / Help

2021-04-25 Thread Andrew Kalman
Just a comment on real-time programming.

Obviously the accuracy of the real-time performance is a function of the
hardware/software system's clock speed -- if you have an attoHz-class
microprocessor (MCU), most things are going to look real-time to most
people (except time nuts, of course :-) ) no matter how they are coded ...

In the real world, where clocks are usually in the MHz range, you can
achieve the real-time performances of your hardware (say, of a 32-bit
output compare timer peripheral or a 12-bit DAC) as long as the overlaid
software does not interfere with the peripheral's proper functioning. With
a "set-and-forget" peripheral that is easy. With one that e.g. needs to be
"fed" by the overlaid software (say, a DAC that is fed by DMA from data
coming from outside), that is a bit more difficult. And if some of that
real-time performance needs to happen in a task, then you really need a
couple of software architectural features to get you there.

As an example, an MCU-based system that uses queues to manage priority is
not going to achieve real-time task performance, because e.g. the time to
enqueue a task will not be constant-time, and it will thereby introduce
timing jitter into the execution of all tasks. If, OTOH, in this example
the queuing mechanism were implemented via a hardware timer and ISR that
handles queueing via an array (so that the queuing algorithm is
constant-time), AND that ISR is elevated to the highest priority, then the
task jitter in that system will be rather minimal, subject only to time
deltas due to the system serving any other interrupts (regardless of
interrupt preemption, etc). I.e. if the MCU is serving say a UART interrupt
when the "main" timing interrupt occurs, either the system has to (HW)
context-switch out of the UART interrupt to go service the timer ISR (and
(SW) context-switch), OR it has to wait for the UART ISR to end, before
servicing the timer interrupt. In the former, jitter will be a function of
the MCU's interrupt handling; in the latter, it'll be a function of your
code.
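
A minimal sketch of that arrangement (illustrative only: the fixed task
table, the 1 kHz tick and all the names below are assumptions, not anything
from the post). The highest-priority timer ISR scans a fixed-size array of
task slots, so the work per tick is bounded and identical, and task release
times pick up essentially no queueing jitter:

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define MAX_TASKS 8

typedef void (*task_fn)(void);

/* One slot per task: a fixed array, so scanning it is constant-time. */
typedef struct {
    task_fn  fn;         /* task body, run by the background dispatcher */
    uint32_t period_ms;  /* release period                              */
    uint32_t countdown;  /* ms until next release                       */
    volatile bool ready; /* set by the ISR, cleared by the dispatcher   */
} task_slot;

static task_slot tasks[MAX_TASKS];

/* Highest-priority 1 kHz timer "ISR": bounded, identical work every tick. */
static void timer_tick_1khz(void)
{
    for (int i = 0; i < MAX_TASKS; i++) {
        if (tasks[i].fn && --tasks[i].countdown == 0) {
            tasks[i].countdown = tasks[i].period_ms;
            tasks[i].ready = true;              /* constant-time "enqueue" */
        }
    }
}

/* Background dispatcher: slot order doubles as priority order. */
static void dispatch_once(void)
{
    for (int i = 0; i < MAX_TASKS; i++) {
        if (tasks[i].ready) {
            tasks[i].ready = false;
            tasks[i].fn();
            return;                             /* restart at highest prio */
        }
    }
}

static void blink(void) { puts("blink task released"); }

int main(void)                                  /* host-side demo only */
{
    tasks[0].fn = blink;
    tasks[0].period_ms = 2;
    tasks[0].countdown = 2;
    for (int t = 0; t < 5; t++) {               /* pretend 5 ms elapse */
        timer_tick_1khz();
        dispatch_once();
    }
    return 0;
}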

While hardware preemption is always your friend, software preemption (in
the general case) is not a panacea. For example, in a preemptive RTOS,
every context switch is going to involve an interrupt. That "extra"
interrupt will affect the responsiveness and jitter of your other
interrupts. A cooperative RTOS does not involve any interrupts when
context-switching, and so (from one of several perspectives), a cooperative
RTOS (that in theory is not as responsive as a preemptive one) may in fact
yield certain better real-time performances than a preemptive RTOS.

So, bottom-line rule of thumb: if you want to get minimal timing jitter out
of an MCU application, it is possible to run an RTOS (of any sort, most are
soft real-time, very very few are hard real-time) on that MCU, and with
careful attention to how you architect the system split between hardware
peripherals and firmware on the MCU, you can get to the exact hardware
timing specifications of the MCU itself. IMO the use of an RTOS makes a ton
of sense here, because you can e.g. implement a complete GUI, comms system,
terminal or other as part of the MCU application, safe in the knowledge
that all this "RTOS overhead" has _zero_ impact on the real-time
performance you hoped to get out of the MCU. This does require careful
coding in certain areas ...

I am a huge proponent of loosely-coupled, priority-based, event-driven MCU
programming. Assuming it's coded well, it is not incompatible with
nearly-real-time programming. For high-performance MCU programming, I
generally follow these rules:

   - For 1-to-100-instruction-cycle timing accuracy, use straight-line
   (uninterrupted) instructions (Assembly, or C if your compiler is good)
   while interrupts are disabled. Here, your jitter is your fundamental clock
   jitter as it passes through the MCU.
   - For 100-to-1000-instruction-cycle timing accuracy, code it using an
   interrupt. Here, your jitter is dependent on the interrupt handling of the
   MCU.
   - For greater than 1000-instruction-cycle timing accuracy, hand it over
   to the (properly configured) RTOS. Here, your jitter is dependent on how
   the foreground (interrupt-level) vs. background (task-level) code operates
   in your application.
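
As one concrete reading of the first rule (straight-line code with interrupts
masked), here is an illustrative fragment. It assumes an ARM Cortex-M target
built with GCC, and the GPIO register addresses are placeholders, not any real
part's map:

#include <stdint.h>

/* Placeholder memory-mapped GPIO set/clear registers: substitute the real
 * addresses for your part; these are made up for illustration. */
#define GPIO_SET (*(volatile uint32_t *)0x40020018u)
#define GPIO_CLR (*(volatile uint32_t *)0x4002001Cu)
#define PIN_MASK (1u << 5)

/* Emit a pulse whose width is a fixed number of instruction cycles.
 * With interrupts masked, the remaining jitter is the clock's own as it
 * passes through the MCU (rule 1 above). */
void pulse_exact(void)
{
    __asm volatile ("cpsid i" ::: "memory");   /* mask interrupts (Cortex-M) */
    GPIO_SET = PIN_MASK;                       /* rising edge                */
    __asm volatile ("nop\n nop\n nop\n nop");  /* fixed, known cycle count   */
    GPIO_CLR = PIN_MASK;                       /* falling edge               */
    __asm volatile ("cpsie i" ::: "memory");   /* unmask interrupts          */
}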


--Andrew


Andrew E. Kalman, Ph.D.


On Sun, Apr 25, 2021 at 7:14 AM Lux, Jim  wrote:

> On 4/25/21 6:40 AM, Bob kb8tq wrote:
> > Hi
> >
> >
> >> On Apr 25, 2021, at 9:31 AM, Lux, Jim  wrote:
> >>
> >> On 4/25/21 6:02 AM, Bob kb8tq wrote:
> >>> Hi
> >>>
> >>> The thing that I find useful about a GPS simulator is it’s ability to
> calibrate the
> >>> time delay through a GPS based system. In the case of a GPSDO, there
> may be
> >>> things beyond the simple receiver delay that get into the mix. Getting
> the entire
> >>> offset “picture” all at once is nice thing. Yes, that’s a Time Nutty
> way to look at it…..
> >>>
> >>> So far, I have not seen 

[time-nuts] Re: Small NTP appliance

2021-04-25 Thread Bob kb8tq
Hi

If you are doing this on a micro, handling the network stack is *probably* not
something you will do from scratch. Often bringing in a network stack brings 
in an RTOS with it. You may well get some fancy time code “for free” with 
the RTOS. 

Once upon a time this RTOS / network stack stuff cost you money. That’s 
becoming less and less true as time marches on.  Free makes it a *lot* easier
to drop into a project :) 

Bob


[time-nuts] Re: Small NTP appliance

2021-04-25 Thread Hal Murray


folk...@vanheusden.com said:
>> Most modern kernels have a side door used by ntpd to adjust the clock 
>> frequency.  Typical values are a few 10s of PPM and it's easy to measure down 
>> to 0.001 PPM or better.  The NTP world calls that drift.  If you have a PC or 
>> Raspberry Pi running Linux or *BSD and ntpd you can find the drift.

> Hmmm, I believe Eamonn is going to use a microcontroller. They usually don't
> run linux or something like that. 

You can still use the same trick.  You may have to rewrite the clock code.

If the clock code does something like
  time = time +1
where time is in ms and gets bumped from a 1000Hz interrupt, then you have to 
change that "1" to be the carry out of a slot with enough precision.  If you 
want to correct to 1 PPM, that's 1E6.  So the code becomes:
  calibrate = 1000000;
  partial = partial + calibrate;
  while (partial >= 1000000) {
    time += 1;
    partial -= 1000000;
  }

Now fudge calibrate to make your clock run slower or faster.

The trick is that you have to scale things such that your equivalent of 
partial has enough working bits to match the resolution that you want in your 
correction.

A million is 20 bits so you are probably using 32 bit arithmetic.  You might 
as well use a billion.

If you want time in ms and you are getting 1024 interrupts per second, the 
code may already do something like the above.
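
Fleshed out a little, that fractional-increment idea might look like this in C
(a host-side sketch for illustration; the 1 kHz tick, the parts-per-billion
scaling and names like calibrate/partial/time_ms are assumptions, not from the
original post):

#include <stdint.h>
#include <stdio.h>

/* One part-per-billion resolution: each 1 kHz tick nominally adds SCALE
 * "sub-ticks"; the carry out of the accumulator advances the ms counter. */
#define SCALE 1000000000UL

static volatile uint64_t time_ms;   /* milliseconds since start             */
static uint32_t partial;            /* fractional accumulator               */
static uint32_t calibrate = SCALE;  /* > SCALE runs fast, < SCALE runs slow */

/* Called from the (assumed) 1000 Hz timer interrupt. */
static void clock_tick(void)
{
    partial += calibrate;
    while (partial >= SCALE) {      /* carry out of the fractional slot     */
        time_ms += 1;
        partial -= SCALE;
    }
}

/* Trim the clock rate by 'ppb' parts per billion (signed). */
static void set_drift_ppb(int32_t ppb)
{
    calibrate = (uint32_t)((int64_t)SCALE + ppb);
}

int main(void)                      /* host-side demo of the arithmetic     */
{
    set_drift_ppb(-12500);          /* e.g. crystal measured 12.5 ppm fast  */
    for (int i = 0; i < 10000; i++) /* simulate 10 s of interrupts          */
        clock_tick();
    printf("elapsed: %llu ms\n", (unsigned long long)time_ms);
    return 0;
}

Run on a PC, this prints 9999 ms for 10,000 simulated ticks, i.e. the software
clock has been slowed by 12.5 PPM to cancel a crystal that runs fast by that
amount.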

-- 
These are my opinions.  I hate spam.




[time-nuts] Re: 20210423: Introduction and Request for Spirent GSS4200 User Manual / Help

2021-04-25 Thread Bob kb8tq
Hi

> On Apr 25, 2021, at 10:13 AM, Lux, Jim  wrote:
> 
> On 4/25/21 6:40 AM, Bob kb8tq wrote:
>> Hi
>> 
>> 
>>> On Apr 25, 2021, at 9:31 AM, Lux, Jim  wrote:
>>> 
>>> On 4/25/21 6:02 AM, Bob kb8tq wrote:
 Hi
 
The thing that I find useful about a GPS simulator is its ability to 
 calibrate the
 time delay through a GPS based system. In the case of a GPSDO, there may be
 things beyond the simple receiver delay that get into the mix. Getting the 
 entire
offset “picture” all at once is a nice thing. Yes, that’s a Time Nutty way 
 to look at it…..
 
 So far, I have not seen anybody extending this sort of calibration to the 
 low cost
 SDR based devices. Without digging into the specific device, I’m not sure 
 how
 well a “generic” calibration would do. Indeed, it might work quite well. 
 Without
 actually doing it … no way to tell.
 
 So if anybody knows of the results of such an effort, I suspect it would 
 be of
 interest to folks here on the list.
 
 Bob
>>> 
>>> A double difference kind of relative measurement might be useful - compare 
>>> two (or three) GNSS receivers.  Then the absolute timing of the test source 
>>> isn't as important.
>> Well …. it is and it isn’t. If you are trying to get “UTC in the basement” 
>> (or even
>> GPS time)  to a couple nanoseconds, then you do need to know absolute delays
>> of a number of things. Is this a bit crazy? Of course it is :)
>> 
>> Bob
>> 
> Good point..
> 
> For many SDRs, it's tricky to get the output synchronized to anything - a lot 
> were designed as an RF ADC/DAC for software SDR (like gnuradio). The software 
>  SDRs are sort of a pipeline of software, with not much attention to absolute 
> timing, just that the samples come out in the same order and rate as samples 
> go in, but with a non-deterministic delay.  Partly a side effect of using 
> things like USB or IP sockets as an interface. And, to a certain extent, 
> running under a non-real time OS (where real time determinism is "difficult 
> programming" - although clearly doable, since playing back a movie requires 
> synchronizing the audio and video streams ).
> 
> If your goal is "write software 802.11" you don't need good timing - the 
> protocol is half duplex in any case, and a millisecond here or there makes no 
> difference.
> 
> A SDR that has a FPGA component to drive the DACs might work pretty well, if 
> you can figure a way to get a sync signal into it.  One tricky thing is 
> getting the chips lined up with the carrier - most inexpensive SDRs use some 
> sort of upconverter from baseband I/Q, and even if the I/Q runs off the same 
> clock as the PLL generating the carrier, getting it synced is hard.
> 
> The best bet might be a "clock the bits out and pick an appropriate harmonic 
> with a bandpass filter".   If the FPGA clock is running at a suitable 
> multiple of 1.023 MHz, maybe this would work?  Some JPL receivers use 38.656 
> MHz as a sample rate, which puts the GPS signal at something like 3/4 of the 
> sample rate.
> I'd have to work it backwards and see if you could generate a harmonic that's 
> at 1575...
> 

I think that coming out of a “one bit DAC” on the FPGA is the most likely way 
to generate
the signal. Run that at carrier / N and away you go. I don’t think you want to 
get N too
terribly high. Having all those images running around may not be a good idea. 
Something
in the 5 to 10 range seems about right. Is there enough “harmonic” to make it 
work, or do
you need some sort of SRD-ish thing to womp it up? ….. My guess is that there 
still will
be a bunch of little delay ambiguities that pile up on you. 
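
A quick back-of-the-envelope check of the carrier/N idea (illustrative
arithmetic, not from the original post): L1 is 1575.42 MHz = 1540 x 1.023 MHz,
so this little program prints the one-bit clock each N would need and that
clock expressed in chip-rate units:

#include <stdio.h>

int main(void)
{
    const double l1_mhz   = 1575.42;   /* GPS L1 carrier                   */
    const double chip_mhz = 1.023;     /* C/A chip rate; L1 = 1540 x 1.023 */

    /* For each divisor N, the clock a one-bit output would need so that
     * its Nth harmonic lands on L1. */
    for (int n = 5; n <= 10; n++) {
        double clk = l1_mhz / n;
        printf("N=%2d  clock = %9.4f MHz  (%7.2f x 1.023 MHz)\n",
               n, clk, clk / chip_mhz);
    }
    return 0;
}

N = 5, 7 and 10 divide 1540 exactly, so those clocks (315.084, 225.06 and
157.542 MHz) stay integer multiples of the chip rate; 6, 8 and 9 do not.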

The bigger issue is: do you want to make this a “lifetime project” ? It’s only 
one of many
things that need attention if you want to get to the final goal. Spend enough 
time on each
little step and this all becomes a “many decades” sort of adventure. 

In this case I’ll let Said worry about the details and just use his gizmo :)

Bob


> 

[time-nuts] Re: 20210423: Introduction and Request for Spirent GSS4200 User Manual / Help

2021-04-25 Thread Lux, Jim

On 4/25/21 6:40 AM, Bob kb8tq wrote:

Hi



On Apr 25, 2021, at 9:31 AM, Lux, Jim  wrote:

On 4/25/21 6:02 AM, Bob kb8tq wrote:

Hi

The thing that I find useful about a GPS simulator is its ability to calibrate 
the
time delay through a GPS based system. In the case of a GPSDO, there may be
things beyond the simple receiver delay that get into the mix. Getting the 
entire
offset “picture” all at once is a nice thing. Yes, that’s a Time Nutty way to 
look at it…..

So far, I have not seen anybody extending this sort of calibration to the low 
cost
SDR based devices. Without digging into the specific device, I’m not sure how
well a “generic” calibration would do. Indeed, it might work quite well. Without
actually doing it … no way to tell.

So if anybody knows of the results of such an effort, I suspect it would be of
interest to folks here on the list.

Bob


A double difference kind of relative measurement might be useful - compare two 
(or three) GNSS receivers.  Then the absolute timing of the test source isn't 
as important.

Well …. it is and it isn’t. If you are trying to get “UTC in the basement” (or 
even
GPS time)  to a couple nanoseconds, then you do need to know absolute delays
of a number of things. Is this a bit crazy? Of course it is :)

Bob


Good point..

For many SDRs, it's tricky to get the output synchronized to anything - 
a lot were designed as an RF ADC/DAC for software SDR (like gnuradio). 
The software  SDRs are sort of a pipeline of software, with not much 
attention to absolute timing, just that the samples come out in the same 
order and rate as samples go in, but with a non-deterministic delay.  
Partly a side effect of using things like USB or IP sockets as an 
interface. And, to a certain extent, running under a non-real time OS 
(where real time determinism is "difficult programming" - although 
clearly doable, since playing back a movie requires synchronizing the 
audio and video streams ).


If your goal is "write software 802.11" you don't need good timing - the 
protocol is half duplex in any case, and a millisecond here or there 
makes no difference.


A SDR that has a FPGA component to drive the DACs might work pretty 
well, if you can figure a way to get a sync signal into it.  One tricky 
thing is getting the chips lined up with the carrier - most inexpensive 
SDRs use some sort of upconverter from baseband I/Q, and even if the I/Q 
runs off the same clock as the PLL generating the carrier, getting it 
synced is hard.


The best bet might be a "clock the bits out and pick an appropriate 
harmonic with a bandpass filter".   If the FPGA clock is running at a 
suitable multiple of 1.023 MHz, maybe this would work?  Some JPL 
receivers use 38.656 MHz as a sample rate, which puts the GPS signal at 
something like 3/4 of the sample rate.
I'd have to work it backwards and see if you could generate a harmonic 
that's at 1575...
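
Worked backwards for the 38.656 MHz case (illustrative arithmetic only): the
sketch below shows where L1 lands relative to that sample clock and how far
its nearest harmonic is from 1575.42 MHz.

/* Build with -lm for fmod()/round(). */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double fs = 38.656;     /* MHz, sample clock mentioned in the post */
    const double l1 = 1575.42;    /* MHz, GPS L1 carrier                     */

    /* Where L1 sits relative to the nearest lower multiple of fs. */
    double frac = fmod(l1, fs);
    printf("L1 = %d x fs + %.3f MHz  (%.2f of fs)\n",
           (int)(l1 / fs), frac, frac / fs);

    /* Nearest harmonic of fs to L1, and how far off it lands. */
    int n = (int)round(l1 / fs);
    printf("harmonic %d of fs = %.3f MHz, %+.3f MHz from L1\n",
           n, n * fs, n * fs - l1);
    return 0;
}

With that particular clock the aliased signal sits at about 3/4 of fs, and the
nearest harmonic (the 41st) misses L1 by roughly 9.5 MHz, so hitting 1575.42
exactly needs a clock chosen with that harmonic relationship designed in.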




[time-nuts] Re: 20210423: Introduction and Request for Spirent GSS4200 User Manual / Help

2021-04-25 Thread Bob kb8tq
Hi


> On Apr 25, 2021, at 9:31 AM, Lux, Jim  wrote:
> 
> On 4/25/21 6:02 AM, Bob kb8tq wrote:
>> Hi
>> 
>> The thing that I find useful about a GPS simulator is its ability to 
>> calibrate the
>> time delay through a GPS based system. In the case of a GPSDO, there may be
>> things beyond the simple receiver delay that get into the mix. Getting the 
>> entire
>> offset “picture” all at once is a nice thing. Yes, that’s a Time Nutty way to 
>> look at it…..
>> 
>> So far, I have not seen anybody extending this sort of calibration to the 
>> low cost
>> SDR based devices. Without digging into the specific device, I’m not sure how
>> well a “generic” calibration would do. Indeed, it might work quite well. 
>> Without
>> actually doing it … no way to tell.
>> 
>> So if anybody knows of the results of such an effort, I suspect it would be 
>> of
>> interest to folks here on the list.
>> 
>> Bob
> 
> 
> A double difference kind of relative measurement might be useful - compare 
> two (or three) GNSS receivers.  Then the absolute timing of the test source 
> isn't as important.

Well …. it is and it isn’t. If you are trying to get “UTC in the basement” (or 
even 
GPS time)  to a couple nanoseconds, then you do need to know absolute delays 
of a number of things. Is this a bit crazy? Of course it is :) 

Bob

> 

[time-nuts] Re: 20210423: Introduction and Request for Spirent GSS4200 User Manual / Help

2021-04-25 Thread Lux, Jim

On 4/25/21 6:02 AM, Bob kb8tq wrote:

Hi

The thing that I find useful about a GPS simulator is its ability to calibrate 
the
time delay through a GPS based system. In the case of a GPSDO, there may be
things beyond the simple receiver delay that get into the mix. Getting the 
entire
offset “picture” all at once is a nice thing. Yes, that’s a Time Nutty way to 
look at it…..

So far, I have not seen anybody extending this sort of calibration to the low 
cost
SDR based devices. Without digging into the specific device, I’m not sure how
well a “generic” calibration would do. Indeed, it might work quite well. Without
actually doing it … no way to tell.

So if anybody knows of the results of such an effort, I suspect it would be of
interest to folks here on the list.

Bob



A double difference kind of relative measurement might be useful - 
compare two (or three) GNSS receivers.  Then the absolute timing of the 
test source isn't as important.



[time-nuts] Re: Small NTP appliance

2021-04-25 Thread folkert
> > I have one general question. I don't believe that an internal crystal in the
> > microcontroller will have the accuracy or precision required to have better
> > than a few milliseconds of accuracy (whereas NTP likes to live in the
> > microsecond realm), though I very well could be wrong on that one.
> 
> Accuracy is not a problem.  You can measure that and correct for it.
> 
> Most modern kernels have a side door used by ntpd to adjust the clock 
> frequency.  Typical values are a few 10s of PPM and it's easy to measure down 
> to 0.001 PPM or better.  The NTP world calls that drift.  If you have a PC or 
> Raspberry Pi running Linux or *BSD and ntpd you can find the drift.

Hmmm, I believe Eamonn is going to use a microcontroller. They usually
don't run Linux or something like that.

In the distant past I tried replacing the clock crystal of an Arduino with
an OCXO. That was not successful, but that was because I realised I would
have to make changes to e.g. the bootloader and such so that things
would still have the correct bit rate on the serial pins. Hmmm, maybe if I
had just put code on it that simply output something over the serial pins
at a known baud rate, I could have checked whether the electronics part
did indeed work.
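
A minimal sketch of that check (for illustration; it assumes an
ATmega328P-class AVR and avr-gcc, and F_CPU and the 9600 baud choice are just
examples): stream a fixed byte so the bit timing on the TX pin can be compared
against the OCXO-derived clock with a scope or counter.

/* Continuously send 0x55 ("U", alternating bits) over the UART so the bit
 * rate on the TX pin can be measured and compared with the expected
 * F_CPU/(16*(UBRR+1)) value.  Adjust F_CPU to whatever the OCXO provides. */
#define F_CPU 16000000UL   /* assumed CPU clock from the OCXO */
#define BAUD  9600UL

#include <avr/io.h>

int main(void)
{
    const uint16_t ubrr = (uint16_t)(F_CPU / 16 / BAUD - 1);

    UBRR0H = (uint8_t)(ubrr >> 8);          /* set baud rate divisor          */
    UBRR0L = (uint8_t)ubrr;
    UCSR0B = (1 << TXEN0);                  /* enable transmitter only        */
    UCSR0C = (1 << UCSZ01) | (1 << UCSZ00); /* 8 data bits, no parity, 1 stop */

    for (;;) {
        while (!(UCSR0A & (1 << UDRE0)))    /* wait until TX buffer empty     */
            ;
        UDR0 = 0x55;                        /* 'U': 01010101, easy to scope   */
    }
}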


[time-nuts] Re: 20210423: Introduction and Request for Spirent GSS4200 User Manual / Help

2021-04-25 Thread Bob kb8tq
Hi

The thing that I find useful about a GPS simulator is its ability to calibrate 
the 
time delay through a GPS based system. In the case of a GPSDO, there may be
things beyond the simple receiver delay that get into the mix. Getting the 
entire
offset “picture” all at once is a nice thing. Yes, that’s a Time Nutty way to 
look at it…..

So far, I have not seen anybody extending this sort of calibration to the low 
cost
SDR based devices. Without digging into the specific device, I’m not sure how
well a “generic” calibration would do. Indeed, it might work quite well. Without
actually doing it … no way to tell. 

So if anybody knows of the results of such an effort, I suspect it would be of 
interest to folks here on the list.

Bob

> On Apr 24, 2021, at 8:43 PM, Forrest Christian (List Account) 
>  wrote:
> 
> I have used GPS-SDR-SIM with good results.
> 
> It's an open source tool that will generate the right files to be able to
> generate simulated GPS signals using many of the open source SDR platforms
> including HackRF.  It uses the publically available ephermis files along
> whith desired receiver position data to generate the "RF" output files.
> 
> My experience has been that clocking the SDR with an output from a
> disciplined source results in the 1PPS from a typical GPS receiver
> remaining at the same relative phase during the entire playback, for a
> definition of same which was good enough for my purposes.
> 
> Two notes:
> 
> This is GPS only, no other constellations.  Would love someone to write a
> similar tool for other constellations.
> 
> Several platforms are supported, some are dirt cheap.  I used HackRF
> because I already had one.   Not sure about any of the others.
> 
> On Sat, Apr 24, 2021, 11:56 AM Lux, Jim  wrote:
> 
>> On 4/24/21 10:31 AM, Andrew Kalman wrote:
>>> Hi Paul.
>>> 
>>> Yes, I've been on this same journey. After I learned (somewhat unrelated)
>>> that one is supposed to have an FCC license to rebroadcast GNSS signals
>>> (e.g. via a repeater inside a lab, makes eminent sense), I started
>> thinking
>>> more about GNSS simulators and how they might be added to my company's
>>> workflow. So I bid on a couple of units, got them for pennies on the
>>> dollar, and started messing with them in the hope of ending up with an
>>> ATE/rack-type setup that I can build into a nearly automatic test &
>>> validation suite.
>>> 
>>> Let's say I was much more successful with the Spectracom/Orolia GSG-5
>> than
>>> with the Spirent GSS4200 ... In the case of the GSG-5, it's really just a
>>> question of how many options you can afford -- the rest is all there, you
>>> don't need a support contract, it's all easily accessible in the unit
>>> itself, and as long as the Internet exists the GSG-5 will probably keep
>>> working (it gets time, ephemeris and almanac data from servers -- it can
>>> simulate stuff NOW (with the right options), not just in the past and
>>> future). The GSS4200 is about 10-15 years older, and it shows (in terms
>> of
>>> ease-of-use), along with how Spirent chose to monetize their users /
>>> subscribers. Also, the GSG-5 adds things like interference to the signals
>>> (all for a price, of course). IOW, the newer units (at least, from
>>> Spectracom, which was XL Microwave and is now Orolia) are a whole lot easier to use
>>> ... but they come at a price. It's an interesting business.
>>> 
>>> I will say that the build quality of the Spirent is very good. I have not
>>> opened up the GSG-5, just did a calibration and it was very close.
>>> 
>>> I'm a little bit surprised that there is not an open-source, SDR-based
>> GNSS
>>> simulator (at least, one I could find).
>> 
>> 
>> 
>> Not much demand, I suspect.  I seem to recall a GNSS generator that was
>> open source about 5-10 years ago, but I can't find it now.
>> 
>> The record/playback boxes are actually pretty simple - just a single bit
>> in many cases. After all, a lot of the receivers use a single bit input,
>> because the signal of interest is below the thermal noise floor.
>> 
>> The real challenge isn't the SDR part (a USRP would work just fine as
>> long as you get a daughter card that supports L-band) - it's the
>> "scenario building" which requires simulating the orbits of the GNSS
>> satellites, simulating the track of the receiver, calculating the time
>> delays (including iono and tropo effects), and generating the PN codes
>> appropriately.
>> 
>> Each of those isn't too tough, but putting it all together is quite
>> challenging, and, apparently, it's not "dissertation topic" suitable
>> (which is where a lot of niche SDR stuff comes from).
>> 
>> A *real* challenge is that to do it right, you need very good orbit
>> propagators - if you're looking to simulate nanosecond scale
>> phenomenology, you need to be able to generate orbit behavior on a few
>> cm or better sort of uncertainty.  For some applications (differential
>> GPS, RTK surveying) you could probably get away with something that's
>> not