Alan wrote:

An interesting point regarding the three use-cases, and regarding the vast majority of commercial algorithms being made for telco purposes. Does that imply that a 10 MHz lab-instrument feed for frequency / timing etc. would ideally use some different algorithms?

Frankly, I see no need for a lab standard to make any attempt to keep "good" time when the GPS or the OCXO is compromised. Detect the error, set an alarm, and disable the standard frequency output. Who wants to do time-nut experiments with a standard that is limping along with compromised accuracy?

It's perfectly fine if the standard automatically returns to normal operation when it can (in this case, it should maintain a record of the outage(s)), but it is also perfectly fine (IMO) to require manual intervention to resume normal operation.

Again -- Who wants to do time-nut experiments with a standard that is limping along with compromised accuracy?
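The policy described above (detect the fault, set an alarm, disable the output, log the outage, and resume either automatically or only on manual intervention) amounts to a small two-state supervisor. Here is a minimal sketch in Python; the class, field, and method names (`LabStandardSupervisor`, `update`, `acknowledge`, etc.) are hypothetical illustrations, not any real GPSDO firmware API:

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List, Optional

class State(Enum):
    LOCKED = auto()  # normal operation, standard-frequency output enabled
    FAULT = auto()   # GPS or OCXO compromised: alarm on, output disabled

@dataclass
class Outage:
    start: float                  # time fault was detected
    end: Optional[float] = None   # time normal operation resumed

@dataclass
class LabStandardSupervisor:
    """Hypothetical supervisor for a lab frequency standard.

    Policy: on a detected fault, raise an alarm and disable the output
    rather than limp along with compromised accuracy. Resumption may be
    automatic (keeping a record of the outage) or require manual action.
    """
    auto_resume: bool = True
    state: State = State.LOCKED
    output_enabled: bool = True
    alarm: bool = False
    outages: List[Outage] = field(default_factory=list)

    def update(self, t: float, healthy: bool) -> None:
        if self.state is State.LOCKED and not healthy:
            # Detect the error, set an alarm, disable the output.
            self.state = State.FAULT
            self.alarm = True
            self.output_enabled = False
            self.outages.append(Outage(start=t))
        elif self.state is State.FAULT and healthy and self.auto_resume:
            self._resume(t)
        # With auto_resume=False, stay faulted until acknowledge() is called.

    def acknowledge(self, t: float) -> None:
        """Manual intervention to resume normal operation."""
        if self.state is State.FAULT:
            self._resume(t)

    def _resume(self, t: float) -> None:
        self.outages[-1].end = t  # maintain a record of the outage
        self.state = State.LOCKED
        self.alarm = False
        self.output_enabled = True
```

The `auto_resume` flag captures the choice discussed here: both behaviors are acceptable, so long as the output is never left running while the reference is compromised.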

It is a different situation if the standard provides timing for a real-time application, such as a cell site. In that case, one probably wants to maintain operation as long as the standard can be relied on to meet the minimum system spec.

Best regards,

Charles



_______________________________________________
time-nuts mailing list -- time-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.