On 7/18/16 8:51 AM, Scott Stobbe wrote:
I suppose it is one of those cases where the GPS designers decided you
shouldn't ever use the serial data for sub-second timing, and consequently
spent no effort on serial latency and jitter.

Most UARTs I have come across are synthesized with a 16x baud clock
and include flow control. It would not have been much effort to spec the
latency as some mean value mu ±100 ns and the jitter as ±1/(16*baud).

For 9600 baud, the jitter on the start bit would be ±6.5 us.

If CTS were resampled at one full bit time (9600 baud), the jitter would
be ±104 us.
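(Just to sanity-check those numbers, here is a throwaway C snippet that
derives both figures from nothing but the baud rate and the assumed 16x
oversampled receiver:

/* Back-of-the-envelope check of the figures above, assuming a 16x
 * oversampled receiver: the start-bit edge is quantized to one 16x
 * sample period, and CTS resampled at the bit rate is quantized to
 * a full bit time. */
#include <stdio.h>

int main(void)
{
    const double baud = 9600.0;

    double bit_time_us = 1e6 / baud;           /* one bit time in us   */
    double sample_us   = bit_time_us / 16.0;   /* one 16x sample in us */

    printf("start-bit jitter:    +/-%.1f us\n", sample_us);    /* 6.5 us */
    printf("CTS resample jitter: +/-%.0f us\n", bit_time_us);  /* 104 us */
    return 0;
}

Both figures fall straight out of the quantization.)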



Except that virtually every UART in use today has some sort of buffering (whether a FIFO or double buffering) between the CPU interface and the bits on the wire, which completely desynchronizes the two.
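To put a rough number on how much the buffering decouples things, here's a toy model (not any particular chip's spec) of a 16550-style 16-byte receive FIFO with the interrupt trigger level set to 8:

/* Rough model of FIFO-induced latency: a character's arrival time at
 * the bus interface depends on how many characters queue up behind it
 * and on the trigger level, not on its own start-bit edge.  (The
 * 16550's character-timeout interrupt bounds this when the line goes
 * idle, but the decoupling point stands.) */
#include <stdio.h>

int main(void)
{
    const double baud      = 9600.0;
    const int    bits      = 10;                 /* start + 8 data + stop */
    const double char_time = bits * 1e6 / baud;  /* us per character      */
    const int    trigger   = 8;                  /* FIFO interrupt level  */

    printf("character time: %.0f us\n", char_time);
    /* First character of a burst can sit through (trigger - 1) more
     * character times before the CPU even sees an interrupt. */
    printf("worst FIFO dwell before interrupt: %.1f ms\n",
           (trigger - 1) * char_time / 1000.0);
    return 0;
}

At 9600 baud that's on the order of 7 ms of dwell, dwarfing the wire-level jitter discussed above.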

Determinism in UART timing between the CPU bus interface and the "bits on the wire" has never been something that is specified. You can go back to venerable parts like the 8251, and there's no such spec in the data sheet. (There is a tCR specified as 16 tCY for the read setup time from CTS*/DSR* to READ* assert; a tSRX (2 us min) and tHRX (2 us min) for the setup and hold of the internal sampling pulse relative to RxD; a 20 tCY max from the center of the stop bit to RxRDY; and then whatever the delay is from the internal RxRDY to the bus read.)
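Turning those 8251 numbers into absolute times (assuming, purely for illustration, a 2 MHz CLK so tCY = 500 ns; the clock in any given design may differ):

/* The tCR/20-tCY figures quoted above, in absolute time, for an
 * assumed (illustrative, not datasheet-mandated) 2 MHz CLK. */
#include <stdio.h>

int main(void)
{
    const double tcy_ns = 500.0;  /* assumed CLK period at 2 MHz */

    printf("tCR = 16 tCY = %.0f us\n", 16 * tcy_ns / 1000.0);     /*  8 us */
    printf("stop-bit center to RxRDY <= 20 tCY = %.0f us\n",
           20 * tcy_ns / 1000.0);                                 /* 10 us */
    return 0;
}

So even where numbers exist, they bound internal handshakes, not wire-to-bus determinism.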


There's "what we observed in a running circuit" or "what we inferred from knowing the internal design".


Since a huge number of serial ports these days are implemented with a USB interface, the timing uncertainty is even greater, because you're dealing with the 8 kHz frame timing on USB.
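The service interval alone sets a floor on the uncertainty. A trivial sketch (8 kHz microframes are the USB 2.0 high-speed case; most USB-serial adapters are actually full speed, with 1 ms frames, which is coarser still):

/* USB service intervals quantize when a USB-attached UART's data can
 * reach the host, independent of anything the UART itself does. */
#include <stdio.h>

int main(void)
{
    printf("high-speed microframe: %.0f us\n", 1e6 / 8000.0); /*  125 us */
    printf("full-speed frame:      %.0f us\n", 1e6 / 1000.0); /* 1000 us */
    return 0;
}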


This is why PTP-compatible interfaces added time tagging at the PHY layer.
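For what it's worth, on Linux those PHY/MAC timestamps surface through the SO_TIMESTAMPING socket option. A minimal sketch, assuming a NIC and driver that support hardware receive timestamping and that it has already been enabled on the interface (e.g. via the SIOCSHWTSTAMP ioctl); error handling trimmed:

/* Read a hardware receive timestamp for a PTP event packet (UDP 319).
 * Assumes hardware timestamping is already enabled on the interface. */
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <linux/net_tstamp.h>   /* SOF_TIMESTAMPING_* flags */
#include <linux/errqueue.h>     /* struct scm_timestamping  */

#ifndef SCM_TIMESTAMPING
#define SCM_TIMESTAMPING SO_TIMESTAMPING
#endif

int main(void)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);

    /* Ask for raw timestamps taken in the MAC/PHY hardware. */
    int flags = SOF_TIMESTAMPING_RX_HARDWARE | SOF_TIMESTAMPING_RAW_HARDWARE;
    setsockopt(sock, SOL_SOCKET, SO_TIMESTAMPING, &flags, sizeof(flags));

    struct sockaddr_in addr = { .sin_family = AF_INET,
                                .sin_port   = htons(319) }; /* PTP event */
    bind(sock, (struct sockaddr *)&addr, sizeof(addr));

    char data[1500], ctrl[512];
    struct iovec  iov = { .iov_base = data, .iov_len = sizeof(data) };
    struct msghdr msg = { .msg_iov = &iov, .msg_iovlen = 1,
                          .msg_control = ctrl,
                          .msg_controllen = sizeof(ctrl) };
    if (recvmsg(sock, &msg, 0) < 0)
        return 1;

    for (struct cmsghdr *c = CMSG_FIRSTHDR(&msg); c; c = CMSG_NXTHDR(&msg, c)) {
        if (c->cmsg_level == SOL_SOCKET && c->cmsg_type == SCM_TIMESTAMPING) {
            struct scm_timestamping ts;
            memcpy(&ts, CMSG_DATA(c), sizeof(ts));
            /* ts.ts[2] is the raw hardware timestamp; ts.ts[0] would
             * be the software timestamp if one had been requested. */
            printf("hw rx timestamp: %lld.%09ld\n",
                   (long long)ts.ts[2].tv_sec, ts.ts[2].tv_nsec);
        }
    }
    return 0;
}

The timestamp is taken as the packet crosses the MAC/PHY, so none of the host-side buffering discussed above ever touches it.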


