It's actually a legitimate measurement: it specifies how much total jitter
plus bit-proportional rate error the rx signal can have, expressed as a
percentage of bit time. It matters far less when receiving from
all-electronic transmitters than from mechanical ones such as a TTY.
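
(For scale: at 300 baud a bit time is 1/300 s, about 3.33 ms, so tolerating
46.875% distortion means an edge can be displaced by up to roughly 1.56 ms
relative to the start edge and the character still decodes.)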

I'm sure it made the marketing dept happy to have a way to claim it
tolerates >45% error instead of only 2.5% error, even though it's just a
different way of specifying the same thing. Basically it's a brag about 16x
oversampling, except that as a selling point it was essentially
meaningless, since all the other UARTs on the market used the same 16x
oversampling.
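
To make the 16x arithmetic concrete: the receiver finds the start edge to
within one 16x clock and then samples at the nominal center of each bit, so
an edge can be mislocated by up to 8 - 0.5 = 7.5 of the 16 sub-bit clocks
before a center sample lands in the wrong bit cell. A quick sketch in C
(mine, not anything from the WD app note):

    /* Back-of-envelope for the distortion tolerance of a 16x-oversampling
     * UART receiver; the figure, not the code, comes from the app note. */
    #include <stdio.h>

    int main(void)
    {
        const double clocks_per_bit = 16.0;
        /* Start-edge detection is quantized to one 16x clock, so in the
         * worst case the "center" sample sits half a clock off true center
         * and the nearest bit-cell boundary is 8 - 0.5 = 7.5 clocks away. */
        const double margin_clocks = clocks_per_bit / 2.0 - 0.5;
        printf("tolerated distortion = %.3f%% of a bit time\n",
               100.0 * margin_clocks / clocks_per_bit);
        return 0;
    }

That prints 46.875%, which is exactly the app note's figure: the spec is
just the receiver's sampling margin restated.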

Eric



On Jun 13, 2017 9:33 PM, "Jon Elson via cctalk" <cctalk@classiccmp.org>
wrote:

On 06/13/2017 07:59 PM, Chuck Guzis via cctalk wrote:

> Well, I didn't say "timing error", I did say "timing distortion", which is
> not quite the same thing. My reference was the "TR1602/TR1863/TR1865
> MOS/LSI Application Notes Asynchronous Receiver Transmitter", which can be
> found in the WD 1984 Data Communications Handbook (I think there's a copy
> online). Page 126-127. "Thus, signals with up to 46.875% distortion could
> be received."
>
Well, I think it is wild market-speak inflation.  Yes, if everything else
were perfect, then as long as the serial data was at the correct level while
the UART sampled the signal, the rest could be garbage. But how will a
seriously degraded channel ALWAYS pass the signal correctly just when the
UART samples it?  NOT very likely.
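
Back-of-envelope: if a degraded line gives each mid-bit sample even a small
independent chance p of being read wrong, a 10-bit character (start + 8
data + stop) survives with probability (1-p)^10. A quick sketch in C (the p
values are purely illustrative, not measurements):

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        const int bits_per_frame = 10;          /* start + 8 data + stop */
        /* Illustrative per-sample error probabilities, not measured data. */
        const double p[] = { 0.001, 0.01, 0.05, 0.10 };
        for (int i = 0; i < 4; i++) {
            double good = pow(1.0 - p[i], bits_per_frame);
            printf("per-sample error %.3f -> intact characters: %.1f%%\n",
                   p[i], 100.0 * good);
        }
        return 0;
    }

Even a 1% per-sample error rate loses nearly one character in ten, so the
distortion figure tells you nothing about a genuinely noisy line.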

Jon
