I'm doing some work involving measuring latencies of communications
over serial ports. To avoid the clock synchronization issues we would
have if we were running on separate machines, the configuration is one
modem hooked into /dev/cuaa0 and another into /dev/cuaa1. We talk to
the modem on cuaa0, which calls the modem on cuaa1; we tell that modem
to answer, and then we throw data back and forth and take timestamps.

Right now, all of the code is running in userland.
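
Stripped down, the timing path looks roughly like the sketch below
(the modem dial/answer sequence, termios setup, and error handling are
omitted, and the buffer size is just a placeholder for what we really
send):

/*
 * Rough sketch of the userland timing path.  Assumes both ports are
 * open-able and the call between the two modems is already up.
 */
#include <sys/time.h>

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int
main(void)
{
    char out[64], in[64];
    struct timeval t_send, t_recv;
    ssize_t n, got = 0;
    int tx, rx;

    tx = open("/dev/cuaa0", O_RDWR);    /* originating side */
    rx = open("/dev/cuaa1", O_RDWR);    /* answering side */
    if (tx < 0 || rx < 0)
        return (1);
    memset(out, 0xa5, sizeof(out));     /* arbitrary test pattern */

    gettimeofday(&t_send, NULL);        /* stamp just before the write */
    write(tx, out, sizeof(out));

    while (got < (ssize_t)sizeof(in)) { /* block until all bytes show up */
        n = read(rx, in + got, sizeof(in) - got);
        if (n <= 0)
            break;
        got += n;
    }
    gettimeofday(&t_recv, NULL);        /* stamp after the last read returns */

    printf("latency: %ld us\n",
        (t_recv.tv_sec - t_send.tv_sec) * 1000000L +
        (t_recv.tv_usec - t_send.tv_usec));
    return (0);
}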

I am trying to figure out what tuning we could do to get things as
accurate as possible. The information we actually want is the time at
which a bunch of bits leaves one COM port versus when they arrive at
the other. But the picture looks more like this:

   Userland    |    OS      | Comms Hardware |
               |            |                |
 [measuring]<->|<-[ sio  ]->|<---- UART ---->|<------->
 [ program ]   |  [driver]  |                |

And this doesn't account for the delay between the data reaching
userland and our gettimeofday() calls for the timestamps, or for other
potential delays along the way.
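
One piece we can at least quantify is the cost of the gettimeofday()
call itself, e.g. with a back-to-back loop like the one below (the
iteration count is arbitrary):

/*
 * Sketch for estimating gettimeofday() overhead: call it back-to-back
 * a large number of times and report the average spacing.
 */
#include <sys/time.h>
#include <stdio.h>

int
main(void)
{
    struct timeval start, end, t;
    int i, iters = 1000000;

    gettimeofday(&start, NULL);
    for (i = 0; i < iters; i++)
        gettimeofday(&t, NULL);
    gettimeofday(&end, NULL);

    printf("avg per call: %.3f us\n",
        ((end.tv_sec - start.tv_sec) * 1e6 +
         (end.tv_usec - start.tv_usec)) / iters);
    return (0);
}

That bounds the syscall overhead, but it says nothing about the
scheduling latency between the sio interrupt firing and our read()
actually returning, which is the part I don't know how to pin down.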

I'm concerned about how far off our userland measurements will be from
when the bits actually leave and arrive on the wire. The data we are
concerned with has latencies of a few hundred milliseconds, but
calibration runs over the PSTN are typically around 50 ms, and we need
a few significant digits below that.

Any pointers?
-- 
Crist J. Clark                     |     [EMAIL PROTECTED]
http://people.freebsd.org/~cjc/