Hi everyone,
I am interested in the delay jitter of the total transmission time of a
packet/waveform. Specifically, I want to measure the time between when the
flowgraph is started in Python (tb.run or tb.start) and when the first sample
is transmitted into the air from the USRP hardware. I would like to reduce the
jitter in this time (across runs) to as low a value as possible.

I use usrp_siggen.py with gnuradio-3.1.2 to transmit a square waveform. I try
to start the flowgraph at a precise time x (in microseconds) by computing
y = x - time.time(), then calling time.sleep(y), tb.start(), time.sleep(0.1),
and tb.stop(). I also use an interp of 32 at the Tx. I have a receiver that
logs all data (with -d 64), and I observe the samples just out of the USRP
source.
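
Roughly, the start-at-time-x logic looks like this (a minimal sketch, not the
actual usrp_siggen.py code; tb is assumed to be the already-configured top
block, transmit_at is just a name I made up, and target_time is in epoch
seconds, the same units as time.time()):

import time

def transmit_at(tb, target_time, burst_len=0.1):
    delay = target_time - time.time()
    if delay > 0:
        time.sleep(delay)   # OS sleep granularity alone is typically ~1-10 ms
    tb.start()              # non-blocking; scheduler threads start pushing samples
                            # toward the USRP (USB/FPGA buffering adds further,
                            # variable latency before anything is on the air)
    time.sleep(burst_len)   # transmit for roughly burst_len seconds
    tb.stop()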

For transmissions that are precisely spaced in time using appropriate values
of x, the inter-transmission delay measured at the receiver is off by more
than 2 ms, despite using nice to raise the priority of the process.
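
For concreteness, this is roughly how I extract the inter-transmission delays
from the logged samples (a rough sketch, assuming the log is raw complex64
samples at 64e6/64 = 1 MS/s; the file name and threshold below are only
placeholders):

import numpy as np

samp_rate = 64e6 / 64                                # decimation 64 -> 1 MS/s
x = np.fromfile("rx_log.dat", dtype=np.complex64)    # placeholder file name
mag = np.abs(x)
on = mag > 10 * np.median(mag)                       # crude burst detector
edges = np.flatnonzero(np.diff(on.astype(np.int8)) == 1) + 1   # rising edges
starts = edges / samp_rate                           # burst start times, seconds
print("inter-transmission delays (s):", np.diff(starts))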

1. Is it possible to reduce the jitter between successive Tx delay
measurements to a few tens of microseconds or less?

2. Is there a way to run the usrp_siggen code as a kernel module to improve
the delay jitter performance?

Thanks in advance for your help,
Sri