Hello,

I'm trying to implement AGC software in our RF application. The setup is
the following:

   - B200mini
   - master clock rate: 16 MHz
   - fs = 32 kSPS
   - Receive samples in blocks of 32k samples (1 second).
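For reference, the device configuration boils down to something like this
(simplified sketch; device arguments, antenna/frequency setup, and error
handling omitted):

    #include <uhd/usrp/multi_usrp.hpp>

    int main()
    {
        // Open the B200mini and apply the rates described above
        auto usrp = uhd::usrp::multi_usrp::make(uhd::device_addr_t(""));
        usrp->set_master_clock_rate(16e6); // 16 MHz master clock
        usrp->set_rx_rate(32e3);           // fs = 32 kSPS

        // Complex-float host samples, continuous streaming
        uhd::stream_args_t stream_args("fc32");
        auto rx_streamer = usrp->get_rx_stream(stream_args);

        uhd::stream_cmd_t cmd(uhd::stream_cmd_t::STREAM_MODE_START_CONTINUOUS);
        cmd.stream_now = true;
        rx_streamer->issue_stream_cmd(cmd);

        // ... receive loop goes here (see below) ...
        return 0;
    }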

I made a tiny program where I change rx_gain every second (just after
rx_streamer->recv() returns), alternately adding or subtracting 30 dB.
To find out when a change in rx_gain actually affects the received samples,
I implemented a simple detection algorithm.
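Roughly, the loop looks like this (simplified sketch, not the exact code;
the 30/60 dB values and the single-sample detect_gain_step() helper are
illustrative placeholders for my detection routine):

    #include <complex>
    #include <vector>

    // Illustrative detector: index of the first sample whose instantaneous
    // power differs from the block's first sample by ~30 dB (a factor of
    // 1000 in linear power). A real detector would average over a window.
    static size_t detect_gain_step(const std::vector<std::complex<float>>& x)
    {
        const float p0 = std::norm(x[0]);
        for (size_t i = 1; i < x.size(); ++i) {
            const float p = std::norm(x[i]);
            if (p > 1000.0f * p0 || p < 0.001f * p0)
                return i;
        }
        return x.size(); // no step found in this block
    }

    // Inside main(), after issuing the stream command:
    std::vector<std::complex<float>> buff(32000); // 1 s of samples at 32 kSPS
    uhd::rx_metadata_t md;
    bool high_gain = false;

    while (true) {
        // Blocks until one full 1-second buffer has been received
        rx_streamer->recv(&buff.front(), buff.size(), md, 3.0);

        // Toggle the gain by 30 dB right after recv() returns
        high_gain = !high_gain;
        usrp->set_rx_gain(high_gain ? 60.0 : 30.0);

        // Locate the sample index where the previous gain step landed
        detect_gain_step(buff);
    }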

Ignoring the non-deterministic latency, *I would at least expect the time
between detected changes to be 1 second as well* (µ = 1 second, with some
non-zero variance), i.e. the same periodicity with which the gain changes
were invoked.

However, each change takes on average 1.016 seconds (see figure
<https://drive.google.com/open?id=1j1kiJt334FjOwczgbYsp0IITNlIkpeZR>). *This
means that with each set_rx_gain() call the radio takes longer and longer to
apply the change* (see figure
<https://drive.google.com/open?id=10u3X7Yoa5mlCbUbtmddMxVuTF61QQDYR>).
Example:

t = 1 s -> Recv previous block and change gain
t = 2 s -> Recv previous block and change gain
t = *1.016 s* -> Change in gain detected in previous block *(OK, the latency
to change gain is ~0.016 s)*
t = 3 s -> Recv previous block and change gain
t = *2.032 s* -> Change in gain detected in previous block *(I would
expect ~2.016)*
t = 4 s -> Recv previous block and change gain
t = *3.048 s* -> Change in gain detected in previous block *(I would expect
~3.016)*
(...)
t = 60 s -> Recv previous block and change gain
t = *59.96 s* -> Change in gain detected in previous block *(After the 60th
call to set_rx_gain, the latency has increased to almost 1 second!)*
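In other words, the delay seems to grow linearly with the number of gain
changes; fitting the numbers above:

    delay(n) ≈ n × 0.016 s  ->  delay(60) ≈ 0.96 s

which is consistent with the almost 1 second of accumulated latency after
the 60th call.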

Is there any rational reason why this is happening?
(I've tried longer periods, up to 10 seconds, with the same result).

Regards,
Brais.