On Wed, Mar 26, 2025 at 10:43 AM Marcus D. Leech <[email protected]>
wrote:

> On 26/03/2025 06:13, [email protected] wrote:
> >
> > I'm using timed commands to set the RX gain at a precise moment with
> > the following command:
> >
> > set_command_time(md.time_spec + uhd::time_spec_t(0.02), 0);
> >
> > However, I noticed that there is a delay between the specified time
> > and the actual time when the gain is applied. This delay is
> > significantly larger than the latency of the component responsible
> > for changing the gain, and it appears to depend on the sampling
> > frequency. Specifically, the delay is approximately 20 samples.
> >
> > I’m trying to understand why this delay is much greater than the
> > expected component latency and why it scales with the sampling
> > frequency. Any insights on this behavior?
> >
> > Regards.
> > Jamaleddine
> >
> >
> A change in signals presented to the head of the DDC chain will take
> some number of sample times to propagate through the finite-length
> filters in the DDC.  They don't (and, indeed, cannot) have zero group
> delay.
>

Hi Marcus,
I think the gain is set from the "radio" block, which operates at the
master clock rate rather than the downconverted rate, so it isn't clear to
me why the latency of the gain setting would be tied to the downconverted
sample rate.
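
For reference, the full timed-command sequence under discussion usually
looks something like the sketch below; only the set_command_time() line
comes from the original post, and the device args, gain value, and channel
index are placeholders:

    #include <uhd/usrp/multi_usrp.hpp>

    // Minimal sketch: md comes from a prior rx_stream->recv(), as in the
    // original post; the device args, gain, and channel are placeholders.
    auto usrp = uhd::usrp::multi_usrp::make(uhd::device_addr_t(""));
    usrp->set_command_time(md.time_spec + uhd::time_spec_t(0.02), 0);
    usrp->set_rx_gain(30.0, 0);   // queued: takes effect at the command time
    usrp->clear_command_time(0);  // subsequent commands apply immediately again

(If clear_command_time() is omitted, later calls keep being queued against
the old command time, which can also show up as unexpected delay.)
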
Rob
