Inside the adc block with interleave mode on, the decimate-by-8 downsample 
blocks have offsets of 1,3,5,7 on channels i0-i3 and offsets of 0,2,4,6 on 
channels q0-q3, for 800 MHz sampling and a sample time of 0.25.  This all 
seems reasonable, but the delays look inconsistent -- for the i-channels the 
delays are all 1, while for the q-channels the first is 2 and the rest are 1.  
When I uncheck the "ADC interleave mode" box, the downsample blocks change to 
decimate-by-4, and both the i- and q-channel delays are set with the first at 
2 and the rest at 1.  These delays seem strange to me -- I would expect them 
either to all be the same, with the downsample offsets taking care of the 
relative timing, or to grow successively longer while the downsample blocks 
stay the same.  So having one delay set at 2 and the rest at 1 seems strange, 
as does having the i- and q-channels behave differently with interleave mode 
on vs. off.
When interleave mode is turned off, the downsample blocks run the function 
sdspdsamp2, but since they're all running the same version of that function, 
shouldn't the delays be set to 1,2,3,4 instead of 2,1,1,1?
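To reason about which input sample each (delay, offset) pair actually selects, here is a small Python sketch.  It assumes delay-then-downsample semantics -- the delay prepends d zero samples, and the downsample then keeps every Dth sample starting at the offset -- which is my reading of the blocks, not something I've confirmed against sdspdsamp2:

```python
def channel(x, delay, factor, offset):
    # Model a delay block followed by a downsample block:
    # prepend `delay` zero samples, then keep every
    # `factor`-th sample starting at index `offset`.
    delayed = [0] * delay + list(x)
    return delayed[offset::factor]

x = list(range(32))  # a counter makes the selected phases easy to read

# i-channels as described with interleave mode on:
# offsets 1,3,5,7, each with delay 1
for off in (1, 3, 5, 7):
    print("i: delay 1, offset", off, "->", channel(x, 1, 8, off))

# q-channels: offsets 0,2,4,6 with delays 2,1,1,1 (the puzzling case)
for d, off in zip((2, 1, 1, 1), (0, 2, 4, 6)):
    print("q: delay", d, "offset", off, "->", channel(x, d, 8, off))
```

Under that assumption, once past the startup zeros, output sample m of a channel is x[offset - delay + m*factor], so giving only the first q-channel a delay of 2 shifts its phase by one extra sample relative to its neighbors -- which is what looks inconsistent to me.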

I didn't find an explanation for this in the CASPER memos, and Jason says he 
didn't expect the adc block to be set up like this.  Does anyone else know why 
the delays are set up this way?  (I'm using the 10.1 tools on 
otto.eecs.berkeley.edu, in case that helps.)

Thanks,
Jonathan Landon
