I'm testing the behavior of B210-based systems, comparing performance with
the "internal" and "external" (10 MHz) clock sources. Except for the
following "is the 10 MHz input actually present" check, which runs when the
app starts, the two branches share the same code.

rfBoardPtr->set_clock_source( "external" );
sleep( 2 ); // give the board time to lock
if ( rfBoardPtr->get_mboard_sensor( "ref_locked" ).to_bool( ) == false )
{
    throw std::runtime_error(
        "Unable to find a valid 10 MHz reference signal. "
        "Please check that the signal source is properly plugged in." );
}
rfBoardPtr->set_time_unknown_pps( 0.0 );


Besides that check, is there a way of measuring the quality of the
reference signal via the (UHD) software API, ideally in a more granular
way? The check above "passes" even when the input signal is poor, which I
can see by validating the quality of the radio signal emitted by the board
with external instruments. Ideally, I'd want an API that tells me about
such problems before I actually check the radio output. To be clear, these
are relatively minor radio issues, but they are sufficient to reduce the DL
peak rate of my LTE system from 150 Mbps to 50-100 Mbps with respect to a
fully functional board (either fed by the "internal" clock source, or by a
proper 10 MHz source).

The quality of the radio output also varies noticeably (at least when
measured with advanced full-stack metrics) when I change the amplitude of
the 10 MHz reference, which is surprising since said changes are within the
recommended range for the 10 MHz input. Could someone please confirm the
specs, in terms of (peak-to-peak) amplitude and waveform (square, sine,
...)?
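In case it helps frame the question, here is a minimal sketch of what I
mean by "granular": enumerating every motherboard sensor UHD exposes and
printing its value, hoping one of them carries more reference-quality
information than "ref_locked". The device-address string ("type=b200") is
just a placeholder for my setup.

#include <uhd/usrp/multi_usrp.hpp>
#include <unistd.h>
#include <iostream>
#include <string>

int main( )
{
    // Placeholder device address; adjust for your setup.
    uhd::usrp::multi_usrp::sptr usrpPtr =
        uhd::usrp::multi_usrp::make( std::string( "type=b200" ) );

    usrpPtr->set_clock_source( "external" );
    sleep( 2 ); // give the board time to lock, as above

    // Dump every motherboard sensor UHD exposes and its current value.
    for ( const std::string &name : usrpPtr->get_mboard_sensor_names( ) )
    {
        std::cout << name << ": "
                  << usrpPtr->get_mboard_sensor( name ).to_pp_string( )
                  << std::endl;
    }
    return 0;
}

If, as I suspect, "ref_locked" is the only sensor this reports on a B210,
I'd be happy to learn of any other diagnostic hook I'm missing.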

Thanks,
Dario