Hi,
I am currently facing a problem using the C API with a USRP N210 r4. I am
trying to stream an int16_t interleaved IQ file. When using the included C++
tx_samples_from_file example, the baseband is streamed without interruptions
or other issues; the BB is received at the DUT and the symbols appear correct.

Unfortunately, integration with a larger piece of software requires the C API.
What is unclear to me is how the C and C++ APIs compare in how they handle
samples. It seems to me there is no direct mapping between std::complex<short>
and interleaved int16_t IQ. When analyzed with a spectrum analyzer, the
streamed baseband looks completely different.
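
To be explicit about the two views I am comparing (the type name below is just
for illustration): the file stores one complex sample as two consecutive
int16_t values, and the C++ example reads those same bytes as
std::complex<short>. A minimal sketch of the file-side layout:

#include <stdint.h>

/* Layout of one complex sample as stored in the file and as I hand it to
 * the C API: two consecutive int16_t values, in-phase first, then
 * quadrature. The C++ example reads the same bytes into a
 * std::vector<std::complex<short>>. */
typedef struct {
    int16_t i; /* in-phase */
    int16_t q; /* quadrature */
} iq_sample_int16;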

Details on the implemented functions and the problem:

Issue Description
I am using the C API to stream a binary data file. The file is saved as
interleaved int16_t [I][Q] samples. I am using an N210 r4 with the latest UHD
and FPGA image. The MTU on the NIC is configured at 3000 and the buffer size
is 10000 samples. The same file, when streamed with the included
tx_samples_from_file example, works fine and the baseband is received
correctly. When the equivalent code is written in C, the baseband is not
correct.

Setup Details
Implemented C code streaming loop:

while (1) {
    if (stop_signal_called)
        break;

    uhd_tx_metadata_make(&md, false, 0, 0.1, false, false);
    size_t read = fread(buff, sizeof(int16_t), samps_per_buff, file);

    /* debug: dump the raw int16_t values read from the file */
    for (size_t i = 0; i < read; i++) {
        printf("%d\n", buff[i]);
    }

    if (read > 0) {
        uhd_tx_streamer_send(tx_streamer, buffs_ptr, read, &md, 0.1, &num_samps_sent);
        total_num_samps += num_samps_sent;
    } else {
        break;
    }

    if (verbose)
        printf("\nSent %ld - read from file %zu\n", total_num_samps, read);
}
buff contains the data block to stream and is defined as
buff = malloc(samps_per_buff * sizeof(int16_t));
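
In case it matters, my assumption about how buff and buffs_ptr are meant to be
wired up (following the pattern I believe is used in the shipped tx_samples_c
example) is the following sketch, using the same variable names as above:

/* Assumed wiring (pattern from the shipped tx_samples_c example):
 * buffs_ptr is an array with one pointer per channel; with a single
 * channel it simply points at buff. */
int16_t* buff = malloc(samps_per_buff * sizeof(int16_t));
void** buffs_ptr = (void**)&buff;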

C stream arguments

uhd_stream_args_t stream_args = {
    .cpu_format   = "sc16",
    .otw_format   = "sc16",
    .args         = "",
    .channel_list = 0,
    .n_channels   = 1
};
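
For reference, my reading of the uhd_stream_args_t definition in
uhd/usrp/usrp.h (reproduced from memory, so treat it as approximate) is that
channel_list is a pointer to an array of channel indices, not a channel
number:

/* Approximate reproduction of uhd_stream_args_t (from memory; see
 * uhd/usrp/usrp.h for the authoritative definition). */
typedef struct {
    char* cpu_format;     /* host memory format, e.g. "sc16" */
    char* otw_format;     /* over-the-wire format, e.g. "sc16" */
    char* args;           /* other stream args */
    size_t* channel_list; /* pointer to an array of channel indices */
    int n_channels;       /* number of entries in channel_list */
} uhd_stream_args_t;

If I remember correctly, the shipped tx_samples_c example passes the address
of a size_t channel variable here.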

Reference C++ streaming loop:

template <typename samp_type>
void send_from_file(
    uhd::tx_streamer::sptr tx_stream, const std::string& file, size_t samps_per_buff)
{
    uhd::tx_metadata_t md;
    md.start_of_burst = false;
    md.end_of_burst   = false;
    std::vector<samp_type> buff(samps_per_buff);
    std::ifstream infile(file.c_str(), std::ifstream::binary);

    // loop until the entire file has been read

    while (not md.end_of_burst and not stop_signal_called) {
        infile.read((char*)&buff.front(), buff.size() * sizeof(samp_type));
        size_t num_tx_samps = size_t(infile.gcount() / sizeof(samp_type));

        md.end_of_burst = infile.eof();

        const size_t samples_sent = tx_stream->send(&buff.front(), num_tx_samps, md);
        if (samples_sent != num_tx_samps) {
            UHD_LOG_ERROR("TX-STREAM",
                "The tx_stream timed out sending " << num_tx_samps << " samples 
("
                                                   << samples_sent << " 
sent).");
            return;
        }
    }

    infile.close();
}
Reference C++ stream arguments

std::vector<size_t> channel_nums;
uhd::stream_args_t stream_args("sc16", "sc16");
channel_nums.push_back(boost::lexical_cast<size_t>(channel));
stream_args.channels             = channel_nums;
uhd::tx_streamer::sptr tx_stream = usrp->get_tx_stream(stream_args);

Expected Behavior
I would expect the two programs to behave exactly the same; the transmitted
baseband should be identical.

Actual Behavior
Once shown on a spectrum analyzer, the C example has a much larger gain and
the baseband appears fragmented. I don't understand how the C API handles the
buffer streaming; according to the code, the C function wraps exactly the same
send call.

Steps to reproduce the problem
I can provide the source code for the two examples and the binary file. Both
examples are run at 2.5 Msps.

Question
What am I missing here to correctly stream the baseband? My understanding is
that, once the data type is fixed in the stream arguments, we call
uhd_tx_streamer_send with the number of samples that we want to stream (which
is what the C++ example does, using std::complex<short> as the sample type).
In the C case, how do we achieve the same behavior?
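
To make the question concrete, here is a minimal sketch of what I think the
call should look like if the nsamps argument of uhd_tx_streamer_send counts
complex samples (I/Q pairs) rather than individual int16_t values; that
assumption is exactly what I would like confirmed or corrected:

/* Sketch under the assumption that uhd_tx_streamer_send() counts complex
 * samples: buff holds interleaved int16_t, so one sample = 2 int16_t. */
size_t items_read = fread(buff, sizeof(int16_t), samps_per_buff, file); /* int16_t items */
size_t samps_read = items_read / 2;                                     /* complex samples */

if (samps_read > 0) {
    uhd_tx_streamer_send(tx_streamer, buffs_ptr, samps_read, &md, 0.1, &num_samps_sent);
}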

I would really appreciate any support in this matter.

Best regards
Marco Spanghero
