Hi all,

I have a C++ program that modulates a signal, which I feed to GRC via a 
named pipe (created with mkfifo).  The problem is that I get underruns from 
the USRP.

The C++ program has a function that converts the samples from double to float 
and writes them to stdout, which I redirect into the named pipe on the command 
line.  With a 32-byte stdout buffer in the C++ program I get consistent 
underruns.  With the buffer size set to 4096 bytes, I get an initial underrun, 
then none for about 1-2 minutes, and then they start again more consistently, 
with about 500 ms of no signal per underrun.  With the buffer sized to hold a 
2-second packet (at 500 kHz) I get an underrun every 4 seconds, with 1-2 
seconds of no signal randomly interspersed in my modulated signal.

Does anyone have any ideas whether I need to use a particular buffer size?  I 
tried this on my netbook with an Intel Atom and on a laptop with a Core Duo 
and got the same behavior.

When I run the C++ program and output to a regular file, and read it in with 
the GRC script at the same time, I don't get any underruns at all, but of 
course this isn't real time.

My other idea is to do the double-to-float conversion inside GRC with a custom 
block.  I also wonder whether going from stdout to a pipe file and then 
reading the pipe file from GRC is itself part of the problem.


Thanks-Tom
_______________________________________________
Discuss-gnuradio mailing list
Discuss-gnuradio@gnu.org
http://lists.gnu.org/mailman/listinfo/discuss-gnuradio
