Hello, I have been evaluating the OFDM examples (at the start of my
Bachelor's thesis).
In ofdm_rx I have added a noise source and file sinks so I can plot BER
vs. SNR in MATLAB, and I put a Tag Debug block on the output.
The result is disappointing: I observe plenty of lost packets already at
an SNR of about 13 dB and below (noise amplitude of about 1.5 in an
otherwise unmodified ofdm_rx).
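For context, this is roughly how I map the Noise Source amplitude to an
SNR figure. It is only a sketch: it assumes unit average signal power at
the receiver input and that the amplitude parameter acts as the noise
standard deviation (the exact scaling of GNU Radio's complex Gaussian
noise source may differ, so treat the absolute dB values with caution):

```python
import math

def snr_db(signal_power, noise_amplitude):
    """Estimate SNR in dB.

    Assumptions (not verified against the GNU Radio source):
    - signal_power is the average signal power at the receiver input
      (1.0 for a unit-power signal).
    - noise_amplitude is the Noise Source 'amplitude' parameter,
      interpreted as the noise standard deviation, so the noise
      power is noise_amplitude**2.
    """
    noise_power = noise_amplitude ** 2
    return 10.0 * math.log10(signal_power / noise_power)

# Hypothetical usage: unit-power signal, noise amplitude 0.1
print(snr_db(1.0, 0.1))  # 20 dB under the assumptions above
```

Under these assumptions an amplitude of 1.5 would actually give a
negative SNR, so the 13 dB figure above comes from my own measurement
of signal and noise power in the flowgraph, not from this formula alone.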
So, given such performance, are these examples useful for over-the-air
operation with USRPs and antennas?
Which part of the synchronization process fails in a noisy channel,
causing the missing data-packet indexes I observe in Tag Debug?
My guess is that it fails during header recognition, i.e. the coarse
frequency offset is too large for the channel estimator to handle.
Mainly, I want to know whether ofdm_rx is only usable at low noise
levels, i.e. at SNRs above roughly 13 dB.
Mateusz Loch
_______________________________________________
Discuss-gnuradio mailing list
Discuss-gnuradio@gnu.org
https://lists.gnu.org/mailman/listinfo/discuss-gnuradio