Hello everyone,
I have a problem with OFDM frame detection on one of my channels. I work
with the gr-ieee802-11 blocks and receive on 4 channels using an X300 with
2 TwinRX daughterboards. For frame detection, the receiver blocks compute
the autocorrelation of the incoming stream over a specified window size and
divide it by the average power over that window. The normalized
autocorrelation value is then compared with a threshold to decide whether a
frame is present, and a 'wifi_start' tag is inserted if there is one.
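For reference, the detection scheme I describe above can be sketched roughly as follows. This is a minimal NumPy illustration, not the actual gr-ieee802-11 implementation; the delay of 16 samples (the 802.11 short-preamble period at 20 MS/s), the window size, and the function name are my assumptions.

```python
import numpy as np

def detect_frames(samples, window=48, threshold=0.56):
    """Sliding-window normalized autocorrelation detector (sketch).

    Correlates the stream with a copy of itself delayed by 16 samples
    (one 802.11 short-training-symbol period, assumed) and normalizes
    by the power over the window; the periodic preamble pushes the
    ratio toward 1, while noise stays low.
    """
    delay = 16  # assumed short-preamble repetition period
    # lag-16 correlation terms and instantaneous power
    corr = samples[delay:] * np.conj(samples[:-delay])
    power = np.abs(samples[delay:]) ** 2
    tags = []
    for n in range(len(corr) - window):
        c = np.abs(np.sum(corr[n:n + window]))
        p = np.sum(power[n:n + window])
        if p > 0 and c / p > threshold:
            tags.append(n)  # would become a 'wifi_start' tag here
    return tags
```

With a clean periodic preamble the ratio sits near 1; on white noise it concentrates well below typical thresholds, which is why a channel that idles around 0.7 fires constantly against a 0.56 threshold.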

This is the FFT output of the channels ( https://imgur.com/a/UeXTU ). On
the other channels I see the wifi_start tag with a period corresponding to
the interval set by the transmitter. But channel 1, which is A:1 of my
USRP, shows the tag all the time. When I checked the correlation values of
the channels, I saw that this channel always has a correlation value around
0.7 ( https://imgur.com/a/KHgXn ). It shows the wifi_start tag since my
threshold is 0.56. When I increase the receive gain of this channel, the
normalized correlation value decreases, but the constellation gets more
scattered at high gain and is hard to decode. When I set the threshold to
0.75 for this channel, I was able to decode, but it did not work as well as
the other channels (missing frames, high frame error rate, etc.).
Do you have any guess about the issue? Why do I have such a high
correlation value on this channel while the other channels all have similar
values to each other? Can I assume that something is wrong with the receive
chain of this channel?

Best,
-Bugra
_______________________________________________
USRP-users mailing list
USRP-users@lists.ettus.com
http://lists.ettus.com/mailman/listinfo/usrp-users_lists.ettus.com
