Re: [Discuss-gnuradio] Introducing noise/ considerable BER

2011-08-10 Thread Robert McGwier
I think that many RF and wireless designers make a lot of assumptions as
well and never admit to the horror of the mobile channel.   LTE is not
really realizing the needed increase in capacity to justify the rollout cost
of all of the needed infrastructure in my opinion but now everyone is all
in.  I am sure Sprint and others are happy LTE and WiMax are both having
lots of problems.

It is truly exceedingly difficult to realize MIMO's big gains when lots of
mobile channels, each a horror, need to be managed to achieve any real
advantages.  Welcome to reality!  Verizon can't put a cloud computer at
every cell site to enable the rapid channel-inverse calculations and MIMO
solution per user to optimize the system.

;-)

Bob
___
Discuss-gnuradio mailing list
Discuss-gnuradio@gnu.org
https://lists.gnu.org/mailman/listinfo/discuss-gnuradio


Re: [Discuss-gnuradio] Introducing noise/ considerable BER

2011-08-09 Thread Tom Rondeau
On Mon, Aug 8, 2011 at 11:34 AM, shantharam balasubramanian
shantharam...@gmail.com wrote:
 Hello people,

 Thanks a lot for the reply. So, you are saying that the packet loss at
 low transmitter amplitude in benchmark_Tx.py and benchmark_Rx.py comes
 from the loss of packet synchronization data, i.e., a part of the
 packet synchronization data gets lost or degraded due to low SNR. I
 know that packet loss also occurs if the queue length is not long
 enough to hold incoming traffic or if the data reaches the receiver
 after a long time. I want to ensure that these things don't happen.

 Basically, we are trying to transmit a random binary sequence between
 two nodes using the benchmark_Tx.py and benchmark_Rx.py programs.
 Since both these programs use packets, we convert the whole binary
 sequence into a number of packets and then transmit each packet using
 the benchmark programs. Let us say, we have 2000 bits in total. We
 convert it to 20 packets (100 bits/packet) and then transmit 20
 packets from one node to the other. We want to calculate the bit error
 rate for different levels of transmitter amplitude.

 Based on our objectives, I have the following questions.

 1. Is there any upper limit on the no. of binary bits per packet in
 benchmark_Tx.py program?

 2. Overall, can I take other steps to avoid the packet-loss
 scenario? Adam, can you please describe the generation of a long
 preamble in further detail?

 Your help and feedback will be highly appreciated.

 Thanks,
 shantharam

Keep in mind the old information theorist's adage: if you don't have
bit errors, you're using too much power! (ok, I don't know how old
that is; fred harris always quotes it, but he credits someone else
with it, probably Tony Constantinides).

In other words, we normally design our systems around having bit
errors, and indeed we recognize that they are unavoidable except under
extreme SNR conditions. To compensate, you really want to use some
kind of channel coding. The way things are in our benchmark code, a
single bit error means that an entire packet is lost.
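To make that concrete: even a toy forward-error-correcting code changes the picture. The sketch below (plain Python, not anything from the benchmark scripts) uses a 3x repetition code, so a single channel bit error per 3-bit group is corrected instead of killing the whole packet:

```python
def encode_rep3(bits):
    """3x repetition code: transmit each payload bit three times."""
    return [b for b in bits for _ in range(3)]

def decode_rep3(coded):
    """Majority-vote each group of three channel bits back to one payload bit."""
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

payload = [1, 0, 1, 1, 0]
coded = encode_rep3(payload)
coded[4] ^= 1                          # inject a single channel bit error
assert decode_rep3(coded) == payload   # the packet still decodes cleanly
```

Without the code, that one flipped bit would fail the CRC and cost the whole packet; with it, the error is absorbed at a 3x rate penalty. Real systems use far stronger codes (convolutional, turbo, LDPC) making the same basic trade.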

Tom



Re: [Discuss-gnuradio] Introducing noise/ considerable BER

2011-08-09 Thread Marcus D. Leech

Keep in mind the old information theorist's adage: if you don't have
bit errors, you're using too much power! (ok, I don't know how old
that is; fred harris always quotes it, but he credits someone else
with it, probably Tony Constantinides).

In other words, we normally design our systems around having bit
errors, and indeed we recognize that they are unavoidable except under
extreme SNR conditions. To compensate, you really want to use some
kind of channel coding. The way things are in our benchmark code, a
single bit error means that an entire packet is lost.

Tom

In radio systems, I agree--bit errors are a fact of life, and you can cope
with them either with protocol design or frame design. The trend in the last
couple of decades for radio systems has been to incorporate some sort of FEC,
to reduce the impact of channel distortions--the receiver can simply
reconstruct from the FEC data, or force a re-transmit. Systems that use FEC
almost always assume that there's a higher-layer protocol mechanism in place
for dealing with packets that were too damaged to decode, and thus must be
re-transmitted.

On the other hand, there are plenty of extant *wired* communications systems
in which bit errors are exceedingly rare. The various Ethernet standards, for
example, assume that bit errors aren't common, and there's no FEC (at least
at the 100Mbit and 10Mbit levels--I'm not sure about 1000Mbit and 1Mbit).

The problem is that many communications/networking engineering types who are
new to radio don't really understand, on a visceral level, that the radio
channel environment is different from wired, not just in degree, but in type,
of channel distortions. And further, their experience with a channel model
for wireless may include only simulations, rather than the real world.

My very earliest internet connection at home, back in the mid-to-late 1980s,
was wireless: an amateur-radio 56 kbps link using a split repeater on 220 MHz
and 432 MHz. It wasn't a very nice environment--various RFI issues,
hidden-terminal issues, collisions, multi-path, receiver de-sense, and a
complete lack of any FEC. Given all of that, I'm stunned that LTE and WiFi
and all their modern friends work at all :-)

--
Marcus Leech
Principal Investigator
Shirleys Bay Radio Astronomy Consortium
http://www.sbrac.org





Re: [Discuss-gnuradio] Introducing noise/ considerable BER

2011-08-09 Thread Colby Boyer
On Tue, Aug 9, 2011 at 5:22 PM, Marcus D. Leech mle...@ripnet.com wrote:

  Keep in mind the old information theorist's adage: if you don't have
 bit errors, you're using too much power! (ok, I don't know how old
 that is; fred harris always quotes it, but he credits someone else
 with it, probably Tony Constantinides).

 In other words, we normally design our systems around having bit
 errors, and indeed we recognize that they are unavoidable except under
 extreme SNR conditions. To compensate, you really want to use some
 kind of channel coding. The way things are in our benchmark code, a
 single bit error means that an entire packet is lost.

 Tom

  In radio systems, I agree--bit errors are a fact of life, and you can
 cope with them either with protocol design, or
  frame design. The trend in the last couple of decades for radio systems
 has been to incorporate some sort of
  FEC, to reduce the impact of channel distortions--the receiver can simply
 reconstruct from the FEC data, or,
  force a re-transmit.  Systems that use FEC almost always assume that
 there's a higher-layer protocol mechanism
  in place for dealing with packets that were too damaged to decode, and
 thus must be re-transmitted.

 On the other hand, there are plenty of extant *wired* communications
 systems in which bit-errors are exceedingly rare.
  The various Ethernet standards for example, assume that bit errors aren't
 common, and there's no FEC (at least at
  100Mbit and 10Mbit levels--I'm not sure about 1000Mbit and 1Mbit).

 The problem is that many communications/networking engineering types who
 are new to radio don't really understand, on a
  visceral level, that the radio channel environment is different from
 wired, not just in degree, but in type, of channel distortions.
  And further, their experience with a channel model for wireless may
 include only simulations, rather than real world.

 My very earliest internet connection at home, back in the mid-to-late
 1980s, was wireless.  Over an amateur-radio 56KBps radio
  link using a split repeater on 220Mhz and 432Mhz.  It wasn't a very nice
 environment.  Various RFI issues, hidden-terminal issues.
  Collisions.  Multi-path.  Receiver de-sense.  And a complete lack of any
 FEC.   Given all of that, I'm stunned that LTE and WiFi and all its
  modern friends work at all :-)



 --
 Marcus Leech
 Principal Investigator
 Shirleys Bay Radio Astronomy Consortium
 http://www.sbrac.org





Also, point-to-point wireless comms with a single antenna is more or less a
solved problem these days, unless you are trying to do something in a weird
channel or with very, very low power. Then again, they have said coding is
dead many times in the past and have been wrong, or thought CDMA or TDMA was
king, etc.

Things are more interesting when you look at PHY/MAC cross layer work.

I'm also surprised they get LTE to work... 16 b/s/Hz spectral
efficiency... that's pretty up there.


Re: [Discuss-gnuradio] Introducing noise/ considerable BER

2011-08-09 Thread John Orlando
 I'm also surprised they get LTE to work... 16 b/s/Hz
 spectral efficiency... that's pretty up there.

And LTE-Advanced is spec'ing 30 bits/sec/Hz at peak spectral
efficiency...which makes my head hurt just thinking about the
complexity both in the RF hardware as well as the PHY processing to
achieve this.
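For a sense of why those numbers make heads hurt, the Shannon bound C/B = log2(1 + SNR) says what SNR a single spatial stream would need to hit a given spectral efficiency. A back-of-the-envelope sketch (LTE actually gets there by splitting the work across MIMO streams, so no single link needs this much SNR):

```python
import math

def snr_db_for_spectral_efficiency(bps_per_hz):
    """Invert Shannon's C/B = log2(1 + SNR): SNR = 2^(C/B) - 1, returned in dB."""
    return 10.0 * math.log10(2.0 ** bps_per_hz - 1.0)

for eff in (16, 30):
    print(f"{eff} b/s/Hz on one stream needs > "
          f"{snr_db_for_spectral_efficiency(eff):.1f} dB SNR")
```

That works out to roughly 48 dB for 16 b/s/Hz and 90 dB for 30 b/s/Hz on a single stream, which is why spatial multiplexing carries most of the load.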

-- 
John Orlando
CEO/System Architect
Epiq Solutions
http://www.epiq-solutions.com



Re: [Discuss-gnuradio] Introducing noise/ considerable BER

2011-08-08 Thread Martin Braun
On Sun, Aug 07, 2011 at 02:47:10PM -0400, Daniel Zeleznikar wrote:
 As Marcus pointed out, there will always be packet loss associated with
 higher BER in real systems that employ packet switching. The only thing I
 could suggest is that you somehow move further upstream in your receive
 system to make the BER measurement happen at the bit/symbol level if you
 really want it to be accurate. Otherwise you can just collect data for both
 packet loss rate, and the BER of obtained packets, since that is the best
 thing you have to work with. The packet loss statistic may be important
 anyhow.
 Suggestions from anyone for moving the BER measurement closer to symbol
 level?

It's not quite clear what exactly you're trying to do. Here are some more
suggestions:
- Don't use the benchmark_{tx,rx} scripts, but write something that doesn't
  do CRC checks on packets. However, you still have to synchronize, and
  that'll eventually conk out at very low SNR.
- Don't use USRPs at all. If you're running simulations (e.g. to test a
  channel coding scheme), do it all in GRC. You can either use the
  channel model that comes with GNU Radio to adjust the SNR, or, if you
  want to stick to bits, try the Channel Coding Toolbox from
  https://www.cgran.org/wiki/chancoding which includes some elaborate
  bit error models.
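If you go the bits-only route, the simplest bit error model is a binary symmetric channel that flips each bit independently with probability p. A minimal stand-in (plain Python, not from the toolbox; its models are more elaborate, e.g. burst errors):

```python
import random

def bsc(bits, flip_prob, rng):
    """Binary symmetric channel: flip each bit independently with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

rng = random.Random(42)
tx = [rng.randint(0, 1) for _ in range(10000)]
rx = bsc(tx, 0.01, rng)
ber = sum(t != r for t, r in zip(tx, rx)) / len(tx)
print(f"target BER 0.01, measured {ber:.4f}")
```

This gives you direct, repeatable control over the error rate, independent of synchronization, which is exactly what's hard to get by turning down the transmit amplitude over the air.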

MB



 
 
 On Sat, Aug 6, 2011 at 6:54 PM, Marcus D. Leech mle...@ripnet.com wrote:
 
  On 08/06/2011 06:27 PM, shantharam balasubramanian wrote:
 
  Hi
  I have been working in usrp2 testbed, and I have been modifying the
  benchmark_tx and rx programs for my project. There have been situations
  where I was supposed to introduce noise to find out BER. I did that by
  giving lower  transmitter amplitude values. But very low values cause packet
   loss along with higher BER values. I just want to know: is there any way
   to cause high BER values without causing packet loss? Can I do that inside
   the program, or should I do it some other way, e.g. by using some
   noise-producing source?
 
   Well, in real-world radio communications systems, low-SNR *does* cause
  packet loss.  That's entirely expected.  Nature doesn't discriminate
  between packet-synchronization data, and the actual payload data.
 
 
  --
  Marcus Leech
  Principal Investigator
  Shirleys Bay Radio Astronomy Consortium http://www.sbrac.org
 
 
 
 
 
 
 -- 
 Dan Zeleznikar
 daniel.zelezni...@gmail.com
 zeleznika...@osu.edu
 (216) 233-6232



-- 
Karlsruhe Institute of Technology (KIT)
Communications Engineering Lab (CEL)

Dipl.-Ing. Martin Braun
Research Associate

Kaiserstraße 12
Building 05.01
76131 Karlsruhe

Phone: +49 721 608-43790
Fax: +49 721 608-46071
www.cel.kit.edu

KIT -- University of the State of Baden-Württemberg and
National Laboratory of the Helmholtz Association





Re: [Discuss-gnuradio] Introducing noise/ considerable BER

2011-08-08 Thread shantharam balasubramanian
Hello people,

Thanks a lot for the reply. So, you are saying that the packet loss at
low transmitter amplitude in benchmark_Tx.py and benchmark_Rx.py comes
from the loss of packet synchronization data, i.e., a part of the
packet synchronization data gets lost or degraded due to low SNR. I
know that packet loss also occurs if the queue length is not long
enough to hold incoming traffic or if the data reaches the receiver
after a long time. I want to ensure that these things don't happen.

Basically, we are trying to transmit a random binary sequence between
two nodes using the benchmark_Tx.py and benchmark_Rx.py programs.
Since both these programs use packets, we convert the whole binary
sequence into a number of packets and then transmit each packet using
the benchmark programs. Let us say, we have 2000 bits in total. We
convert it to 20 packets (100 bits/packet) and then transmit 20
packets from one node to the other. We want to calculate the bit error
rate for different levels of transmitter amplitude.
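For the bookkeeping side of that experiment, here is a small sketch (plain Python; these helper names are hypothetical, not anything in benchmark_tx.py) of splitting the 2000-bit sequence into packets and computing BER over the packets that arrive, counting lost packets separately:

```python
def packetize(bits, bits_per_packet):
    """Split a bit sequence into fixed-size packets."""
    return [bits[i:i + bits_per_packet] for i in range(0, len(bits), bits_per_packet)]

def bit_error_rate(tx_packets, rx_packets):
    """BER over received packets only; rx_packets holds None for each lost packet."""
    errors = compared = 0
    for tx, rx in zip(tx_packets, rx_packets):
        if rx is None:            # lost packet: track as packet loss, not bit errors
            continue
        errors += sum(t != r for t, r in zip(tx, rx))
        compared += len(tx)
    return errors / compared if compared else float("nan")

tx_packets = packetize([0, 1] * 1000, 100)   # 2000 bits -> 20 packets of 100 bits
rx_packets = [p[:] for p in tx_packets]
rx_packets[3] = None                         # one packet lost outright
rx_packets[0][0] ^= 1                        # one bit error in a received packet
print(f"BER over received packets: {bit_error_rate(tx_packets, rx_packets):.5f}")
```

Keeping the two statistics separate matters at low amplitude, where lost packets would otherwise silently bias the BER estimate.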

Based on our objectives, I have the following questions.

1. Is there any upper limit on the no. of binary bits per packet in
benchmark_Tx.py program?

2. Overall, can I take other steps to avoid the packet-loss
scenario? Adam, can you please describe the generation of a long
preamble in further detail?

Your help and feedback will be highly appreciated.

Thanks,
shantharam



Re: [Discuss-gnuradio] Introducing noise/ considerable BER

2011-08-07 Thread Colby Boyer
On Sat, Aug 6, 2011 at 3:54 PM, Marcus D. Leech mle...@ripnet.com wrote:

 On 08/06/2011 06:27 PM, shantharam balasubramanian wrote:

 Hi
 I have been working in usrp2 testbed, and I have been modifying the
 benchmark_tx and rx programs for my project. There have been situations
 where I was supposed to introduce noise to find out BER. I did that by
 giving lower  transmitter amplitude values. But very low values cause packet
  loss along with higher BER values. I just want to know: is there any way
  to cause high BER values without causing packet loss? Can I do that inside
  the program, or should I do it some other way, e.g. by using some
  noise-producing source?

  Well, in real-world radio communications systems, low-SNR *does* cause
 packet loss.  That's entirely expected.  Nature doesn't discriminate
 between packet-synchronization data, and the actual payload data.


 --
 Marcus Leech
 Principal Investigator
 Shirleys Bay Radio Astronomy Consortium http://www.sbrac.org




Perhaps he could design a long preamble sequence (many symbols) and use that
to correlate against. That way you can ensure packet lock, even though symbol
decoding might not always work.
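A sketch of that idea (plain Python, hypothetical helper, shown on hard ±1 symbols; in practice you'd correlate soft samples before the slicer): slide a known preamble across the received stream and take the correlation peak as the packet start. With a sequence like a 7-chip Barker code, the peak stands well above the sidelobes even when payload symbols later decode wrongly:

```python
def find_preamble(samples, preamble):
    """Return the offset where the sliding dot-product with the preamble peaks."""
    best_idx, best_score = 0, float("-inf")
    for i in range(len(samples) - len(preamble) + 1):
        score = sum(s * p for s, p in zip(samples[i:], preamble))
        if score > best_score:
            best_idx, best_score = i, score
    return best_idx

preamble = [1, 1, 1, -1, -1, 1, -1]               # 7-chip Barker code
stream = [-1, 1, -1] + preamble + [1, -1, 1, 1]   # preamble starts at offset 3
assert find_preamble(stream, preamble) == 3
```

The longer the preamble, the more processing gain the correlation buys, so packet lock survives at SNRs where individual symbol decisions are already unreliable.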


Re: [Discuss-gnuradio] Introducing noise/ considerable BER

2011-08-07 Thread Daniel Zeleznikar
As Marcus pointed out, there will always be packet loss associated with
higher BER in real systems that employ packet switching. The only thing I
could suggest is that you somehow move further upstream in your receive
system to make the BER measurement happen at the bit/symbol level if you
really want it to be accurate. Otherwise you can just collect data for both
packet loss rate, and the BER of obtained packets, since that is the best
thing you have to work with. The packet loss statistic may be important
anyhow.

Suggestions from anyone for moving the BER measurement closer to symbol
level?

On Sat, Aug 6, 2011 at 6:54 PM, Marcus D. Leech mle...@ripnet.com wrote:

 On 08/06/2011 06:27 PM, shantharam balasubramanian wrote:

 Hi
 I have been working in usrp2 testbed, and I have been modifying the
 benchmark_tx and rx programs for my project. There have been situations
 where I was supposed to introduce noise to find out BER. I did that by
 giving lower  transmitter amplitude values. But very low values cause packet
  loss along with higher BER values. I just want to know: is there any way
  to cause high BER values without causing packet loss? Can I do that inside
  the program, or should I do it some other way, e.g. by using some
  noise-producing source?

  Well, in real-world radio communications systems, low-SNR *does* cause
 packet loss.  That's entirely expected.  Nature doesn't discriminate
 between packet-synchronization data, and the actual payload data.


 --
 Marcus Leech
 Principal Investigator
 Shirleys Bay Radio Astronomy Consortium http://www.sbrac.org






-- 
Dan Zeleznikar
daniel.zelezni...@gmail.com
zeleznika...@osu.edu
(216) 233-6232


[Discuss-gnuradio] Introducing noise/ considerable BER

2011-08-06 Thread shantharam balasubramanian
Hi
I have been working on a USRP2 testbed, and I have been modifying the
benchmark_tx and rx programs for my project. There have been situations
where I needed to introduce noise to find out the BER. I did that by
giving lower transmitter amplitude values. But very low values cause packet
loss along with higher BER values. I just want to know: is there any way to
cause high BER values without causing packet loss? Can I do that inside the
program, or should I do it some other way, e.g. by using some noise-producing
source?


Re: [Discuss-gnuradio] Introducing noise/ considerable BER

2011-08-06 Thread Marcus D. Leech

On 08/06/2011 06:27 PM, shantharam balasubramanian wrote:

Hi
I have been working on a USRP2 testbed, and I have been modifying the
benchmark_tx and rx programs for my project. There have been situations
where I needed to introduce noise to find out the BER. I did that by
giving lower transmitter amplitude values. But very low values cause
packet loss along with higher BER values. I just want to know: is there
any way to cause high BER values without causing packet loss? Can I do
that inside the program, or should I do it some other way, e.g. by using
some noise-producing source?

Well, in real-world radio communications systems, low SNR *does* cause
packet loss.  That's entirely expected.  Nature doesn't discriminate
between packet-synchronization data and the actual payload data.


--
Marcus Leech
Principal Investigator
Shirleys Bay Radio Astronomy Consortium
http://www.sbrac.org
