Re: CVS-2-2 NMApplet empty bar explained

2005-02-04 Thread Bill Moss
Take a look at the next message I sent about patching the NM
nm_wireless_qual_to_percent function. If a driver does not report
max_qual->level and max_qual->noise, make a good guess. The physical
lower limit for what a card can do is about -96 dBm. The upper limit
varies but is always less than -10 dBm. For the ipw2200, -20 dBm is
about right. Percentage is at best a rough indicator and does not
directly relate to a physical quantity, so I don't think it matters that
much to get it exactly right. If the driver does not specify
max_qual->level and max_qual->noise, how about setting

max_qual->level to 221 (-35 dBm)
max_qual->noise to 161 (-95 dBm)

and using a linear model for percentage like the one I put in the patch
message. The 70 coefficient may need some adjustment for these guessed
values.

The CLAMP function will take care of any overshoot.
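A minimal sketch of the fallback Bill describes (the helper names are hypothetical, and the actual patch with its 70 coefficient is in his other message, not reproduced here):

```python
# Drivers report dBm levels as unsigned bytes biased by +256,
# so 221 -> -35 dBm and 161 -> -95 dBm.
def unbias(raw):
    """Convert a driver-reported biased byte to signed dBm."""
    return raw - 256 if raw > 127 else raw

def level_to_percent(level_dbm, noise_floor=-95.0, max_level=-35.0):
    """Linear map of signal level onto 0..100%, clamped like CLAMP()."""
    pct = 100.0 * (level_dbm - noise_floor) / (max_level - noise_floor)
    return max(0.0, min(100.0, pct))
```

For example, a reported level of 198 unbiases to -58 dBm, which this linear model maps to roughly 62%; anything at or above the guessed -35 dBm ceiling is clamped to 100%.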
Dan Williams wrote:
On Fri, 2005-02-04 at 01:00 -0500, Bill Moss wrote:
 

Now look at the results for the scanned APs (list of one here). qual 
and noise are reported as 0. This is why the CVS-2-2 NMApplet bar is 
empty for the ipw2200 driver. NM uses the qual value of zero to create 
the bar length. We can convert the level of 198 to -58 dBm, but NM has no 
idea how ipw2200 would convert this into a percentage. To make matters 
worse, the dBm range varies from driver to driver. We are stuck. There is 
no way the NM nm_wireless_qual_to_percent function can deal with this. A 
new nm_wireless_level_to_percent function could be created to do this 
conversion, but it would be guesswork at best and impossible to 
normalize across all drivers.
   

More or less correct... I spent much of last night looking at stuff
online trying to figure out ways to be able to use noise and signal levels
for something, but without much luck.  Drivers simply have to report the
noise, and many (like Atheros) hardcode the noise levels.  This doesn't
work well for the drivers because they have to support a couple of
different actual chips, and sometimes the same value doesn't
work for all chips.  The real solution is to have drivers report their
own quality based off RSSI (which all cards MUST have to be 802.11b
compliant) and max RSSI.
Also, we pretty much need the max_qual.level as well if there's no
quality information.  As measured on my Netgear WG511T right next to my
Linksys WRT54G, that was a fairly consistent -39 dBm (I saw a -30 once
too).  However, unless I'm wrong, this varies by access point, since each
access point puts out a different power level and has a different
antenna, both of which affect the actual received signal power at the
card.  However, using that max_qual.level value together with a noise
level of -95 dBm, we have an upper and lower bound of some sort.
Interestingly enough, I found this page that appears to describe what
Windows uses, at least for some cards.  We may in the end need to do
some approximation of this to achieve a signal quality measurement
from just signal and noise.
http://is.med.ohio-state.edu/Wireless%20FAQ.htm
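The exact table from that page isn't reproduced in the thread; as a rough illustration of the signal-minus-noise idea, one could linearly map SNR onto a percentage (the 40 dB full-scale range here is an assumption for illustration, not taken from the page):

```python
def snr_to_percent(signal_dbm, noise_dbm, max_snr_db=40.0):
    """Map SNR (signal minus noise, in dB) linearly onto 0..100%.
    The 40 dB full-scale value is an illustrative assumption."""
    snr = signal_dbm - noise_dbm
    return max(0.0, min(100.0, 100.0 * snr / max_snr_db))
```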
Cards use the SNR to determine what speed to drop down to, so some
combination of current card speed and SNR may work for us in the absence
of actual quality information reported from the driver.  Only as a last
resort of course.  But using card speed doesn't help us for scanned
access points, since we're not connected to them :(  The search
continues.
Dan
 

--
Bill Moss
Professor, Mathematical Sciences
Clemson University
___
NetworkManager-list mailing list
NetworkManager-list@gnome.org
http://mail.gnome.org/mailman/listinfo/networkmanager-list


Re: CVS-2-2 NMApplet empty bar explained

2005-02-04 Thread Bill Moss
I have seen my card spike to -15 dBm when held next to an AP. The Cisco 
lookup table starts at -10 dBm. I agree that [-90, -20] dBm is a good range 
to start with. I think the cards will all output biased values (+256) 
for qual->level, max_qual->level, and max_qual->noise, but I could be wrong.

If agreement is ever reached on how to do the associated percentage 
calculation and the scanned percentage calculation, an appropriate patch 
could be submitted to each wireless driver project. The ipw2200 people 
seemed eager to have input on this issue. ipw2200.c is filled with TO DO 
and FIND A GOOD VALUE comments. The developers are just waiting for 
someone to bite the bullet.

Dan Williams wrote:
On Fri, 2005-02-04 at 10:54 -0500, Bill Moss wrote:
 

Take a look at the next message I sent about patching the NM 
nm_wireless_qual_to_percent function. If a driver does not report 
max_qual->level and max_qual->noise, make a good guess. The physical 
lower limit for what a card can do is about -96 dBm. The upper limit 
varies but is always less than -10 dBm. For the ipw2200, -20 dBm is 
about right. Percentage is at best a rough indicator and does not 
directly relate to a physical quantity, so I don't think it matters that 
much to get it exactly right. If the driver does not specify 
max_qual->level and max_qual->noise, how about setting

max_qual->level to 221 (-35 dBm)
max_qual->noise to 161 (-95 dBm)
   

I've seen mention of cards capable of receiving from around -80 dBm down to
-97 dBm.  I think around -90 dBm would be a good value there...
I also don't think I've ever seen a card (out of my stack of 6 and a few
more from friends and coworkers) that's been able to get above -20 dBm,
so that might be a good starter value.
We could also adjust this stuff on the fly (at least the upper bound
here) and maybe move that down as low as -30 dBm over time, though that
gets more complicated.  I guess the answer would be to fix the driver if
people complained enough.  Sounds like a good idea though.  Thanks for
the tip.
Dan
 

--
Bill Moss
Professor, Mathematical Sciences
Clemson University


Re: CVS-2-2 NMApplet empty bar explained

2005-02-04 Thread Jean Tourrilhes
On Fri, Feb 04, 2005 at 01:47:49PM -0500, Dan Williams wrote:
 On Fri, 2005-02-04 at 13:25 -0500, Sven wrote:
  thanks for the info - i have a few more questions/suggestions:
  
  if the card reports RSSI and i know MAX_RSSI, is the relative signal
  strength then RSSI / MAX_RSSI (* 100)? if one wants to use that as the
  reported Link Quality but for some reason does not want to use %, what
  should one do? converting the RSSI numbers to dBm and using that to compute
  a % is clearly wrong. IMHO some better guidelines as to what Link
  Quality in WEXT should mean would be desirable. borrowing definitions from
  Joshua Bardwell in
  http://www.connect802.com/download/techpubs/2004/you_believe_D100201.pdf,  
  from a practical point of view Signal Quality, though desirable as
  link quality, is probably not feasible to get a handle on with the
  (current and future) drivers. next best is probably Signal Strength,
  from the RSSI values. or is SNR better as a measurement of link quality?
  but that would require better reporting of noise by the drivers (and
  not just a hardcoding).
 
 RSSI is totally manufacturer dependent.  AFAIK, Cisco uses a MAX_RSSI of
 63 for the 340/350, Atheros uses something like 30, etc.  It depends on
 how many voltage values the hardware can physically measure.  So yes,
 you do get a sort of Link Quality % when you take RSSI / MAX_RSSI * 100.
 You can (and should) augment this value with things like the ipw2200
 driver does, ie receive packet errors, link speed, etc.
 
 Converting RSSI to dBm and using that for link quality is actually
 pretty wrong.  dBm is actually useful though; you can do some
 interesting things with it, like 1) distance from transmitter (if you
 know detailed antenna and card characteristics), 2) signal power levels
 and noise levels, 3) more accurately testing different antennas, etc.  It's
 just not useful for getting a Link Quality % of any accuracy whatsoever,
 except that when the Signal approaches the Noise you know your reception
 is starting to suck.
 
 So 4 points to take out of this:
 1) Drivers SHOULD use subjective values for calculating Quality, but
 that value SHOULD include some sort of RSSI measurement in addition to
 whatever else (ie invalid packets, retransmit count, link speed, etc)
 2) Drivers SHOULD set both current level (ie qual.qual, qual.level) and
 max level (max_qual.qual, max_qual.level)
 3) Drivers SHOULD use the same units for level and noise (ie, either
 RSSI or dBm)
 4) Drivers SHOULD use dBm wherever they can, if they can.
 
 Dan
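The RSSI-to-quality ratio Dan describes above can be sketched as follows (the MAX_RSSI values come from his examples; the augmentation with packet errors and link speed is left out):

```python
def rssi_to_quality(rssi, max_rssi):
    """Manufacturer-relative link quality: RSSI / MAX_RSSI * 100,
    clamped to 0..100.  MAX_RSSI is card specific, e.g. ~63 for the
    Cisco 340/350 and ~30 for Atheros (per the discussion above)."""
    return max(0, min(100, round(100 * rssi / max_rssi)))
```

Because MAX_RSSI differs per chipset, the same raw RSSI yields different percentages on different cards, which is exactly why the driver, not NM, should do this conversion.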

Yes, Dan got the philosophy of the API pretty much spot
on. The qual.qual is the feel-good number for most of us, the
qual.level is the measurable characteristic of the radio for the
engineers amongst us, and that explains why the API has both. Only when
you try to have a single measurement do you have such a dilemma, but
because we already have qual.level, we are free to be totally
subjective with qual.qual.
	I'm personally happy with qual.qual being driver specific, and
evolving over time, as long as it gives the maximum of feedback to the
user about the link performance. Note that users also have different
thresholds for acceptable performance, mostly based on the
applications they are using (FTP versus VoIP), so they will calibrate
themselves on those numbers.

"Signal quality" is defined very briefly in 802.11. Common definitions
  have arisen, but they are usually incorrect. The correct definition
  hinges on the term "PN code correlation strength", which is a measure
  of the match (correlation) between the incoming DSSS signal and an ideal
  DSSS signal.

Believe it or not, the obsolete Wavelan driver (pre-802.11)
uses exactly that as qual.qual. However, this definition only works for
true DSSS modulations, which means only for 1 and 2 Mb/s, which means
it's pretty much useless.
	I'm sure that for OFDM you could have a measure of quality
based on the pilot symbols, but that would only work for the OFDM bit
rates.

"Signal to noise ratio" is a general term that is used in a novel way
  by 802.11 administrators. Most usages of the term refer to the strength
  of the signal relative to thermal noise within a circuit, but 802.11
  administrators use the term to refer to the strength of the signal at
  the receive antenna relative to the ambient, non-802.11 RF energy at the
  same frequency as the signal. While this definition isn't wrong, per se,
  it may lead to confusion when 802.11 administrators communicate with
  engineers who are using the more traditional definition.

Note that the Wireless Extensions never use any SNR, but use
signal and noise as separate measures. I agree with the article: the
signal and the noise are not measured at the same time or under the same
conditions, so they are only loosely related.

Have fun...

Jean

Re: CVS-2-2 NMApplet empty bar explained

2005-02-04 Thread Jean Tourrilhes
On Fri, Feb 04, 2005 at 04:32:28PM -0500, Dan Williams wrote:
 On Fri, 2005-02-04 at 09:23 -0800, Jean Tourrilhes wrote:
  MAX_RSSI converted to dBm is some totally uninteresting
  value. It's going to be close to 0 dBm, and a value that has nothing
  much to do with real operation but just an arbitrary limit on the A-D
  converter used for RSSI sampling. On the other hand, MIN_RSSI is very
  relevant, so that's why we use that.
 
 Jean,
 
 Right now, WEXT says that (from the comments in iwlib.c) drivers should
 use a max_qual->level = 0 to specify that the value is in dBm, and
 max_qual->level > 0 for RSSI, right?  (ie Absolute and Relative as said
 in iwlib.c/wireless.h)  In other words, the value in max_qual->level
 specifies whether the values in qual->level and qual->noise are signed or
 unsigned, right?  That seems a bit ambiguous to me, easy for driver
 writers to get wrong.

Yes, you are right, most of them actually get it wrong. In
theory, for drivers using dBm, max_qual->level should be the minimum
RSSI, the noise floor, but that's even harder to comprehend, so I've
accepted them using 0.

 (note that qual->qual and max_qual->qual aren't part of this discussion
 since they should always be a percentage value that people can compute
 as 100 * (qual->qual / max_qual->qual))
 
 The problem right now is that drivers don't distinguish between valid and
 invalid values when max_qual->level = 0.  As a person using the driver, I
 don't know if the driver writer _meant_ to put 0 there, to tell me that
 the values the card returns are in dBm, or whether the driver writer
 didn't know _what_ to put there and just left it blank.

Correct, except that the value will always display as dBm, so
you can at least expect that the author meant dBm, otherwise he would
have put 255.

 In some ways
 that's a fix-the-driver problem to use IW_QUAL_LEVEL_INVALID, but it
 would be nice to have some way to specify what the card means by the
 levels it returns, either dBm or RSSI.  See the prism54 driver for
 instance: they put a simple SNR in the qual->qual field, and stuff RSSI
 into the qual->level field, yet max_qual->level is left at 0.  This is
 wrong IMHO, but if the RSSI-to-dBm conversion for the card isn't known,
 what should the correct values in qual->level/qual->noise and
 max_qual->level be?

You can always not convert and put 255 in max_qual->level.

 Either cards are going to return dBm (and hence max_qual->level = 0) or
 RSSI (and hence max_qual->level > 0) (if they don't know RSSI-to-dBm
 conversion tables/values), but at this time we don't know which, because
 of the ambiguity surrounding the max_qual->level field.  It would be
 nice to have drivers set this explicitly somewhere...

Ok, I think I'm convinced. Yes, I should add explicit fields
for that.
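The convention as discussed can be sketched like this (the signed-byte bias and the 255 "did not convert" sentinel are from the thread; the helper itself is hypothetical):

```python
def interpret_level(level_raw, max_level):
    """Interpret qual.level per the WEXT convention discussed:
    max_qual.level == 0   -> dBm, stored as a byte biased by +256;
    max_qual.level == 255 -> unconverted, units unknown;
    otherwise             -> relative RSSI against max_qual.level."""
    if max_level == 0:
        dbm = level_raw - 256 if level_raw > 127 else level_raw
        return ("dBm", dbm)
    if max_level == 255:
        return ("unknown", level_raw)
    return ("rssi", level_raw)
```

This is exactly the ambiguity Dan points out: a reader of the API cannot tell a deliberate 0 from an unset 0, which is what the proposed explicit fields would resolve.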

 Dan

Jean