CVS-2-2 NMWirelessApplet.c patch
If nm_wireless_qual_to_percent returns 0 for the scanned signal strength percentage, the NMApplet progress bar is empty, while the NMApplet icon/hover shows the link quality, the same as displayed by iwconfig, as desired. I modified the ipw2200 driver so that it returns positive values for max_qual->level and max_qual->noise; it was already set up to return a positive value for qual->level for each AP that is in range. I also modified nm_wireless_qual_to_percent so that it returns a positive percentage for the scanned signal strength of APs in range (discussed in an earlier message).

What I expected to see was the NMApplet progress bar showing me the scanned signal strength for all APs in range, and the NMApplet icon/hover showing me the link quality for the associated AP. Instead, both the NMApplet progress bar and the NMApplet hover were frozen at the same initial scanned value. I was able to obtain the desired behaviour by deleting one line in NMWirelessApplet.c. I changed

/* Fall back to old strength if current strength is invalid */
if (strength <= 0)
        strength = applet->active_device->strength;

to

/* Display link quality for the active wireless device, not scanned signal strength */
strength = applet->active_device->strength;

This is not really falling back to an old strength, as the comment suggests. It is falling back to the link quality of the associated AP instead of the scanned signal strength for that AP. The function this line is in is used to update the NMApplet icon/hover.

--
Bill Moss
Professor, Mathematical Sciences
Clemson University

___
NetworkManager-list mailing list
NetworkManager-list@gnome.org
http://mail.gnome.org/mailman/listinfo/networkmanager-list
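The effect of deleting that line can be sketched as two small functions. This is a minimal illustration with made-up names, not the actual NetworkManager code: "scanned" stands for the freshly computed scanned strength, "link_quality" for applet->active_device->strength.

```c
#include <assert.h>

/* Before the change: fall back to the stored strength only when the
 * freshly computed scanned value is invalid (<= 0). */
static int strength_before(int scanned, int link_quality)
{
    if (scanned <= 0)
        return link_quality;
    return scanned;
}

/* After the change: the icon/hover always shows the link quality of
 * the associated AP, never the scanned signal strength. */
static int strength_after(int scanned, int link_quality)
{
    (void)scanned;          /* scanned value intentionally ignored */
    return link_quality;
}
```

With a valid scanned value of 44% and a link quality of 70%, the old code would display 44% while the new code displays 70%.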
Re: Small problems in Ubuntu hoary.
Colin Walters wrote:
> Since you're using Ubuntu, you should investigate what their NetworkManager packages (http://www.ubuntulinux.org/wiki/NetworkManager) do, since clearly they have it working. It might make sense for them to submit a patch which conditionally chooses which D-BUS policy to install depending on the target distribution.

But it is clearly outdated, and seems to be crashing...

Hub
NM & applet trouble, does not connect
Some observations. NM does not connect to the network:

sudo service NetworkManager stop
Stopping NetworkManager daemon: [ OK ]
sudo service NetworkManager start
Setting network parameters...
Starting NetworkManager daemon: [ OK ]

nmtest
NM Status: 'scanning'
Active device: '/org/freedesktop/NetworkManager/Devices/ath0'
Active device name: 'ath0'
Devices:
  /org/freedesktop/NetworkManager/Devices/eth0  Device type: wired
  /org/freedesktop/NetworkManager/Devices/ath0  Device type: wireless  Strength: 44%
Active Network: '/org/freedesktop/NetworkManager/Devices/ath0/Networks/XXX'
Networks:
  /org/freedesktop/NetworkManager/Devices/ath0/Networks/XXX (XXX)  Strength: 44%

but NM does not connect, and the applet shows the radar :-(

./nmtest
NM Status: 'scanning'
Active device: '/org/freedesktop/NetworkManager/Devices/ath0'
Active device name: 'ath0'
Devices:
  /org/freedesktop/NetworkManager/Devices/eth0  Device type: wired
  /org/freedesktop/NetworkManager/Devices/ath0  Device type: wireless  Strength: 53%
Active Network: '(null)'
Networks:
  /org/freedesktop/NetworkManager/Devices/ath0/Networks/XXX (XXX)  Strength: 44%

Clicking on the network in the applet does not help either. However, after "modprobe -r ath_pci" NM actually connects, but the applet is still radar scanning, and nmtest says

NM Status: 'scanning'
Active device: '/org/freedesktop/NetworkManager/Devices/ath0'
Active device name: 'ath0'
Devices:
  /org/freedesktop/NetworkManager/Devices/eth0  Device type: wired
  /org/freedesktop/NetworkManager/Devices/ath0  Device type: wireless  Strength: 44%
Active Network: '(null)'
Networks:
  /org/freedesktop/NetworkManager/Devices/ath0/Networks/XXX (XXX)  Strength: 44%

but we are connected!! IP address and all...

And: "service NetworkManager restart" seems not to restart NM:

Stopping NetworkManager daemon: [ OK ]
Setting network parameters...
Starting NetworkManager daemon: [ OK ]

but:

./nmtest
nmwa_dbus_call_nm_method(): org.freedesktop.DBus.Error.ServiceDoesNotExist raised:
Service "org.freedesktop.NetworkManager" does not exist
NetworkManager appears not to be running (could not get its status). Will exit.

Now the applet is gone, but "ps auxc | grep Net" gives

user  5426  0.0  1.7  30708  8928 ?  Ssl  17:43  0:00  NetworkManagerI
user  5450  0.1  2.1  33356 11040 ?  Sl   17:43  0:01  NetworkManagerN
root 11365  0.1  0.3  24932  1664 ?  Ssl  18:01  0:00  NetworkManager

??? Moreover, the device again connects after a "modprobe -r ath_pci", and there is NM-specific periodic scanning...

Sven
Re: signal/link quality wrong?
Sorry, small correction (stupid me). The max values should be:

range->max_qual.qual = 63;   /* relative, MAX RSSI ... 60 for some cards */
range->max_qual.level = 0;   /* level is in dBm */
range->max_qual.noise = 0;   /* noise is in dBm */

See the Jean Tourrilhes and Dan Williams posts at mail.gnome.org/archives/networkmanager-list/2005-February/msg00066.html; changes in the WExt API might happen.

Sven

On Fri, 2005-02-04 at 15:44 -0500, Sven wrote:
> what should be reported by the driver (as in net/ieee80211_wireless.c) is
>
> iq->qual = rssi;       /* or some calibrated RSSI value */
> if (iq->qual > 63)     /* this should be the MAX RSSI ... so it might be 60 for some cards */
>         iq->qual = 63;
> iq->noise = -95;       /* noise floor in dBm, should be + real noise (needs to be determined) */
> iq->level = rssi - 95; /* -95 because that converts rssi to dBm. does noise belong here? */
>
> max values should also be set:
> range->max_qual.qual = 63;   /* relative, MAX RSSI ... 60 for some cards */
> range->max_qual.level = -32; /* in dBm, MAX RSSI - 95, = -35 for some */
> range->max_qual.noise = -95; /* noise floor */
Re: CVS-2-2 NMApplet empty bar explained
On Fri, Feb 04, 2005 at 04:32:28PM -0500, Dan Williams wrote: > On Fri, 2005-02-04 at 09:23 -0800, Jean Tourrilhes wrote: > > MAX_RSSI converted to dBm is some totally uninteresting > > value. It's going to be close to 0 dBm, and a value that has nothing > > much to do with real operation but just an arbitrary limit on the A->D > > converter used for RSSI sampling. On the other hand, MIN_RSSI is very > > relevant, so that's why we use that. > > Jean, > > Right now, WEXT says that (from the comments in iwlib.c) drivers should > use a max_qual->level = 0 to specify that the value is in dBm, and > max_qual->level > 0 for RSSI, right? (ie Absolute and Relative as said > in iwlib.c/wireless.h) In other words, the value in max_qual->level > specifies whether the value in qual->level and qual->noise are signed or > unsigned, right? That seems a bit ambiguous to me, easy for driver > writers to get wrong. Yes, you are right, most of them actually get it wrong. In theory, for drivers using dBm, max_qual->level should be the minimum RSSI, the noise floor, but that's even harder to comprehend, so I've accepted them using 0. > (note that qual->qual & max_qual->qual aren't part of this discussion > since they should always be a percentage value that people can go 100 * > (qual->qual / max_qual->qual) with) > > The problem right now is that drivers don't distinguish between valid & > invalid values in max_qual->level = 0. As a person using the driver, I > don't know if the driver writer _meant_ to put 0 there, to tell me that > the values the card returns are in dBm, or whether the driver writer > didn't know _what_ to put there and just left it blank. Correct, except that the value will always display as dBm, so you can at least expect that the author meant dBm, otherwise he would have put 255. 
> In some ways that's a "fix the driver" problem to use IW_QUAL_LEVEL_INVALID, but it would be nice to have some way to specify what the card means in the levels it returns, either dBm or RSSI. See the prism54 driver for instance: they put a simple SNR in the qual->qual field, and stuff RSSI into the qual->level field, yet max_qual->level is left at 0. This is wrong IMHO, but if the RSSI->dBm conversion for the card isn't known, what should the correct values in qual->level/qual->noise & max_qual->level be?

You can always not convert and put 255 in max_qual->level.

> Either cards are going to return dBm (and hence max_qual->level = 0) or RSSI (and hence max_qual->level > 0, if they don't know RSSI->dBm conversion tables/values), but at this time we don't know which, because of the ambiguity surrounding the max_qual->level field. It would be nice to have drivers set this explicitly somewhere...

Ok, I think I'm convinced. Yes, I should add explicit fields for that.

> Dan

Jean
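The convention under discussion can be shown as a tiny decoder. This is a sketch of my reading of the thread, not kernel code: max_qual->level == 0 flags "level is signed dBm stored in an unsigned byte", while max_qual->level > 0 (e.g. 255, as Jean suggests) flags "level is a non-negative relative RSSI".

```c
#include <assert.h>
#include <stdint.h>

/* Interpret a raw qual->level byte according to max_qual->level:
 *   max_level == 0  -> the byte is a signed dBm value (two's complement)
 *   max_level  > 0  -> the byte is an unsigned relative RSSI count     */
static int decode_level(uint8_t raw_level, uint8_t max_level)
{
    if (max_level == 0)
        return (int8_t)raw_level;   /* dBm: reinterpret the byte as signed */
    return raw_level;               /* RSSI: already non-negative */
}
```

So a driver reporting dBm stores -54 as the byte 202, and userspace recovers -54; a driver reporting raw RSSI with max_qual->level = 255 passes the value through unchanged.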
Re: CVS-2-2 NMApplet empty bar explained
On Fri, Feb 04, 2005 at 01:47:49PM -0500, Dan Williams wrote: > On Fri, 2005-02-04 at 13:25 -0500, Sven wrote: > > thanks for the info - i have a few more questions/suggestions: > > > > if the card reports RSSI and i know MAX_RSSI, is the relative "signal > > strength" then RSSI / MAX_RSSI (* 100)? if one wants to use that as the > > reported "Link Quality" but for some reason do not want to use %, what > > should one do? converting the RSSI #'s to dBm and using that to compute > > a % is clearly wrong. IMHO some better guidelines as to what "Link > > Quality" in WEXT should mean is desirable. borrowing definitions from > > Joshua Bardwell in > > http://www.connect802.com/download/techpubs/2004/you_believe_D100201.pdf, > > from a practical point of view "Signal Quality," though desirable as > > link quality, is probably not feasible to get a handle on with the > > (current and future) drivers. next best is probably "Signal Strength" - > > from the RSSI values. Or is SNR better as a measurement of link quality? > > but that would require a better reporting of noise by the drivers (and > > not just a hardcoding) . > > RSSI is totally manufacturer dependent. AFAIK, Cisco uses a MAX_RSSI of > 63 for the 340/350, Atheros uses something like 30, etc. It depends on > how many voltage values the hardware can physically measure. So yes, > you do get a sort of Link Quality % when you take RSSI / MAX_RSSI * 100. > You can (and should) augment this value with things like the ipw2200 > driver does, ie receive packet errors, link speed, etc. > > Converting RSSI to dBm and using that for link quality is actually > pretty wrong. dBm is actually useful though, you can do some > interesting things with it like 1) distance from transmitter (if you > know detailed antenna and card characteristics), 2) signal power levels > and noise levels, 3) more accurately test different antennas, etc. 
> Its just not useful for getting a Link Quality % of any accuracy whatsoever, except that when the Signal approaches the Noise you know your reception is starting to suck.
>
> So 4 points to take out of this:
> 1) Drivers SHOULD use subjective values for calculating Quality, but that value SHOULD include some sort of RSSI measurement in addition to whatever else (ie invalid packets, retransmit count, link speed, etc)
> 2) Drivers SHOULD set both current level (ie qual.qual, qual.level) and max level (max_qual.qual, max_qual.level)
> 3) Drivers SHOULD use the same units for level & noise (ie, either RSSI or dBm)
> 4) Drivers SHOULD use dBm wherever they can, if they can.
>
> Dan

Yes, Dan got the philosophy of the API pretty much spot on. The qual.qual is the "feel good" number for most of us; the qual.level is the measurable characteristic of the radio for the engineers amongst us, and that explains why the API has both. Only when you try to have a single measurement do you face such a dilemma, but because we already have qual.level, we are free to be totally subjective with qual.qual. I'm personally happy with qual.qual being driver specific, and evolving over time, as long as it gives the maximum of feedback to the user about the link performance. Note that users also have different thresholds for acceptable performance, mostly based on the applications they are using (FTP versus VoIP), so they will calibrate themselves on those numbers.

> > " Signal quality is defined very briefly in 802.11. Common definitions
> > have arisen, but they are usually incorrect. The correct definition
> > hinges on the term, PN code correlation strength, which is a measure
> > of the match (correlation) between the incoming DSSS signal and an ideal
> > DSSS signal.

Believe it or not, the obsolete Wavelan driver (pre-802.11) uses exactly that as qual.qual. However, this definition only works for true DSSS modulations, which means only for 1 and 2 Mb/s, which means it's pretty much useless. I'm sure that for OFDM you could have a measure of quality based on the pilot symbols, but that would only work for OFDM bit rates.

> > " Signal to noise ratio is a general term that is used in a novel way
> > by 802.11 administrators. Most usages of the term refer to the strength
> > of the signal relative to thermal noise within a circuit, but 802.11
> > administrators use the term to refer to the strength of the signal at
> > the receive antenna relative to the ambient, non-802.11 RF energy at the
> > same frequency as the signal. While this definition isn't wrong, per se,
> > it may lead to confusion when 802.11 administrators communicate with
> > engineers who are using the more traditional definition.

Note that the Wireless Extensions never use any SNR, but use signal and noise as separate measures. I agree with the article: the signal and the noise are not measured at the same time and in the same conditions, so they are only loosely related.

Have fun...

Jean
Re: CVS-2-2 NMApplet empty bar explained
On Fri, 2005-02-04 at 09:23 -0800, Jean Tourrilhes wrote:
> MAX_RSSI converted to dBm is some totally uninteresting value. It's going to be close to 0 dBm, and a value that has nothing much to do with real operation but just an arbitrary limit on the A->D converter used for RSSI sampling. On the other hand, MIN_RSSI is very relevant, so that's why we use that.

Jean,

Right now, WEXT says that (from the comments in iwlib.c) drivers should use max_qual->level = 0 to specify that the value is in dBm, and max_qual->level > 0 for RSSI, right? (ie Absolute and Relative as said in iwlib.c/wireless.h) In other words, the value in max_qual->level specifies whether the values in qual->level and qual->noise are signed or unsigned, right? That seems a bit ambiguous to me, easy for driver writers to get wrong.

(Note that qual->qual & max_qual->qual aren't part of this discussion, since they should always be a percentage value that people can compute 100 * (qual->qual / max_qual->qual) with.)

The problem right now is that drivers don't distinguish between valid & invalid values in max_qual->level = 0. As a person using the driver, I don't know if the driver writer _meant_ to put 0 there, to tell me that the values the card returns are in dBm, or whether the driver writer didn't know _what_ to put there and just left it blank. In some ways that's a "fix the driver" problem to use IW_QUAL_LEVEL_INVALID, but it would be nice to have some way to specify what the card means in the levels it returns, either dBm or RSSI. See the prism54 driver for instance: they put a simple SNR in the qual->qual field, and stuff RSSI into the qual->level field, yet max_qual->level is left at 0. This is wrong IMHO, but if the RSSI->dBm conversion for the card isn't known, what should the correct values in qual->level/qual->noise & max_qual->level be?
Either cards are going to return dBm (and hence max_qual->level = 0) or RSSI (and hence max_qual->level > 0, if they don't know RSSI->dBm conversion tables/values), but at this time we don't know which, because of the ambiguity surrounding the max_qual->level field. It would be nice to have drivers set this explicitly somewhere...

Dan
Re: signal/link quality wrong?
Hi, here are some suggestions and food for thought (see below for a "patch" proposal). I appreciate any comments and feedback.

For the FAQ, a better (more detailed and informative) version of the whitepaper I referred to earlier is actually this: http://www.connect802.com/download/techpubs/2004/you_believe_D100201.pdf

As far as I can tell, there _are_ some (admittedly vague) guidelines as to what the "Link Quality" displayed by iwconfig should be: have a look at wireless.h and iwlib.c in Wireless Tools, or just man iwconfig. What various wireless drivers are reporting is a different issue, but at least one could make an effort to (try to) comply with the standards set in WExt (IMHO they need to be clarified a bit, though). What I have to say is mainly based on the current guidelines in WExt and man iwconfig, and it is restricted to quality stuff. ie, man iwconfig says:

Link quality
    Overall quality of the link. May be based on the level of contention or interference, the bit or frame error rate, how good the received signal is, some timing synchronisation, or other hardware metric. This is an aggregate value, and depends totally on the driver and hardware.

Signal level
    Received signal strength (RSSI - how strong the received signal is). May be arbitrary units or dBm; iwconfig uses driver meta information to interpret the raw value given by /proc/net/wireless and display the proper unit or maximum value (using 8 bit arithmetic). In Ad-Hoc mode, this may be undefined and you should use iwspy.

Noise level
    Background noise level (when no packet is transmitted). Similar comments as for Signal level.

FAQ 4.3 states that madwifi currently reports RSSI / noise floor as the link quality. But RSSI is not measured in dBm, and thus the interpretation of "20/96" as "An RSSI of 20dBm relative to a noise floor of 96dBm" is technically incorrect.

If I understand Sam (RSSI MAX is 60 or 63) and the whitepaper (Atheros recipe: subtract 95 from RSSI to derive dBm) correctly, an RSSI of 20 corresponds to 20 - 95 = -75dBm, and MAX RSSI is 63 (or 60), i.e. 63 - 95 = -32dBm (or -35dBm). IMHO a more useful (and currently implementable) specification of "Link Quality" would be "20/60", i.e. RSSI/MAX RSSI (or % values, ie scaled to 100, but I prefer RSSI/MAX RSSI), but not dBm values.

What the driver really should be reporting, according to my understanding of WExt (and my own opinion), is the "quality of the link," which probably should be the "signal quality." The "signal quality" is very different from the "signal strength" and from SNR (signal-to-noise ratio). While either can in some sense be used to give a "Link Quality," signal quality is probably the closest to what is meant by "Link Quality." Signal quality is hard to determine: you need to take into account not only the noise level, but also the amount of corruption in the environment between the access point and the client (see the above paper for an excellent discussion of this issue). I understand that getting the "signal quality" is difficult and probably currently not really measurable with madwifi (one would have to have a better measure of the noise, errors/corrupted/discarded packets transmitted, data rates...). Next best to signal quality is signal strength, which is RSSI / MAX RSSI. Given that we _do have_ a relative value, IMHO the driver should report the RSSI / MAX RSSI value instead of a dBm value as qual. I am told the ipw2200 driver does augment this value with things like receive packet errors, link speed, etc. to get a more realistic signal quality. Figuring out an "improved" way of getting eg the qual or the level would be a future challenge. In the above example, for quality the driver should be reporting absolute values: iq->qual = 20 (= RSSI) and range->max_qual.qual = 60 (= MAX RSSI).

As WExt prefers dBm values (and not RSSI) for level and noise (see below for the WExt specs), what should be reported by the driver (as in net/ieee80211_wireless.c) is

iq->qual = rssi;       /* or some calibrated RSSI value */
if (iq->qual > 63)     /* this should be the MAX RSSI ... so it might be 60 for some cards */
        iq->qual = 63;
iq->noise = -95;       /* noise floor in dBm, should be + real noise (needs to be determined) */
iq->level = rssi - 95; /* -95 because that converts rssi to dBm. does noise belong here? */

The max values should also be set:

range->max_qual.qual = 63;   /* relative, MAX RSSI ... 60 for some cards */
range->max_qual.level = -32; /* in dBm, MAX RSSI - 95, = -35 for some */
range->max_qual.noise = -95; /* noise floor */

I know this is very crude, but IMHO this would make madwifi get closer to complying with the WExt standards.

Sven

FYI:

/* Quality of link & SNR stuff */
/* Quality range (link, level, noise)
 * If the quality is absolu
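The arithmetic in the proposal above can be checked with two small helpers. This is a sketch of the Atheros recipe quoted in the thread ("subtract 95 from RSSI to derive dBm"); the constants 63 and -95 are the thread's assumptions, not values from any driver.

```c
#include <assert.h>

#define MAX_RSSI     63    /* assumed chipset maximum; 60 on some cards */
#define NOISE_FLOOR (-95)  /* assumed dBm offset in the Atheros recipe */

/* RSSI to dBm per the recipe above: dBm = RSSI - 95. */
static int rssi_to_dbm(int rssi)
{
    return rssi + NOISE_FLOOR;
}

/* Relative signal strength as a percentage: RSSI / MAX_RSSI * 100,
 * clamped so an out-of-range reading stays within 0..100. */
static int rssi_to_percent(int rssi)
{
    if (rssi > MAX_RSSI)
        rssi = MAX_RSSI;
    if (rssi < 0)
        rssi = 0;
    return rssi * 100 / MAX_RSSI;
}
```

For the worked example in the text, an RSSI of 20 gives 20 - 95 = -75 dBm, and MAX RSSI 63 gives 63 - 95 = -32 dBm.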
Re: CVS-2-2 NMApplet empty bar explained
On Fri, 2005-02-04 at 13:25 -0500, Sven wrote:
> thanks for the info - i have a few more questions/suggestions:
>
> if the card reports RSSI and i know MAX_RSSI, is the relative "signal strength" then RSSI / MAX_RSSI (* 100)? if one wants to use that as the reported "Link Quality" but for some reason does not want to use %, what should one do? converting the RSSI #'s to dBm and using that to compute a % is clearly wrong. IMHO some better guidelines as to what "Link Quality" in WEXT should mean are desirable. borrowing definitions from Joshua Bardwell in http://www.connect802.com/download/techpubs/2004/you_believe_D100201.pdf, from a practical point of view "Signal Quality," though desirable as link quality, is probably not feasible to get a handle on with the (current and future) drivers. next best is probably "Signal Strength" - from the RSSI values. or is SNR better as a measurement of link quality? but that would require better reporting of noise by the drivers (and not just a hardcoded value).

RSSI is totally manufacturer dependent. AFAIK, Cisco uses a MAX_RSSI of 63 for the 340/350, Atheros uses something like 30, etc. It depends on how many voltage values the hardware can physically measure. So yes, you do get a sort of Link Quality % when you take RSSI / MAX_RSSI * 100. You can (and should) augment this value with things like the ipw2200 driver does, ie receive packet errors, link speed, etc.

Converting RSSI to dBm and using that for link quality is actually pretty wrong. dBm is actually useful though; you can do some interesting things with it, like 1) distance from transmitter (if you know detailed antenna and card characteristics), 2) signal power levels and noise levels, 3) more accurately testing different antennas, etc. It's just not useful for getting a Link Quality % of any accuracy whatsoever, except that when the Signal approaches the Noise you know your reception is starting to suck.
So, 4 points to take out of this:

1) Drivers SHOULD use subjective values for calculating Quality, but that value SHOULD include some sort of RSSI measurement in addition to whatever else (ie invalid packets, retransmit count, link speed, etc)
2) Drivers SHOULD set both current level (ie qual.qual, qual.level) and max level (max_qual.qual, max_qual.level)
3) Drivers SHOULD use the same units for level & noise (ie, either RSSI or dBm)
4) Drivers SHOULD use dBm wherever they can, if they can.

Dan
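The four points above can be sketched as a driver filling in its quality fields. This is an illustrative outline only, using a stand-in struct and the thread's assumed RSSI-95 offset, not any real driver's code; a real driver would fold packet errors and link speed into the subjective qual value.

```c
#include <assert.h>
#include <stdint.h>

/* Minimal stand-in for the kernel's struct iw_quality (fields only). */
struct iw_quality {
    uint8_t qual, level, noise;
};

/* Fill current and max quality per the four points: a subjective qual
 * built on RSSI, level and noise both in dBm (same units), and both
 * current and max values reported. max.level = 0 flags "dBm" per the
 * convention discussed in this thread. */
static void fill_quality(struct iw_quality *cur, struct iw_quality *max,
                         int rssi, int max_rssi)
{
    int dbm = rssi - 95;             /* assumed RSSI->dBm offset */

    cur->qual  = (uint8_t)rssi;      /* subjective quality, RSSI-based */
    cur->level = (uint8_t)dbm;       /* signed dBm stored in a byte */
    cur->noise = (uint8_t)(-95);     /* noise floor in dBm */

    max->qual  = (uint8_t)max_rssi;  /* lets userspace compute a % */
    max->level = 0;                  /* 0 => level/noise are in dBm */
    max->noise = 0;
}
```

With rssi = 20 and max_rssi = 63, cur.level holds the byte 181 (i.e. -75 dBm) and max.level is 0 to mark the dBm convention.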
Re: CVS-2-2 NMApplet empty bar explained
Thanks for the info - I have a few more questions/suggestions.

If the card reports RSSI and I know MAX_RSSI, is the relative "signal strength" then RSSI / MAX_RSSI (* 100)? If one wants to use that as the reported "Link Quality" but for some reason does not want to use %, what should one do? Converting the RSSI #'s to dBm and using that to compute a % is clearly wrong. IMHO some better guidelines as to what "Link Quality" in WEXT should mean are desirable. Borrowing definitions from Joshua Bardwell in http://www.connect802.com/download/techpubs/2004/you_believe_D100201.pdf, from a practical point of view "Signal Quality," though desirable as link quality, is probably not feasible to get a handle on with the (current and future) drivers. Next best is probably "Signal Strength" - from the RSSI values. Or is SNR better as a measurement of link quality? But that would require better reporting of noise by the drivers (and not just a hardcoded value).

Sven

Summary/conclusions of the paper mentioned:

" Signal strength is defined in 802.11 as Received Signal Strength Indicator (RSSI). RSSI is intended to be used as a relative value within the chipset. It is not associated with any particular mW scale and is not required to be of any particular accuracy or precision. Therefore, the signal strength numbers reported by an 802.11 card will probably not be consistent between two vendors, and should not be assumed to be particularly accurate or precise.

" Signal quality is defined very briefly in 802.11. Common definitions have arisen, but they are usually incorrect. The correct definition hinges on the term PN code correlation strength, which is a measure of the match (correlation) between the incoming DSSS signal and an ideal DSSS signal. The common equation of signal quality and signal to noise ratio is incorrect.

" Signal to noise ratio is a general term that is used in a novel way by 802.11 administrators. Most usages of the term refer to the strength of the signal relative to thermal noise within a circuit, but 802.11 administrators use the term to refer to the strength of the signal at the receive antenna relative to the ambient, non-802.11 RF energy at the same frequency as the signal. While this definition isn't wrong, per se, it may lead to confusion when 802.11 administrators communicate with engineers who are using the more traditional definition.

On Fri, 2005-02-04 at 09:23 -0800, Jean Tourrilhes wrote:
> On Fri, Feb 04, 2005 at 11:10:46AM -0500, Dan Williams wrote:
> > On Fri, 2005-02-04 at 10:53 -0500, Sven wrote:
> > > iwlib.c in WEXT says (in the example)
> > >     * 2) value is -54dBm. noise floor of the radio is -104dBm.
> > >     *    qual->value = -54 = 202 ; range->max_qual.value = -104 = 152
> > > i'm confused. why is the max value the noise floor??? in Atheros chips, given the value in dBm comes from RSSI, should max_qual.value not be MAX_RSSI (converted into dBm)?
> >
> > I think this is actually wrong... What I _think_ it should say is:
> >
> >     * 2) level is -54dBm. noise floor of the radio is -104dBm.
> >     *    qual->level = -54 = 202 ; range->max_qual.noise = -104 = 152
>
> Doh! Stupid bug! You are right.
>
> > Noise levels _do_ change dynamically, which is something else that the drivers don't do (ahem, atmel, madwifi, and airo for starters). When you turn on your microwave, that totally screws the 2.4GHz frequency range and impacts 802.11 communications. Since the microwave is random energy, it is extra noise and therefore decreases the Signal to Noise ratio (ie, the noise value increases, say from -95dBm -> -85dBm, due to the extra energy from the microwave, while the signal may stay the same).
>
> Yes.
>
> > You _always_ have a noise floor, which is the normal value where in good conditions the card can no longer distinguish the usable radio energy from background energy, but most drivers at this time use that noise floor level in the "qual.noise" field and not the "max_qual.noise" field, because they evidently don't sample noise on each channel dynamically, or don't know how to pull that value off the card.
>
> You are correct about the definition of noise floor. This is a characteristic of the radio and the frequency band, and it is usually in the spec (you can't read it from the hw). I don't believe any driver uses the noise floor in qual.noise; that would not make sense.
>
> > Jean: can you give some clarification on that statement in iwlib.c?
>
> MAX_RSSI converted to dBm is some totally uninteresting value. It's going to be close to 0 dBm, and a value that has nothing much to do with real operation but just an arbitrary limit on the A->D converter used for RSSI sampling. On the other hand, MIN_RSSI is very relevant, so that's why we use that.
>
> > Dan
>
> Jean
Re: CVS-2-2 NMApplet empty bar explained
On Fri, Feb 04, 2005 at 11:10:46AM -0500, Dan Williams wrote:
> On Fri, 2005-02-04 at 10:53 -0500, Sven wrote:
> > iwlib.c in WEXT says (in the example)
> >     * 2) value is -54dBm. noise floor of the radio is -104dBm.
> >     *    qual->value = -54 = 202 ; range->max_qual.value = -104 = 152
> > i'm confused. why is the max value the noise floor??? in Atheros chips, given the value in dBm comes from RSSI, should max_qual.value not be MAX_RSSI (converted into dBm)?
>
> I think this is actually wrong... What I _think_ it should say is:
>
>     * 2) level is -54dBm. noise floor of the radio is -104dBm.
>     *    qual->level = -54 = 202 ; range->max_qual.noise = -104 = 152

Doh! Stupid bug! You are right.

> Noise levels _do_ change dynamically, which is something else that the drivers don't do (ahem, atmel, madwifi, and airo for starters). When you turn on your microwave, that totally screws the 2.4GHz frequency range and impacts 802.11 communications. Since the microwave is random energy, it is extra noise and therefore decreases the Signal to Noise ratio (ie, the noise value increases, say from -95dBm -> -85dBm, due to the extra energy from the microwave, while the signal may stay the same).

Yes.

> You _always_ have a noise floor, which is the normal value where in good conditions the card can no longer distinguish the usable radio energy from background energy, but most drivers at this time use that noise floor level in the "qual.noise" field and not the "max_qual.noise" field, because they evidently don't sample noise on each channel dynamically, or don't know how to pull that value off the card.

You are correct about the definition of noise floor. This is a characteristic of the radio and the frequency band, and it is usually in the spec (you can't read it from the hw). I don't believe any driver uses the noise floor in qual.noise; that would not make sense.

> Jean: can you give some clarification on that statement in iwlib.c?

MAX_RSSI converted to dBm is some totally uninteresting value. It's going to be close to 0 dBm, and a value that has nothing much to do with real operation but just an arbitrary limit on the A->D converter used for RSSI sampling. On the other hand, MIN_RSSI is very relevant, so that's why we use that.

> Dan

Jean
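The "-54 = 202" arithmetic quoted above comes from storing negative dBm values in a single unsigned byte. A minimal sketch of that encoding (my illustration of the iwlib.c example, not code from iwlib.c itself):

```c
#include <assert.h>
#include <stdint.h>

/* dBm values are negative, but the wireless-extensions quality fields
 * are unsigned bytes, so the value is stored modulo 256 ... */
static uint8_t dbm_encode(int dbm)
{
    return (uint8_t)dbm;            /* -54 -> 202, -104 -> 152 */
}

/* ... and read back by reinterpreting the byte as two's-complement. */
static int dbm_decode(uint8_t byte)
{
    return (int8_t)byte;            /* 202 -> -54, 152 -> -104 */
}
```

This is the "8 bit arithmetic" the iwconfig man page refers to when it says the raw /proc/net/wireless value is interpreted via driver meta information.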
Re: CVS-2-2 NMApplet empty bar explained
I have seen my card spike to -15 dBm when held next to an AP. The Cisco lookup table starts at -10 dBm. I agree that [-90, -20]. Is a good range to start with. I think the cards will all output biased values (+256) for qual-> level, max_qual->level, max_qual->noise but I could be wrong. If agreement is ever reached on how to do the associated percentage calculation and the scanned percentage calculation, an appropriate patch could be submitted to each wireless driver project. The ipw2200 people seemed eager to have input on this issue. ipw2200.c is filled with TO DO and FIND A GOOD VALUE comments. The developers are just waiting for someone to bit the bullet. Dan Williams wrote: On Fri, 2005-02-04 at 10:54 -0500, Bill Moss wrote: Take a look at the next message I sent about patching the NM nm_wireless_qual_to_percent function. If a driver does not report max_qual->level and max->qual->noise, make some good guess. The physical lower limit for what a card can do is about -96 dBm. The upper limit varies but is always less than -10 dBm. For the ipw2200, -20 dBm is about right. Percentage is at best a rough indicator and does not directly relate to a physical quantity so I don't think it matters that much to get it "exactly right". If the driver does not specify max_qual->level and max->qual->noise, how about setting max_qual->level to 221 (-35 dBm) max_qual->noise to 161 (-95 dBm) I've seen mention of cards capable of receiving around -80 dBm down to -97 dBm. I think around -90 dBm would be a good value there... I also don't think I've ever seen a card (out of my stack of 6 and a few more from friends and coworkers) that's been able to get above -20 dBm, so that might be a good starter value. We could also adjust this stuff on the fly (at least the upper bound here) and maybe move that down as low as -30dBm over time, though that gets more complicated. I guess the answer would be "fix the driver" if people complained enough. Sounds like a good idea though. 
Thanks for the tip. Dan -- Bill Moss Professor, Mathematical Sciences Clemson University ___ NetworkManager-list mailing list NetworkManager-list@gnome.org http://mail.gnome.org/mailman/listinfo/networkmanager-list
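[Editor's example] The +256 bias mentioned above can be undone trivially. A minimal sketch; the helper name is mine, not from any driver:

    #include <assert.h>

    /* Wireless extensions report signed dBm levels in an unsigned 8-bit
     * field by adding 0x100 (256); subtracting it recovers the dBm value. */
    static int biased_to_dbm(unsigned char biased)
    {
        return (int)biased - 0x100;
    }

    int main(void)
    {
        assert(biased_to_dbm(202) == -54);  /* the iwlib.c example value   */
        assert(biased_to_dbm(221) == -35);  /* Bill's suggested max level  */
        assert(biased_to_dbm(161) == -95);  /* Bill's suggested noise floor */
        return 0;
    }

This is why 221 and 161 appear in the suggested defaults: they are just -35 dBm and -95 dBm after biasing.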
Re: CVS-2-2 NMApplet empty bar explained
On Fri, 2005-02-04 at 10:54 -0500, Bill Moss wrote: > Take a look at the next message I sent about patching the NM > nm_wireless_qual_to_percent function. If a driver does not report > max_qual->level and max_qual->noise, make a good guess. The physical > lower limit for what a card can do is about -96 dBm. The upper limit > varies but is always less than -10 dBm. For the ipw2200, -20 dBm is > about right. Percentage is at best a rough indicator and does not > directly relate to a physical quantity, so I don't think it matters that > much to get it "exactly right". If the driver does not specify > max_qual->level and max_qual->noise, how about setting > > max_qual->level to 221 (-35 dBm) > max_qual->noise to 161 (-95 dBm) I've seen mention of cards capable of receiving around -80 dBm down to -97 dBm. I think around -90 dBm would be a good value there... I also don't think I've ever seen a card (out of my stack of 6 and a few more from friends and coworkers) that's been able to get above -20 dBm, so that might be a good starter value. We could also adjust this stuff on the fly (at least the upper bound here) and maybe move that down as low as -30 dBm over time, though that gets more complicated. I guess the answer would be "fix the driver" if people complained enough. Sounds like a good idea though. Thanks for the tip. Dan ___ NetworkManager-list mailing list NetworkManager-list@gnome.org http://mail.gnome.org/mailman/listinfo/networkmanager-list
Re: CVS-2-2 NMApplet empty bar explained
On Fri, 2005-02-04 at 10:53 -0500, Sven wrote: > iwlib.c in WEXT says (in the example) > * 2) value is -54dBm. noise floor of the radio is -104dBm. > * qual->value = -54 = 202 ; range->max_qual.value = -104 = 152 > i'm confused. why is the max value the noise floor??? in Atheros chips, > given the value in dBm comes from RSSI, should max_qual.value not be > MAX_RSSI (converted into dBm)? I think this is actually wrong... What I _think_ it should say is: * 2) level is -54dBm. noise floor of the radio is -104dBm. * qual->level = -54 = 202 ; range->max_qual.noise = -104 = 152 Noise levels _do_ change dynamically, which is something else that the drivers don't do (ahem, atmel, madwifi, and airo for starters). When you turn on your microwave, that totally screws the 2.4GHz frequency range and impacts 802.11 communications. Since the microwave is random energy, it is extra noise and therefore decreases the Signal to Noise ratio (i.e., the noise value increases, say from -95 dBm to -85 dBm, due to the extra energy from the microwave, while the signal may stay the same). You _always_ have a noise floor, which is the normal value where in good conditions the card can no longer distinguish the usable radio energy from background energy, but most drivers at this time use that noise floor level in the "qual.noise" field and not the "max_qual.noise" field, because they evidently don't sample noise on each channel dynamically, or don't know how to pull that value off the card. Jean: can you give some clarification on that statement in iwlib.c? Dan ___ NetworkManager-list mailing list NetworkManager-list@gnome.org http://mail.gnome.org/mailman/listinfo/networkmanager-list
Re: CVS-2-2 NMApplet empty bar explained
Take a look at the next message I sent about patching the NM nm_wireless_qual_to_percent function. If a driver does not report max_qual->level and max_qual->noise, make a good guess. The physical lower limit for what a card can do is about -96 dBm. The upper limit varies but is always less than -10 dBm. For the ipw2200, -20 dBm is about right. Percentage is at best a rough indicator and does not directly relate to a physical quantity, so I don't think it matters that much to get it "exactly right". If the driver does not specify max_qual->level and max_qual->noise, how about setting max_qual->level to 221 (-35 dBm) and max_qual->noise to 161 (-95 dBm), and using a linear model for percentage like the one I put in the patch message? The 70 coefficient may need some adjustment for these guessed values. The CLAMP function will take care of any overshoot. Dan Williams wrote: On Fri, 2005-02-04 at 01:00 -0500, Bill Moss wrote: Now look at the results for the scanned AP's (list of one here). qual and noise are reported as 0. This is why the CVS-2-2 NMApplet bar is empty for the ipw2200 driver. NM uses the qual value of zero to create the bar length. We can convert the level of 198 to -58 dBm, but NM has no idea how ipw2200 would convert this into a percentage. To make matters worse, the dBm range varies from driver to driver. We are stuck. There is no way the NM nm_wireless_qual_to_percent function can deal with this. A new nm_wireless_level_to_percent function could be created to do this conversion, but it would be guesswork at best and impossible to normalize across all drivers. More or less correct... I spent much of last night looking at stuff online trying to figure out ways to be able to use noise & signal levels for something, but without much luck. Drivers simply have to report the noise, and many (like Atheros) hardcode the noise levels. 
This doesn't work well for the drivers because they have to support a couple different actual chips, and sometimes the same value doesn't necessarily work for all chips. The real solution is to have drivers report their own quality based off RSSI (which all cards MUST have to be 802.11b compliant) and max RSSI. Also, we pretty much need the max_qual.level as well if there's no quality information. As measured on my Netgear WG511T right next to my Linksys WRT54G, that was a fairly consistent -39 dBm (saw a -30 once too). However, unless I'm wrong, this varies by access point, since each access point puts out different levels of power and has different antennas, both of which affect the actual received signal power on the card. However, using that max_qual.level value, we could take the noise level of -95 dBm and we have an upper and lower bound of some sort. Interestingly enough, I found this page that appears to describe what Windows uses, at least for some cards. We may in the end need to do some approximation of this to achieve a signal "quality" measurement from just signal and noise. http://is.med.ohio-state.edu/Wireless%20FAQ.htm Cards use the SNR to determine what speed to drop down to, so some combination of current card speed and SNR may work for us in the absence of actual quality information reported from the driver. Only as a last resort of course. But using card speed doesn't help us for scanned access points, since we're not connected to them :( The search continues. Dan -- Bill Moss Professor, Mathematical Sciences Clemson University ___ NetworkManager-list mailing list NetworkManager-list@gnome.org http://mail.gnome.org/mailman/listinfo/networkmanager-list
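[Editor's example] Since cards step down their data rate as SNR drops, a rough 802.11b site-survey indicator could in principle be derived from signal and noise alone, as the message above suggests. A sketch; the SNR thresholds here are illustrative guesses of mine, not values from any driver or standard:

    #include <assert.h>

    /* SNR in dB from biased WEXT level/noise readings (dBm + 0x100);
     * the bias cancels in the subtraction. */
    static int snr_db(unsigned char level, unsigned char noise)
    {
        return (int)level - (int)noise;
    }

    /* Map SNR to an 802.11b rate in tenths of Mbps (kept integral).
     * Threshold values are assumptions for illustration only. */
    static int rate_tenths_for_snr(int snr)
    {
        if (snr >= 25) return 110;  /* 11 Mbps  */
        if (snr >= 18) return 55;   /* 5.5 Mbps */
        if (snr >= 11) return 20;   /* 2 Mbps   */
        return 10;                  /* 1 Mbps   */
    }

    int main(void)
    {
        /* -58 dBm signal (198 biased) over a -95 dBm noise floor (161
         * biased): SNR = 37 dB, in the 11 Mbps band under these guesses. */
        assert(snr_db(198, 161) == 37);
        assert(rate_tenths_for_snr(snr_db(198, 161)) == 110);
        return 0;
    }

As the thread notes, this only works for the associated AP unless the driver reports per-AP noise during scans.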
Re: CVS-2-2 NMApplet empty bar explained
Dan: iwlib.c in WEXT says (in the example) * 2) value is -54dBm. noise floor of the radio is -104dBm. * qual->value = -54 = 202 ; range->max_qual.value = -104 = 152 I'm confused. Why is the max value the noise floor? In Atheros chips, given the value in dBm comes from RSSI, should max_qual.value not be MAX_RSSI (converted into dBm)? Sven On Fri, 2005-02-04 at 10:04 -0500, Dan Williams wrote: > card. However, using that max_qual.level value, we could take the noise > level of -95dBm and we have an upper and lower bound of some sort. > ___ NetworkManager-list mailing list NetworkManager-list@gnome.org http://mail.gnome.org/mailman/listinfo/networkmanager-list
CVS-2-2 ipw2200 NMApplet bar empty patch
The function nm_wireless_qual_to_percent has a pointer to the max_qual structure as its second argument. In the ipw2200 driver, max_qual->qual = 100; max_qual->level = 0; max_qual->noise = 0; I changed this to max_qual->qual = 100; max_qual->level = PERFECT_RSSI + 0x100; max_qual->noise = WORST_RSSI + 0x100; The latter two values are biased dBm values. Each driver has its own way to compute percentage from RSSI values. Formulas vary based on the units RSSI is measured in. In some cases a lookup table is used. To get a rough indicator of percentage signal level for use in the NMApplet bar, I modified the nm_wireless_qual_to_percent function as indicated below. Now the NMApplet hover gives percentage values consistent with what I get from iwconfig for the associated AP, and the NMApplet bars show an approximate signal level for each scanned AP. My percentage signal level formula is a linear model which underestimates the signal level, say in comparison to the values that Windows XP reports. If a driver does not set max_qual->level and max_qual->noise, the nm_wireless_qual_to_percent function could make some guesses. RSSI in dBm measures a real physical quantity, power. Percentage signal level is a man-made construct, and I wonder what value it really has. If you are doing a building site map, say for 802.11b, what you really want to do is find the 11 Mbps areas, the 5.5 Mbps areas, the 2 Mbps areas, and the 1 Mbps areas. This type of information gives the user an indication of throughput, which is more useful than a percent. What the heck does 80% mean in the wireless world across drivers -- next to nothing. 
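[Editor's example] The driver-side change described above amounts to publishing the card's expected dBm endpoints in biased form. A self-contained sketch; the constant values and the struct are stand-ins of mine (the real ipw2200 PERFECT_RSSI/WORST_RSSI and struct iw_quality may differ):

    #include <assert.h>

    /* Illustrative stand-ins for the driver's tuning constants. */
    #define PERFECT_RSSI (-20)   /* best level the card is expected to see */
    #define WORST_RSSI   (-85)   /* level at which the link is unusable    */

    struct iw_quality_stub {     /* minimal stand-in for struct iw_quality */
        unsigned char qual, level, noise;
    };

    /* Publish max quality info the way the patch describes: qual as a
     * percentage ceiling, level and noise as biased dBm endpoints. */
    static void fill_max_qual(struct iw_quality_stub *max_qual)
    {
        max_qual->qual  = 100;
        max_qual->level = (unsigned char)(PERFECT_RSSI + 0x100);
        max_qual->noise = (unsigned char)(WORST_RSSI + 0x100);
    }

    int main(void)
    {
        struct iw_quality_stub mq;
        fill_max_qual(&mq);
        assert(mq.level == 236);  /* -20 dBm, biased */
        assert(mq.noise == 171);  /* -85 dBm, biased */
        return 0;
    }

With nonzero endpoints published, NM's fallback path has a real range to interpolate over instead of dividing into zero.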
/* Try using the card's idea of the signal quality first as long as it
   tells us what the max quality is */
if ((qual->qual != 0) && (max_qual->qual != 0)
        && !(max_qual->updated & IW_QUAL_QUAL_INVALID)
        && !(qual->updated & IW_QUAL_QUAL_INVALID))
{
        percent = (int)(100 * ((double)qual->qual / (double)max_qual->qual));
}
else
{
        if ((qual->level != 0) && (max_qual->level != 0) && (max_qual->noise != 0))
        {
                percent = (int)(100 - 70 * (((double)max_qual->level - (double)qual->level)
                                / ((double)max_qual->level - (double)max_qual->noise)));
        }
        else if ((qual->level > max_qual->level) && (qual->noise != 0))

-- Bill Moss Professor, Mathematical Sciences Clemson University ___ NetworkManager-list mailing list NetworkManager-list@gnome.org http://mail.gnome.org/mailman/listinfo/networkmanager-list
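[Editor's example] The linear fallback in the patch excerpt above can be exercised end to end. A self-contained sketch using the suggested defaults of 221 (-35 dBm) and 161 (-95 dBm), with the clamp made explicit (the real NM code would use glib's CLAMP; the function name here is mine):

    #include <assert.h>

    #define CLAMP(x, lo, hi) ((x) < (lo) ? (lo) : ((x) > (hi) ? (hi) : (x)))

    /* Bill's linear model: percent falls off by up to 70 points across the
     * span from max level down to the noise floor (all values biased +0x100). */
    static int level_to_percent(int level, int max_level, int noise)
    {
        int percent = (int)(100 - 70.0 * ((double)(max_level - level)
                                          / (double)(max_level - noise)));
        return CLAMP(percent, 0, 100);
    }

    int main(void)
    {
        /* -58 dBm scanned level (198 biased) against the 221/161 defaults:
         * 100 - 70 * (23/60) = 73.17, truncated to 73. */
        assert(level_to_percent(198, 221, 161) == 73);
        /* overshoot above max_level is clamped to 100 */
        assert(level_to_percent(240, 221, 161) == 100);
        return 0;
    }

Note the function never reaches 0 by the formula alone (the minimum at the noise floor is 30); the CLAMP only guards against readings outside the assumed [noise, max_level] range.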
Re: CVS-2-2 NMApplet empty bar explained
On Fri, 2005-02-04 at 01:00 -0500, Bill Moss wrote: > Now look at the results for the scanned AP's (list of one here). qual > and noise are reported as 0. This is why the CVS-2-2 NMApplet bar is > empty for the ipw2200 driver. NM uses the qual value of zero to create > the bar length. We can convert the level of 198 to -58 dBm but NM has no > idea how ipw2200 would convert this into a percentage. To make matters > worse, the dBm range varies from driver to driver. We are stuck. There is > no way the NM nm_wireless_qual_to_percent function can deal with this. A > new nm_wireless_level_to_percent function could be created to do this > conversion but it would be guesswork at best and impossible to > normalize across all drivers. More or less correct... I spent much of last night looking at stuff online trying to figure out ways to be able to use noise & signal levels for something, but without much luck. Drivers simply have to report the noise, and many (like Atheros) hardcode the noise levels. This doesn't work well for the drivers because they have to support a couple different actual chips, and sometimes the same value doesn't necessarily work for all chips. The real solution is to have drivers report their own quality based off RSSI (which all cards MUST have to be 802.11b compliant) and max RSSI. Also, we pretty much need the max_qual.level as well if there's no quality information. As measured on my Netgear WG511T right next to my Linksys WRT54G, that was a fairly consistent -39 dBm (saw a -30 once too). However, unless I'm wrong, this varies by access point, since each access point puts out different levels of power and has different antennas, both of which affect the actual received signal power on the card. However, using that max_qual.level value, we could take the noise level of -95 dBm and we have an upper and lower bound of some sort. Interestingly enough, I found this page that appears to describe what Windows uses, at least for some cards. 
We may in the end need to do some approximation of this to achieve a signal "quality" measurement from just signal and noise. http://is.med.ohio-state.edu/Wireless%20FAQ.htm Cards use the SNR to determine what speed to drop down to, so some combination of current card speed and SNR may work for us in the absence of actual quality information reported from the driver. Only as a last resort of course. But using card speed doesn't help us for scanned access points, since we're not connected to them :( The search continues. Dan ___ NetworkManager-list mailing list NetworkManager-list@gnome.org http://mail.gnome.org/mailman/listinfo/networkmanager-list