Nice post, Gabriel.  I really liked the reminder that what we call
precision doesn't necessarily mean much.  One dB is about 12 percent in
amplitude (roughly 26 percent in power).  That's not really very good --
but for RF it's about what we can get, generally.
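To put numbers on that, here is a throwaway sketch of the dB-to-ratio conversion (amplitude quantities use 20*log10, power quantities 10*log10):

```python
# Convert a dB figure to the equivalent linear ratio.
# Field strength / voltage are amplitude quantities (20*log10);
# power is a power quantity (10*log10).

def db_to_amplitude_ratio(db):
    return 10 ** (db / 20.0)

def db_to_power_ratio(db):
    return 10 ** (db / 10.0)

print(db_to_amplitude_ratio(1.0))  # ~1.122, i.e. about 12% in amplitude
print(db_to_power_ratio(1.0))      # ~1.259, i.e. about 26% in power
```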
     
On the other hand, knowing the precise frequency -- down to kHz -- is
often helpful, especially where one must discriminate among several
different sources close to each other.

Cheers,

Cortland


______________________________ Reply Separator _________________________________
Subject: Re: Measurement Uncertainty
Author:  Gabriel Roy/HNS <gabriel_...@notesgw.hns.com> at internet
List-Post: emc-pstc@listserv.ieee.org
Date:    1/13/97 13:18


Michael Barge asks: 
          
>(1)  Do most labs report an uncertainty measurement in the test report, on
>the data sheet, on a certificate of compliance?

>(2)  How did you generate the measurement of uncertainty for emission
>tests? For immunity tests?

> AND MOST IMPORTANTLY

>(3)  What do you tell the customer when he is below the limit by less than
>the measurement uncertainty? When he is above by less?
          
---------------------------------------------------------------- 
          
Well Michael, the lab that I do most of my business with does not report the 
measurement uncertainty in my reports, and I am very satisfied that they do 
not. 
          
I certainly do not include a value in the declaration of conformity, nor do I 
ever intend to. It would be ludicrous for our declaration to state that we are 
96% confident that we comply. Either we comply or we don't. 
          
In my opinion, the measurement uncertainty is unnecessary and its inclusion can 
only lead to stricter limits being enforced. Enforced by the unknowing in the 
beginning and eventually by decree "because that's the way it's done now".
          
I have dealt with a lab that included the measurement uncertainty in their 
report; it was 2 dB, and if we were closer than 2 dB to the limit they 
considered it a failure. Evidently, because their instrumentation was not 
sufficiently accurate to give them measurements they could be confident in, 
our equipment suffered. I don't use that lab any more. 
          
My contention is that this measurement uncertainty is a lot of hogwash, the 
same as reporting frequencies to three decimal places and dB's to two decimal 
places. That misses the original intent of the requirement, which was to not 
interfere with others' communications. The method used originally was a radio 
receiver with front-mounted rotary switches that dropped attenuation pads 
into the receive circuit (e.g., the older Rohde & Schwarz receivers used by 
VDE). The level was read as the amount of attenuation needed to bring the 
incoming signal to a null. Thus one either passed or failed, with no regard to 
fractions of dBs. As far as the frequency was concerned, it was read off a log 
scale similar to a radio tuner (before digital readouts), and one was lucky to 
get the reading down to 10 MHz on the upper scale. 
          
It wasn't until the introduction of the spectrum analyzer by HP around 1979-80 
that we could get readings down to three decimal places. Why did HP pick three? 
If they had picked five, lab assistants and engineers would now be blindly 
writing values down to five decimal places. It doesn't make the measurement 
more accurate; it only gives a feeling of being very scientific. One can get 
the diameter of a circle by using 3.14 for pi just as well as by using pi to 
27 decimal places. 
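The pi point is easy to check; for any dimension a test lab deals in, the two-decimal approximation is off by only about 0.05 percent (the 3-meter figure below is just an arbitrary example):

```python
import math

d = 3.0  # meters -- an arbitrary example dimension
c_rough = 3.14 * d      # circumference with the two-decimal approximation
c_exact = math.pi * d   # circumference with full double precision

err_percent = abs(c_rough - c_exact) / c_exact * 100
print(err_percent)  # ~0.05 percent -- far below any RF measurement error
```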
          
Proponents of the measurement uncertainty practice should take a look at the way
the VDE engineers write down their measurements. They have a sheet of log 
paper and a pencil. They put the point of the pencil at their best approximation
of frequency and dB level on the log paper and make a large dot. They then pull 
down on the dot to make a tail, so that it will be easier to identify. Again, 
you either pass or you fail. The size of the pencil dot and the way the VDE 
engineer places the pencil on the paper greatly overshadow any measurement 
uncertainty there might have been in the system. 
          
One final comment: our instrumentation has always been accurate, and we have 
always known how accurate it is. The accuracy is specified by the manufacturer 
and the instrumentation is kept within calibration, and the regulatory bodies 
have been satisfied with that process. I have a hunch, and it is only a hunch, 
that this measurement uncertainty was developed by an EMI engineer who had no 
way of coming up with a reason for getting different measurements at 
different sites. The old "It passes at home by 2 dB and fails at the lab by 3 
dB" syndrome. So, in an attempt to explain EMI in logical terms (!!!) to his 
boss, he lumped together the variations of all factors (cable losses, antenna 
factors, amplifiers, spectrum analyzer, preselector, attenuating pads, 
connector losses, site attenuation, etc.) and came up with a fudge factor 
which he called "measurement uncertainty". Now he could go to his boss and 
explain why his measurements at home were different from the measurements at 
the test site. 
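For what it's worth, the usual way such a lumped figure is built (per the GUM approach, assuming independent contributions) is a root-sum-square of the individual standard uncertainties, expanded by a coverage factor of 2 for ~95% confidence. The dB values below are invented purely for illustration, not real instrument data:

```python
import math

# Hypothetical per-element standard uncertainties, in dB (illustrative only).
contributions = {
    "cable loss": 0.3,
    "antenna factor": 0.8,
    "receiver/analyzer": 0.5,
    "attenuator pads": 0.2,
    "site attenuation": 1.0,
}

# Combined standard uncertainty: root-sum-square, assuming independence.
u_c = math.sqrt(sum(u ** 2 for u in contributions.values()))

# Expanded uncertainty at ~95% confidence: coverage factor k = 2.
U = 2 * u_c

print(f"combined: {u_c:.2f} dB, expanded (k=2): {U:.2f} dB")
```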
          
But I bet that he still had to fix his equipment. 
          
Enough ranting. 
          
Cheers, 
          
Gabriel Roy
Hughes Network Systems, MD
          
It's pretty obvious that the opinions are my own. Jim Eickler's invisible 
friend doesn't even talk to me any more. 
