I'm restoring my old HP 410B VTVM, and I'm interested in seeing how much the 
resistors have drifted since it was built, particularly the precision resistors 
in the input voltage divider. I don't have volt-nuts caliber equipment (well,
there is a busted HP 3456A on the shelf waiting to be repaired someday), just a 
Fluke 8050A and a Fluke 27/FM.

I didn't expect to have much trouble making consistent measurements as I don't 
think the 8050A has the resolution to see temperature coefficient changes or 
thermocouple effects. But I'm seeing some odd results on the higher resistance 
values. First, I seem to see some contact resistance effects: I don't get 
consistent measurements just using the probes as the count varies a little with 
contact pressure and probe placement. The contact resistance would have to vary 
by thousands of ohms for it to affect the meter; I can't believe that could be 
the explanation. However, I'm able to get consistent measurements by slipping 
alligator clips on to the probe tips and clipping on to the range switch 
terminals. Maybe the old solder is so oxidized that the contact resistance can 
really vary that much?

Second, the Fluke 27/FM measurements track those of the 8050A better than the 
spec'd limits, but I see some odd behavior in the last digit of the 8050A. The 
last digit of the resistance value varies with the direction of the current 
through the resistor, and in one direction, it bobbles up and down about three 
counts. In the other direction, the reading is stable. The bobble doesn't seem 
to be sensitive to placements of the test leads.

For example, the 6.837 MΩ 1% resistor (R6) measures 7.037 MΩ one way, and 
between about 6.994 and 6.996 MΩ when I reverse the 8050A test leads. That's a 
difference of nearly 0.6%.

The 2.163 MΩ 1% resistor (R5) measures 2.220 MΩ one way, and between about 
2.215 and 2.217 MΩ when I reverse the leads, for a difference of about 0.2%.

The 683.7 kΩ 1% resistor (R4) measures 697.9 kΩ one way, and between about 
697.5 and 697.7 kΩ when I reverse the leads, for a difference of about 0.05%.
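For reference, here's how I computed those percentages: a quick Python sketch using the midpoint of the range seen in the "bobbling" direction as the reference value.

```python
# Percent difference between the forward reading and the midpoint of
# the range seen with the leads reversed (the "bobbling" direction).
def pct_diff(forward_ohms, reversed_lo, reversed_hi):
    midpoint = (reversed_lo + reversed_hi) / 2
    return 100 * abs(forward_ohms - midpoint) / midpoint

print(pct_diff(7.037e6, 6.994e6, 6.996e6))  # R6: ~0.60 %
print(pct_diff(2.220e6, 2.215e6, 2.217e6))  # R5: ~0.18 %
print(pct_diff(697.9e3, 697.5e3, 697.7e3))  # R4: ~0.04 %
```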

I did try switching off some potential nearby RFI sources - fluorescent lights, 
switching power supply, laptop computer - and saw no difference in behavior, 
although I didn't do an exhaustive search for RFI.

Finally, I did some quick measurements with the Fluke 27/FM about two months 
ago, and the current measurements seem to be a bit off from those (I don't have 
the old recorded measurements handy as I write this, but I think the 
differences are outside the accuracy limits spec'd for the 27/FM). This is a 
non-climate-controlled New 
England basement, so the temperature is probably up about 5 degrees C and the 
humidity has shot up recently. But again, I wouldn't think my instruments are 
good enough to notice these environmental effects on the components themselves.
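To sanity-check that intuition, here's a back-of-the-envelope tempco estimate. The 200 ppm/°C figure is just an assumed value for vintage resistors, not a datasheet number, so take it as illustrative only.

```python
# Rough estimate of resistance change over a temperature swing.
# ASSUMPTION: 200 ppm/degC tempco -- a guess for old resistors,
# not a measured or datasheet value.
tempco_ppm_per_degC = 200
delta_T_degC = 5

change_ppm = tempco_ppm_per_degC * delta_T_degC  # 1000 ppm
change_percent = change_ppm / 1e4                # 0.1 %

print(change_percent)
```

At 0.1%, that's on the order of a couple of counts on a 3 1/2 digit display (1 part in 2000), so under that assumed tempco it would be right at the edge of visibility rather than completely invisible.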


Any ideas as to what's going on? How can I improve my measurement procedure to 
get repeatable results? Do I really need better climate control even at the 3 
1/2 or 4 1/2 digit level of precision? What's with the polarity sensitivity of 
the 8050A resistance measurements? Suggestions and advice would be gratefully 
accepted.

Best regards,
-Steve

-- 
Steve Byan <[email protected]>
Littleton, MA 01460



_______________________________________________
volt-nuts mailing list -- [email protected]
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/volt-nuts
and follow the instructions there.
