Gordon wrote:

Pure conjecture: So that the reading on the 34401A matches that on a $20 DVM.
I assume you mean when the DVM is disconnected - otherwise you wouldn't spend more than $20 on a meter! But I said that in my original post:

   So why would they do this? Could it be psychological? By limiting
   the drift caused by the i/p bias current to 300uV max when the meter
   is left unconnected? A voltmeter with a rapidly drifting reading
   (several mV/s) when not connected to anything is a bit disconcerting
   and would *probably lead to complaints that the meter is obviously
   faulty to users who are used to DVMs which read 0V when open
   circuit* - because they have i/p resistance << 10G ohms and don't
   have the resolution to show the offset voltage caused by the i/p
   bias current.

Or stated differently: So that the input impedance is the same as other DVMs.

Not really - that's a different reason. Other meters have a variety of input resistances, though 10M is probably the most common. In any case, with the exception of matching the needs of a HV probe, the higher the input resistance the better. Deliberately compromising the performance to match cheaper models, and making it harder than necessary (a sequence of 9 button presses!) to de-select that error source, seems a bizarre choice.

Tony H

Brent

On 4/10/2014 8:23 AM, Tony wrote:
There is no suggestion in the specifications for the 34401A that the accuracy suffers by selecting 10G ohm input resistance on the 0.1 to 10V ranges, so why would they make 10M ohm the default? I can think of very few cases where having the 10M ohm i/p resistor switched in is better for accuracy than not.

On the other hand, 10M is sufficiently low to produce significant errors on a 6 1/2 digit DVM for sources with resistances as low as 10 ohms. Measuring 1V through a 100k/100k ohm divider, for example, causes a 0.5% error - 497.512mV instead of 500.000mV. That might not be a problem, but I wouldn't be surprised if this catches a lot of people out (including me) when they don't pause to do the mental arithmetic to estimate the error. It's just too easy to be seduced by all those digits into thinking you've made an accurate measurement, even though the error means the last three digits should be discarded.
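
For anyone who wants to estimate the loading error for other source resistances, here's a quick sketch of the arithmetic (plain Python; the values are just the 100k/100k example above, swap in your own):

    # Loading error caused by a DVM's finite input resistance on a
    # resistive divider - the 100k/100k example from the post above.

    def divider_reading(v_in, r_top, r_bot, r_meter):
        """Voltage a meter of input resistance r_meter reads across
        the bottom leg of an r_top/r_bot divider."""
        r_leg = r_bot * r_meter / (r_bot + r_meter)  # bottom leg || meter
        return v_in * r_leg / (r_top + r_leg)

    ideal  = 1.0 * 100e3 / (100e3 + 100e3)              # unloaded divider
    loaded = divider_reading(1.0, 100e3, 100e3, 10e6)   # with 10M input

    print(f"unloaded: {ideal * 1e3:.3f} mV")            # 500.000 mV
    print(f"10M load: {loaded * 1e3:.3f} mV")           # 497.512 mV
    print(f"error:    {(loaded - ideal) / ideal * 100:.3f} %")  # about -0.5 %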

And if it's not a problem then you probably don't need an expensive 6 1/2 digit meter in the first place.

It's a small point, I agree, but it gets irritating to have to go into the measurement menus to change it every time the meter is turned on when measuring high impedance sources (e.g. capacitor leakage testing).

It can't be to improve i/p protection, as 10M is too high to make any significant difference to ESD, and in any case there is plenty of other over-voltage protection. OK, it provides a path for the DC amplifier's input bias current, specified to be < 30pA at 25 degrees C, but I imagine that varies significantly from one meter to the next, and with temperature, so it is not useful for nulling out that error.
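
FWIW, the arithmetic behind that 300uV figure in the quoted post (and behind the drift in the 10G case) is just Ohm's law. A quick sketch - the only spec figure used is the < 30pA bias current, the rest is my own arithmetic:

    # Offset voltage developed across the input resistance by the input
    # bias current (plain Ohm's law, V = I * R). The < 30pA figure is
    # from the 34401A spec quoted above; everything else is illustrative.

    i_bias = 30e-12  # worst-case input bias current at 25 C, in amps

    for label, r_in in [("10M default ", 10e6), ("10G selected", 10e9)]:
        v_offset = i_bias * r_in
        print(f"{label}: up to {v_offset * 1e6:,.0f} uV offset")

    # 10M default : up to 300 uV offset
    # 10G selected: up to 300,000 uV (0.3V) - and since the real input
    # resistance is spec'd as > 10G, the input effectively floats and
    # the reading just drifts, as described in the quoted post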

So why would they do this?

_______________________________________________
volt-nuts mailing list -- volt-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/volt-nuts
and follow the instructions there.

