Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-11 Thread Tony

On 11/04/2014 20:59, Poul-Henning Kamp wrote:

In message <534840da.4070...@toneh.demon.co.uk>, Tony writes:


Many of the attributes of the input circuit are not there for
voltage metrology.


I can't think of any - I'd be interested to know what you have in mind?

One of the articles in HP Journal is specifically about how the
input circuit is designed for use also in sampling AC measurements
and digitizer applications.

Ok, thanks - that looks like it might be an interesting read - will go 
and try and find it.


Tony H
___
volt-nuts mailing list -- volt-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/volt-nuts
and follow the instructions there.


Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-11 Thread Poul-Henning Kamp
In message <534840da.4070...@toneh.demon.co.uk>, Tony writes:

>> Many of the attributes of the input circuit are not there for
>> voltage metrology.
>>
>I can't think of any - I'd be interested to know what you have in mind?

One of the articles in HP Journal is specifically about how the
input circuit is designed for use also in sampling AC measurements
and digitizer applications.

-- 
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
p...@freebsd.org | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.


Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-11 Thread Tony

Hi Poul,

On 11/04/2014 19:18, Poul-Henning Kamp wrote:

In message <53482d9e.9060...@toneh.demon.co.uk>, Tony writes:


I did find this explanation in the 3458A manual:

When making DC voltage measurements, you can fix the multimeter's input
resistance using the FIXEDZ command. This is useful to prevent a change
in input resistance (caused by changing ranges) from affecting the
measurements.

Please don't forget that the 3458A is not just an 8.5 digit
metrological wonder, it is also a precision 16 bit 100 ksample/s
digitizer.
Yes, I'm well aware of that, but it makes no difference whether you're 
observing the display or the logged data - you would not want that 10M ± 
1% resistor across the input, unless you have no choice because the 
input exceeds 12V and you have to use a 100V+ range, in which case you 
have to accept the limitations of the instrument. I accept that it could 
help in some situations where you aren't bothered about absolute 
measurements but are interested in changes, and the resistor helps to 
reduce the noise.


In such a situation I agree you would probably be logging the data 
rather than looking at the display but the sampling rate has no bearing 
at all on this issue.

Many of the attributes of the input circuit are not there for
voltage metrology.


I can't think of any - I'd be interested to know what you have in mind?

Regards,
  Tony H



Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-11 Thread WarrenS



"You can't have your cake and eat it!"

well said

Too true that the 10 MΩ reduces the noise pickup, but **only** when the DVM 
input is **open-circuited** - which is not much use for measuring anything, 
since the signal is not connected.
Otherwise the noise pickup is a function of the impedance of the signal 
being measured, not the DVM's input impedance.
If you want the lowest noise pickup, keep the DVM's input shorted. That is 
the best case for open-circuit noise pickup and the worst-case load when 
measuring anything.
If you want less than 1 ppm loading error when using a 6-digit-plus DVM, 
ensure that its input impedance is > 1,000,000 times the signal's source 
impedance - which is the worst case for open-circuit noise pickup.
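WarrenS's one-in-a-million rule of thumb is easy to check numerically. A minimal sketch (the function name and example values are mine, not from the post):

```python
def loading_error_ppm(r_source, r_input):
    """Fractional loading error, in ppm, when a meter with input
    resistance r_input reads a source with output resistance r_source."""
    return 1e6 * r_source / (r_source + r_input)

# A 1 kohm reference output (e.g. a Fluke 732) into a 10 Mohm input:
print(loading_error_ppm(1e3, 10e6))   # ~100 ppm low

# The same output into a 10 Gohm Hi-Z input:
print(loading_error_ppm(1e3, 10e9))   # ~0.1 ppm
```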


Anyone who thinks a 10 MΩ input on a precision DVM is a good thing should try 
using it to measure a standard cell, or the 1 volt, 1 kΩ reference voltage 
output from a Fluke 731 or 732, and see what that does.


ws




On 11/04/2014 10:53, frank.stellmach at freenet.de wrote:

Hello

In the manual (!), HP reasons the 10M standard input resistance:

"Normally, the multimeter’s input resistance is fixed at 10 MΩ for all dc 
voltage ranges to minimize noise pickup."

Oops! I'm ashamed to say I missed that!
  I explained that to myself like this: AC stray fields or noisy high 
impedance sources induce noise input currents in the DMM frontend.

The higher its input Z, the higher the noise voltage reading will be.
In that sense, 10MOhm 'shorts' those noise effects.




...
You can't have your cake and eat it! You might be 'shorting' the noise
but you're equally shorting your signal; if you've got that much noise
then you've got some potentially difficult signal conditioning to do if
you want to make accurate measurements. 



Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-11 Thread Poul-Henning Kamp
In message <53482d9e.9060...@toneh.demon.co.uk>, Tony writes:

> I did find this explanation in the 3458A manual:
> 
> When making DC voltage measurements, you can fix the multimeter's input 
> resistance using the FIXEDZ command. This is useful to prevent a change 
> in input resistance (caused by changing ranges) from affecting the 
> measurements.

Please don't forget that the 3458A is not just an 8.5 digit
metrological wonder, it is also a precision 16 bit 100 ksample/s
digitizer.

Many of the attributes of the input circuit are not there for
voltage metrology.

-- 
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
p...@freebsd.org | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.


Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-11 Thread Tony

Hi,

On 11/04/2014 10:53, frank.stellm...@freenet.de wrote:

Hello
  
In the manual (!), HP reasons the 10M standard input resistance:
  
"Normally, the multimeter’s input resistance is fixed at 10 MΩ for all dc voltage ranges to minimize noise pickup."

Oops! I'm ashamed to say I missed that!

  I explained that to myself like this: AC stray fields or noisy high impedance 
sources induce noise input currents in the DMM frontend.
The higher its input Z, the higher the noise voltage reading will be.
In that sense, 10MOhm 'shorts' those noise effects.



Frank, I realise you know all the following perfectly well - so please 
don't take offence - but hopefully, for the benefit of others:


You can't have your cake and eat it! You might be 'shorting' the noise 
but you're equally shorting your signal; if you've got that much noise 
then you've got some potentially difficult signal conditioning to do if 
you want to make accurate measurements. Sticking a 1% tolerance 10M 
across the signal to cut the noise makes no sense at all - it's only 
going to make a significant difference to the noise if the source 
resistance is > 10M or so, in which case why are you using a 6-1/2 or 
even 8-1/2 digit meter? You might as well use a 1 digit meter - at least 
that wouldn't give any false illusions of accuracy. Wouldn't a capacitor 
(with a resistance >> 10M) across the input make more sense? If the test 
leads are long then 10M at the meter is going to reduce the noise more 
than 10M at the source, but the problem remains.


In the case of the 3458A, using the 10M I/P resistance requires the 
source resistance to be < 100 milliohms for it not to show errors on 
the 8-1/2 digit scales - so OK for measuring voltage regulators, 
batteries etc., but very little else, including precision voltage 
references or the O/P of an op-amp. The 3458A does default to the 
high I/P resistance state though, so you have to choose to change it to 
10M, unlike the 34401A, which defaults it to on.
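The 100 milliohm figure follows directly from the ratio arithmetic; a quick sketch (taking the 8.5-digit floor as one part in 10^8, my assumption):

```python
r_in = 10e6    # the 10 Mohm divider left in circuit
floor = 1e-8   # one part in 10^8 -- roughly one count at 8.5 digits
# largest source resistance whose loading error stays below the floor:
# solve r_s / (r_s + r_in) = floor for r_s
r_s_max = r_in * floor / (1 - floor)
print(r_s_max)   # ~0.1 ohm, i.e. the ~100 milliohm limit
```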


Even if you do know the precise resistance of the source, allowing you to 
calculate the actual source voltage, the 1% tolerance of the 10M 
restricts the accuracy to 2 or 3 digits at best - so again it makes no 
sense to me to be relying on a non-precision resistor in precision 
instruments - even the 3458A's 10M has a 1% tolerance. You could measure 
the I/P resistance of your meter with another 6-1/2 or 8-1/2 digit meter, 
but how many people do that? And how are you going to measure the source 
resistance accurately? What if it's non-linear?
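How badly the 1% tolerance limits a back-calculated source voltage depends on the source resistance; a sketch with hypothetical values (the function and numbers are mine, not from the thread):

```python
def recovered_error(r_source, r_in_true, r_in_assumed=10e6):
    """Fractional error in the back-calculated source voltage when the
    loading correction assumes a nominal input resistance."""
    measured = r_in_true / (r_source + r_in_true)   # fraction of Vs at the meter
    corrected = measured * (r_source + r_in_assumed) / r_in_assumed
    return abs(corrected - 1.0)

# 10 Mohm source, actual input resistor 1% high of nominal: ~0.5% residual error
print(recovered_error(10e6, 10.1e6))
# 100 kohm source: the same 1% tolerance only costs ~100 ppm
print(recovered_error(100e3, 10.1e6))
```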


If you're relying on that 10M, chances are you're using the wrong 
instrument.


I did find this explanation in the 3458A manual:

When making DC voltage measurements, you can fix the multimeter's input 
resistance using the FIXEDZ command. This is useful to prevent a change 
in input resistance (caused by changing ranges) from affecting the 
measurements.

A bit feeble though. If your measurement changes when changing ranges, 
that is a *good* thing - it highlights that you've got a problem, and 
you need to think about it and decide which, if either, measurement is 
accurate, rather than hide the problem by showing consistent, but 
consistently wrong, readings. It's a rather special case anyway, 
applicable to I/P voltages at the boundary between the 10 and 100V ranges 
- if it's less than 10V (well, 12V on the 34401A/3458A) then you've no 
reason to use a higher range.


I suppose it could be useful to observe a signal where you aren't 
concerned about the absolute accuracy but want high resolution relative 
measurements - eg. drift measurements or measuring the linearity of a 
slope. You need to know though that the 10M is not going to move the 
high impedance source into a non-linear region.


It looks like this is an HP feature; looking at the manuals for the 
Solartron 7081 and Datron 1071 meters, they don't appear to have the 
facility to lower the I/P resistance from >= 10G for ranges below 100V.



Anyhow, my 3458A is always programmed to have TOhm input Z as its power-up state, 
therefore I have to short the input jacks whenever the instrument is not 
connected, or doing an ACAL. That avoids drifting of the input and unwanted 
relay actuations when left open.
I suppose that could be a slight concern, although my 34401A doesn't 
seem to drift much beyond 200mV - I guess it could sit around the 120mV 
switching threshold of the 100mV and 1V ranges and keep switching 
between them. I've never seen it though, and wouldn't be concerned about 
the relay life even if it did (although the noise might be irritating).


On the 34401A, it is very inconvenient to set the input to high Z every time, 
as many keypresses are needed.

So Agilent recognized that dilemma, and improved the handling on the new 
34461A:
Changing input Z is only one additional keypress, as this feature is assigned 
to one of the soft keys.
  
Frank



That's good to see. I think they just got it wrong on the 34401A.

Tony H



Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread ed breya
Only specialized meters can provide virtually infinite input R at 
voltages above the 10 to 20 V or so native range of conventional 
amplifiers, so you have to use some kind of attenuator to cover the 
higher ranges anyway. 10 megs and 1 meg (and sometimes 11) are the 
traditional values used, with 10 of course providing less loading on 
the signal source. It is difficult to get good resistor precision, 
stability, and voltage coefficient at higher values, so 10 megs is a 
good compromise.


As far as I know, no DVM uses an actual 10 or 1 gigaohm resistor for 
its input termination - that's just an equivalent input R range 
(sometimes just a part spec right from the datasheet) for the high 
impedance opamps and JFET circuits typically used to amplify DC at 
reasonably high accuracy and low noise. All it means is that it is 
nearly non-loading to conventional DC circuits. If there is an actual 
resistor this large in there, then it is just to get the input near 
zero when it's disconnected - it will read the bias current times the 
resistance, which can be quite large. If the applied input voltage 
exceeds the native range, the protection circuitry will take over.


For ultra-high Z applications, the equivalent input R would need to 
be in the teraohm range instead, using electrometer-class opamps, 
with much lower bias current, but higher offset voltage and noise.


If you put DVMs in the low ranges below 10 or 20 V, ones without 
actual termination R will tend to drift off due to input bias 
current. Once it's connected, the effect is much smaller (but not 
zero) since the source R is usually comparatively very small. One way 
to always assure a zero reading is to have a definite and fairly low 
(to not show bias current too obviously) input R, so there's the 10 
megs option. It's also possible to make the actual value of the input 
R (and not just the dividing ratios) very precise - or measure it - 
so that its effect on measurements at known source resistances can be 
figured out.


As you have already figured out, in auto-ranging, a non-terminated 
DVM left disconnected and unattended will form a relaxation 
oscillator and tend to wear out its front-end relays. Seeing no 
signal in the higher ranges, the system will switch down to the lower 
ranges and be OK until the input drifts off to a range limit, then it 
will up-range until it reaches one with an attenuator, then the 
signal goes back to zero, and the process repeats.
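Ed's relaxation-oscillator description can be caricatured in a few lines. A toy model only - the drift rate, range set, and thresholds are invented to show the cycle, not the 34401A's actual ranging logic:

```python
full_scale = [1.0, 10.0, 100.0]        # hypothetical range set, volts
has_attenuator = [False, False, True]  # only the top range has the 10 M divider
v, idx, resets = 0.0, 0, 0
for _ in range(40):                    # 40 arbitrary time steps
    v += 1.0                           # bias current charging the open input
    if v > full_scale[idx] * 1.2:      # past the overload point: up-range
        idx += 1
        if has_attenuator[idx]:        # divider now loads the input...
            v, idx = 0.0, 0            # ...charge drains, meter down-ranges
            resets += 1                # one full relay-wearing cycle
print(resets)                          # several complete cycles in 40 steps
```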


Ed

At 07:23 AM 4/10/2014, you wrote:
There is no suggestion in the specifications for the 34401A that the 
accuracy suffers by selecting 10G ohm input resistance on the .1 to 
10V range so why would they make 10M ohm the default? I can think of 
very few cases where having the 10M ohm i/p resistor switched  in is 
better for accuracy than not.


On the other hand 10M is sufficiently low to produce significant 
errors on a 6 1/2 digit DVM for sources with resistances as low as 
10 ohms. Measuring 1V divided by a 100k/100k ohm divider for example 
causes a .5% error - 497.512mV instead of 500.000mV. That might not 
be a problem but I wouldn't be surprised if this catches a lot of 
people out (including me) when not pausing to do the mental 
arithmetic to estimate the error. It's just too easy to be seduced 
by all those digits into thinking you've made an accurate 
measurement even though you discarded those last three digits.
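The divider example above is worth checking by hand; a sketch of the arithmetic (the meter pulls the midpoint reading low by about 0.5%):

```python
r1 = r2 = 100e3      # 1 V across a 100k/100k divider
r_in = 10e6          # meter input resistance across the bottom leg
r_low = r2 * r_in / (r2 + r_in)     # bottom leg loaded by the meter
v_read = 1.0 * r_low / (r1 + r_low)
print(round(v_read * 1e3, 3))       # 497.512 mV, not the ideal 500.000 mV
```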


And if it's not a problem then you probably don't need an expensive 
6 1/2 digit meter in the first place.


It's a small point I agree but it can get irritating to have to keep 
going into the measurement menus to change it when the meter is 
turned on when measuring high impedance sources (e.g. capacitor 
leakage testing).


It can't be to improve i/p protection as 10M is too high to make any 
significant difference to ESD and in any case there is plenty of 
other over-voltage protection. OK, it provides a path for the DC 
amplifier's input bias current, specified to be < 30pA at 25 degrees 
C, but I imagine that varies significantly from one meter to the 
next, and with temperature, so not useful for nulling out that error.


So why would they do this? Could it be psychological? By limiting 
the drift caused by the i/p bias current to 300uV max when the meter 
is left unconnected? A voltmeter with a rapidly drifting reading 
(several mV/s) when not connected to anything is a bit disconcerting 
and would probably lead to complaints that the meter is obviously 
faulty to users who are used to DVMs which read 0V when open circuit 
- because they have i/p resistance << 10G ohms and don't have the 
resolution to show the offset voltage caused by the i/p bias current.
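The 300uV figure above is just Ohm's law on the spec numbers; as a sketch:

```python
i_bias = 30e-12   # specified worst-case input bias current at 25 C
r_in = 10e6       # default 10 Mohm input resistance
v_offset = i_bias * r_in
print(v_offset)   # 0.0003 V, i.e. the 300 uV maximum drift quoted above
```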


Personally I'd have thought that the default should be the other way 
round - especially given that there is no indication on the front 
panel or display as to which i/p resistance is currently selected.


Any thoughts? What do other meters do?

Tony H


Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Brent Gordon


On 4/10/2014 3:18 PM, Tony wrote:

   Gordon wrote:

Pure conjecture:  So that the reading on the 34401A matches that on a 
$20 DVM.
I assume you mean when the DVM is disconnected - otherwise you 
wouldn't spend more than $20 on a meter! But I said that in my 
original post:
Actually, I meant while connected.  For example, you want to do a 
quick-and-dirty check on a voltage.  Say you had previously set the 
voltage to a specific value using the 34401A .  Sometime later you want 
to check the voltage but the 34401A is not available.  So, you use your 
10 M-Ohm Fluke handheld DVM.  If your source impedance is high and you 
set the voltage with the 34401A set to 10 G-Ohm you would get a 
significantly different reading with the Fluke.  If the voltage was set 
with 34401A in 10 M-Ohm mode, you should get the same reading (within 
the accuracy of the Fluke).
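Brent's scenario is easy to put numbers on; a sketch with hypothetical values (5 V behind 100 kohm is my example, not his):

```python
def reading(v_src, r_src, r_meter):
    """Voltage a meter with input resistance r_meter shows for the source."""
    return v_src * r_meter / (r_src + r_meter)

v_src, r_src = 5.0, 100e3            # hypothetical high-impedance node
print(reading(v_src, r_src, 10e9))   # 34401A in Hi-Z: ~4.99995 V
print(reading(v_src, r_src, 10e6))   # 10 Mohm handheld: ~4.9505 V, ~1% low
```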


Brent


Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Tom Miller
Don't forget: there is accuracy, and then there is precision. You should not 
confuse the two.


And many things besides old CRTs use high voltages (>1 kV).

T

- Original Message - 
From: "Brent Gordon" 

To: "Discussion of precise voltage measurement" 
Sent: Thursday, April 10, 2014 4:16 PM
Subject: Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?


Pure conjecture:  So that the reading on the 34401A matches that on a $20 
DVM.


Or stated differently:  So that the input impedance is the same as other 
DVMs.


Brent

On 4/10/2014 8:23 AM, Tony wrote:
There is no suggestion in the specifications for the 34401A that the 
accuracy suffers by selecting 10G ohm input resistance on the .1 to 10V 
range so why would they make 10M ohm the default? I can think of very few 
cases where having the 10M ohm i/p resistor switched  in is better for 
accuracy than not.


On the other hand 10M is sufficiently low to produce significant errors 
on a 6 1/2 digit DVM for sources with resistances as low as 10 ohms. 
Measuring 1V divided by a 100k/100k ohm divider for example causes a .5% 
error - 497.512mV instead of 500.000mV. That might not be a problem but I 
wouldn't be surprised if this catches a lot of people out (including me) 
when not pausing to do the mental arithmetic to estimate the error. It's 
just too easy to be seduced by all those digits into thinking you've made 
an accurate measurement even though you discarded those last three 
digits.


And if it's not a problem then you probably don't need an expensive 6 1/2 
digit meter in the first place.


It's a small point I agree but it can get irritating to have to keep 
going into the measurement menus to change it when the meter is turned on 
when measuring high impedance sources (e.g. capacitor leakage testing).


It can't be to improve i/p protection as 10M is too high to make any 
significant difference to ESD and in any case there is plenty of other 
over-voltage protection. OK, it provides a path for the DC amplifier's 
input bias current, specified to be < 30pA at 25 degrees C, but I imagine 
that varies significantly from one meter to the next, and with 
temperature, so not useful for nulling out that error.


So why would they do this?






Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Tony

   Gordon wrote:

Pure conjecture:  So that the reading on the 34401A matches that on a 
$20 DVM.
I assume you mean when the DVM is disconnected - otherwise you wouldn't 
spend more than $20 on a meter! But I said that in my original post:


   So why would they do this? Could it be psychological? By limiting
   the drift caused by the i/p bias current to 300uV max when the meter
   is left unconnected? A voltmeter with a rapidly drifting reading
   (several mV/s) when not connected to anything is a bit disconcerting
   and would probably lead to complaints that the meter is obviously
   faulty to users who are used to DVMs which read 0V when open
   circuit - because they have i/p resistance << 10G ohms and don't
   have the resolution to show the offset voltage caused by the i/p
   bias current.

Or stated differently:  So that the input impedance is the same as 
other DVMs.


Not really - that's a different reason. Other meters have a variety of 
different input resistances, but 10M is probably the most common. In any 
case, with the exception of matching the needs of a HV probe, the higher 
the input resistance the better. Deliberately compromising the 
performance to match cheaper models, and making it harder than necessary 
(a sequence of 9 button presses!) to de-select that error source, seems 
a bizarre choice.


Tony H


Brent

On 4/10/2014 8:23 AM, Tony wrote:
There is no suggestion in the specifications for the 34401A that the 
accuracy suffers by selecting 10G ohm input resistance on the .1 to 
10V range so why would they make 10M ohm the default? I can think of 
very few cases where having the 10M ohm i/p resistor switched  in is 
better for accuracy than not.


On the other hand 10M is sufficiently low to produce significant 
errors on a 6 1/2 digit DVM for sources with resistances as low as 10 
ohms. Measuring 1V divided by a 100k/100k ohm divider for example 
causes a .5% error - 497.512mV instead of 500.000mV. That might not 
be a problem but I wouldn't be surprised if this catches a lot of 
people out (including me) when not pausing to do the mental 
arithmetic to estimate the error. It's just too easy to be seduced by 
all those digits into thinking you've made an accurate measurement 
even though you discarded those last three digits.


And if it's not a problem then you probably don't need an expensive 6 
1/2 digit meter in the first place.


It's a small point I agree but it can get irritating to have to keep 
going into the measurement menus to change it when the meter is 
turned on when measuring high impedance sources (e.g. capacitor 
leakage testing).


It can't be to improve i/p protection as 10M is too high to make any 
significant difference to ESD and in any case there is plenty of 
other over-voltage protection. OK, it provides a path for the DC 
amplifier's input bias current, specified to be < 30pA at 25 degrees 
C, but I imagine that varies significantly from one meter to the 
next, and with temperature, so not useful for nulling out that error.


So why would they do this?







Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread John Phillips
With a 1 kΩ-per-volt meter you need to know what range you are using. You do 
have to know your meter and know how to correct for loading, or not loading. 
It is not very practical to have a bunch of different input standards; 10M 
works for a lot of things and is the standard voltage divider value.


On Thu, Apr 10, 2014 at 1:59 PM, Poul-Henning Kamp wrote:

> In message <534704f7.3030...@toneh.demon.co.uk>, Tony writes:
>
> >Very unlikely I'd have thought - the relay (K104) which selects between
> >the high and low voltage ranges also selects the I/P resistance. It
> >wouldn't get used any more than the identical relay (K102) which
> >switches when changing between 10 and 100V ranges.
>
> If you leave your 34401A on 10G input with nothing connected in
> a dry atmosphere, it is going to build up charge and trigger the
> autorange relay.  Anti-wear-out mechanisms are certainly relevant.
>
> A similar mechanism is documented in one of the 3458A manuals.
>
> --
> Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
> p...@freebsd.org | TCP/IP since RFC 956
> FreeBSD committer   | BSD since 4.3-tahoe
> Never attribute to malice what can adequately be explained by incompetence.



-- 
John Phillips


Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Poul-Henning Kamp
In message <534704f7.3030...@toneh.demon.co.uk>, Tony writes:

>Very unlikely I'd have thought - the relay (K104) which selects between 
>the high and low voltage ranges also selects the I/P resistance. It 
>wouldn't get used any more than the identical relay (K102) which 
>switches when changing between 10 and 100V ranges.

If you leave your 34401A on 10G input with nothing connected in
a dry atmosphere, it is going to build up charge and trigger the
autorange relay.  Anti-wear-out mechanisms are certainly relevant.

A similar mechanism is documented in one of the 3458A manuals.

-- 
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
p...@freebsd.org | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.


Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Brooke Clarke

Hi John:

Because when measuring a source with a high resistance you get a different 
answer.
Some W.W.II electronics specified 1 kOhm/Volt meters, and if you used a VTVM 
you got the wrong results.
If a test procedure specifies a 10 MOhm input meter and you use a higher input 
Z, then you may get wrong results.

Have Fun,

Brooke Clarke
http://www.PRC68.com
http://www.end2partygovernment.com/2012Issues.html

John Phillips wrote:

so why do you care what the input is as long as you know what it is and how
to make it do what you want?


On Thu, Apr 10, 2014 at 1:16 PM, Brent Gordon wrote:


Pure conjecture:  So that the reading on the 34401A matches that on a $20
DVM.

Or stated differently:  So that the input impedance is the same as other
DVMs.

Brent


On 4/10/2014 8:23 AM, Tony wrote:


There is no suggestion in the specifications for the 34401A that the
accuracy suffers by selecting 10G ohm input resistance on the .1 to 10V
range so why would they make 10M ohm the default? I can think of very few
cases where having the 10M ohm i/p resistor switched  in is better for
accuracy than not.

On the other hand 10M is sufficiently low to produce significant errors
on a 6 1/2 digit DVM for sources with resistances as low as 10 ohms.
Measuring 1V divided by a 100k/100k ohm divider for example causes a .5%
error - 497.512mV instead of 500.000mV. That might not be a problem but I
wouldn't be surprised if this catches a lot of people out (including me)
when not pausing to do the mental arithmetic to estimate the error. It's
just too easy to be seduced by all those digits into thinking you've made
an accurate measurement even though you discarded those last three digits.
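
The divider figure above is straightforward Ohm's-law loading and easy to check numerically; a minimal sketch (resistor names are mine, values from the post - the loaded reading comes out roughly 0.5% low):

```python
# Loading error of a 10M-ohm meter across the lower leg of a
# 100k/100k divider fed from 1 V.
def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

V_SOURCE = 1.0
R_TOP = 100e3
R_BOTTOM = 100e3
R_METER = 10e6   # the 34401A's default input resistance

r_low = parallel(R_BOTTOM, R_METER)           # ~99.01 kohm
reading = V_SOURCE * r_low / (R_TOP + r_low)  # what the DVM displays
error_pct = 100 * (reading - 0.5) / 0.5

print(f"reading = {reading * 1000:.3f} mV, error = {error_pct:.2f} %")
```

Swapping R_METER for 10e9 (the 10G setting) shrinks the error to a few parts per million, which is the point of the complaint.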

And if it's not a problem then you probably don't need an expensive 6 1/2
digit meter in the first place.

It's a small point I agree but it can get irritating to have to keep
going into the measurement menus to change it when the meter is turned on
when measuring high impedance sources (e.g. capacitor leakage testing).
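
For remote users the front-panel menus can at least be bypassed: to the best of my knowledge the 34401A exposes this setting over SCPI (verify against your manual before relying on it):

```
INPut:IMPedance:AUTO ON
INPut:IMPedance:AUTO OFF
```

ON selects the >10 Gohm path on the 100mV, 1V and 10V ranges; OFF (the power-on default) keeps 10 Mohm on all ranges.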

It can't be to improve i/p protection as 10M is too high to make any
significant difference to ESD and in any case there is plenty of other
over-voltage protection. OK. it provides a path for the DC amplifier's
input bias current, specified to be < 30pA at 25 degrees C, but I imagine
that varies significantly from one meter to the next, and with temperature,
so not useful for nulling out that error.

So why would they do this?










Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Tony
Very unlikely I'd have thought - the relay (K104) which selects between
the high and low voltage ranges also selects the I/P resistance. It
wouldn't get used any more than the identical relay (K102) which
switches when changing between 10 and 100V ranges.


In any case a typical signal relay is rated for around 10^8 operations
with no load, as in this application, which is over three years of
continuous switching at one operation per second! How often does even a
heavily used DVM change between the 10 and 100V ranges?
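
The relay-life arithmetic is worth sanity-checking (the 10^8 figure is a typical signal-relay datasheet rating, not specific to the 34401A's parts):

```python
# Lifetime of a signal relay rated for 1e8 no-load operations,
# assuming a worst case of one switching operation every second.
RATED_OPS = 1e8
SECONDS_PER_YEAR = 3600 * 24 * 365

years = RATED_OPS / SECONDS_PER_YEAR
print(f"{years:.1f} years of continuous 1 Hz switching")  # ~3.2 years
```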


On 10/04/2014 20:27, Andreas Jahn wrote:

Hello,

perhaps it's just to make the input range selection
relays last to at least the warranty period.

Just a guess.

With best regards

Andreas




Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread John Phillips
so why do you care what the input is as long as you know what it is and how
to make it do what you want?


On Thu, Apr 10, 2014 at 1:16 PM, Brent Gordon wrote:

> Pure conjecture:  So that the reading on the 34401A matches that on a $20
> DVM.
>
> Or stated differently:  So that the input impedance is the same as other
> DVMs.
>
> Brent
>
>
> On 4/10/2014 8:23 AM, Tony wrote:
>
>> There is no suggestion in the specifications for the 34401A that the
>> accuracy suffers by selecting 10G ohm input resistance on the .1 to 10V
>> range so why would they make 10M ohm the default? I can think of very few
>> cases where having the 10M ohm i/p resistor switched  in is better for
>> accuracy than not.
>>
>> On the other hand 10M is sufficiently low to produce significant errors
>> on a 6 1/2 digit DVM for sources with resistances as low as 10 ohms.
>> Measuring 1V divided by a 100k/100k ohm divider for example causes a .5%
>> error - 497.512mV instead of 500.000mV. That might not be a problem but I
>> wouldn't be surprised if this catches a lot of people out (including me)
>> when not pausing to do the mental arithmetic to estimate the error. It's
>> just too easy to be seduced by all those digits into thinking you've made
>> an accurate measurement even though you discarded those last three digits.
>>
>> And if it's not a problem then you probably don't need an expensive 6 1/2
>> digit meter in the first place.
>>
>> It's a small point I agree but it can get irritating to have to keep
>> going into the measurement menus to change it when the meter is turned on
>> when measuring high impedance sources (e.g. capacitor leakage testing).
>>
>> It can't be to improve i/p protection as 10M is too high to make any
>> significant difference to ESD and in any case there is plenty of other
>> over-voltage protection. OK. it provides a path for the DC amplifier's
>> input bias current, specified to be < 30pA at 25 degrees C, but I imagine
>> that varies significantly from one meter to the next, and with temperature,
>> so not useful for nulling out that error.
>>
>> So why would they do this?
>>
>



-- 
John Phillips


Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Brent Gordon
Pure conjecture:  So that the reading on the 34401A matches that on a 
$20 DVM.


Or stated differently:  So that the input impedance is the same as other 
DVMs.


Brent

On 4/10/2014 8:23 AM, Tony wrote:
There is no suggestion in the specifications for the 34401A that the 
accuracy suffers by selecting 10G ohm input resistance on the .1 to 
10V range so why would they make 10M ohm the default? I can think of 
very few cases where having the 10M ohm i/p resistor switched  in is 
better for accuracy than not.


On the other hand 10M is sufficiently low to produce significant 
errors on a 6 1/2 digit DVM for sources with resistances as low as 10 
ohms. Measuring 1V divided by a 100k/100k ohm divider for example 
causes a .5% error - 497.512mV instead of 500.000mV. That might not be 
a problem but I wouldn't be surprised if this catches a lot of people 
out (including me) when not pausing to do the mental arithmetic to 
estimate the error. It's just too easy to be seduced by all those 
digits into thinking you've made an accurate measurement even though 
you discarded those last three digits.


And if it's not a problem then you probably don't need an expensive 6 
1/2 digit meter in the first place.


It's a small point I agree but it can get irritating to have to keep 
going into the measurement menus to change it when the meter is turned 
on when measuring high impedance sources (e.g. capacitor leakage 
testing).


It can't be to improve i/p protection as 10M is too high to make any 
significant difference to ESD and in any case there is plenty of other 
over-voltage protection. OK. it provides a path for the DC amplifier's 
input bias current, specified to be < 30pA at 25 degrees C, but I 
imagine that varies significantly from one meter to the next, and with 
temperature, so not useful for nulling out that error.


So why would they do this?




Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Tony
That seems a more likely reason - matching users' expectations. It's the 
unexpected that trips people up - I doubt many casual users of DVMs ever 
see the manuals. I still think it was the wrong choice.


Tony H

On 10/04/2014 18:58, Joel Setton wrote:
I think the 10 Meg default value became a de facto standard at the 
time of VTVMs (vacuum-tube volt meters), as a convenient value which 
reduced input circuit loading while remaining compatible with the grid 
current of the input triode. Designers of early solid-state voltmeters 
merely decided not to change a good thing.

Just my $0.02 worth!

Joel Setton


On 10/04/2014 18:55, Steven J Banaska wrote:

As Tom said the 10M input impedance is used for the high voltage ranges
because it is a resistive divider (9.9M/100k) that can handle high 
voltages
without much drift. Caddock THV or HVD are fairly common in precision 
dmms.


Typically you will find a high impedance (10G) path that can be used for
the ranges 10V and lower, but the 10M divider can be left connected and
will work for any voltage range by changing which side you measure. 
As you

mentioned there can be an accuracy sacrifice when you have a high output
impedance from your source. I'm not sure why 10M is the default other 
than
it may extend the life of the relay that switches the 10M divider in 
or out.


Steve









Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Tony
Possibly so, but I'd expect that the vast majority of HP 34401A 6-1/2 
digit bench multimeters never see/saw > 1kV - even when CRTs were 
common. Looking at the specs for several HV probes (1% accuracy was the 
best I could find with a quick search), a $10, 3-1/2 digit DVM will be 
just as accurate.


 "Many of the accurate ones want to see a 10 meg input."

Well if you want *high* accuracy with such a probe, then you wouldn't 
use a 34401A given that its 10M ohm i/p divider has a tolerance of ±1% 
which would limit the overall accuracy to around .1% at best (for a 
1Gohm, 1000:1 passive HV probe).


The 34401A only offers 10G ohm i/p resistance on the .1, 1 and 10V 
ranges, switching to 10M ohm on 100 and 1kV ranges. So selecting the 
100V range (much easier than using the menus to change the i/p 
resistance) automatically selects the 10M ohm i/p resistance. Using a 
1000:1 probe, voltages between 1kV and 10kV would lose a digit of 
resolution compared to using the 10V range, but 5-1/2 digits is still 
way more than needed given the .1% accuracy limited by the 34401A's i/p 
resistance tolerance.


All in all, I think providing a minor convenience feature for HV probe 
users (not having to manually select the 100V range) is a very unlikely 
reason for selecting 10M as the default given that way more measurements 
(source > 10 ohms) require the 10G ohm i/p resistance to justify using a 
6-1/2 digit instrument.


Tony H

On 10/04/2014 16:07, Tom Miller wrote:

Think "HV Probe". Many of the accurate ones want to see a 10 meg input.

Also, some meters change input impedance depending on the selected range.

T

- Original Message - From: "Tony" 
To: 
Sent: Thursday, April 10, 2014 10:23 AM
Subject: [volt-nuts] 34401A Why 10M ohm default i/p resistance?


There is no suggestion in the specifications for the 34401A that the 
accuracy suffers by selecting 10G ohm input resistance on the .1 to 
10V range so why would they make 10M ohm the default? I can think of 
very few cases where having the 10M ohm i/p resistor switched in is 
better for accuracy than not.


On the other hand 10M is sufficiently low to produce significant 
errors on a 6 1/2 digit DVM for sources with resistances as low as 10 
ohms. Measuring 1V divided by a 100k/100k ohm divider for example 
causes a .5% error - 497.512mV instead of 500.000mV. That might not 
be a problem but I wouldn't be surprised if this catches a lot of 
people out (including me) when not pausing to do the mental 
arithmetic to estimate the error. It's just too easy to be seduced by 
all those digits into thinking you've made an accurate measurement 
even though you discarded those last three digits.


And if it's not a problem then you probably don't need an expensive 6 
1/2 digit meter in the first place.


It's a small point I agree but it can get irritating to have to keep 
going into the measurement menus to change it when the meter is 
turned on when measuring high impedance sources (e.g. capacitor 
leakage testing).


It can't be to improve i/p protection as 10M is too high to make any 
significant difference to ESD and in any case there is plenty of 
other over-voltage protection. OK. it provides a path for the DC 
amplifier's input bias current, specified to be < 30pA at 25 degrees 
C, but I imagine that varies significantly from one meter to the 
next, and with temperature, so not useful for nulling out that error.


So why would they do this? Could it be psychological? By limiting the 
drift caused by the i/p bias current to 300uV max when the meter is 
left unconnected? A voltmeter with a rapidly drifting reading 
(several mV/s) when not connected to anything is a bit disconcerting 
and would probably lead to complaints that the meter is obviously 
faulty to users who are used to DVMs which read 0V when open circuit 
- because they have i/p resistance << 10G ohms and don't have the 
resolution to show the offset voltage caused by the i/p bias current.


Personally I'd have thought that the default should be the other way 
round - especially given that there is no indication on the front 
panel or display as to which i/p resistance is currently selected.


Any thoughts? What do other meters do?

Tony H




Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread John Phillips
It is more along the lines of building a voltage divider with stable
resistors. A 1G ohm voltage divider would be more expensive to build with
the same stability that you can get from 10M ohms. There are trade-offs in
all designs. The cost benefit ratio just is not there. If you really need
high input impedance in a DC meter go differential. If you are looking
at AC circuits then stray capacitance will really mess you up with a high
impedance divider.



On Thu, Apr 10, 2014 at 12:27 PM, Andreas Jahn
wrote:

> Hello,
>
> perhaps it's just to make the input range selection relays last
> to at least the warranty period.
> Just a guess.
>
> With best regards
>
> Andreas
>
> Am 10.04.2014 19:58, schrieb Joel Setton:
>
>  I think the 10 Meg default value became a de facto standard at the time
>> of VTVMs (vacuum-tube volt meters), as a convenient value which reduced
>> input circuit loading while remaining compatible with the grid current of
>> the input triode. Designers of early solid-state voltmeters merely decided
>> not to change a good thing.
>> Just my $0.02 worth!
>>
>> Joel Setton
>>
>>
>> On 10/04/2014 18:55, Steven J Banaska wrote:
>>
>>> As Tom said the 10M input impedance is used for the high voltage ranges
>>> because it is a resistive divider (9.9M/100k) that can handle high
>>> voltages
>>> without much drift. Caddock THV or HVD are fairly common in precision
>>> dmms.
>>>
>>> Typically you will find a high impedance (10G) path that can be used for
>>> the ranges 10V and lower, but the 10M divider can be left connected and
>>> will work for any voltage range by changing which side you measure. As
>>> you
>>> mentioned there can be an accuracy sacrifice when you have a high output
>>> impedance from your source. I'm not sure why 10M is the default other
>>> than
>>> it may extend the life of the relay that switches the 10M divider in or
>>> out.
>>>
>>> Steve
>>>
>>>
>>>



-- 
John Phillips


Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Andreas Jahn

Hello,

perhaps it's just to make the input range selection
relays last to at least the warranty period.

Just a guess.

With best regards

Andreas

Am 10.04.2014 19:58, schrieb Joel Setton:
I think the 10 Meg default value became a de facto standard at the 
time of VTVMs (vacuum-tube volt meters), as a convenient value which 
reduced input circuit loading while remaining compatible with the grid 
current of the input triode. Designers of early solid-state voltmeters 
merely decided not to change a good thing.

Just my $0.02 worth!

Joel Setton


On 10/04/2014 18:55, Steven J Banaska wrote:

As Tom said the 10M input impedance is used for the high voltage ranges
because it is a resistive divider (9.9M/100k) that can handle high 
voltages
without much drift. Caddock THV or HVD are fairly common in precision 
dmms.


Typically you will find a high impedance (10G) path that can be used for
the ranges 10V and lower, but the 10M divider can be left connected and
will work for any voltage range by changing which side you measure. 
As you

mentioned there can be an accuracy sacrifice when you have a high output
impedance from your source. I'm not sure why 10M is the default other 
than
it may extend the life of the relay that switches the 10M divider in 
or out.


Steve








Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Poul-Henning Kamp
In message <5346a952.9080...@toneh.demon.co.uk>, Tony writes:

>There is no suggestion in the specifications for the 34401A that the 
>accuracy suffers by selecting 10G ohm input resistance on the .1 to 10V 
>range so why would they make 10M ohm the default? 

In addition to the compatibility reasons others have mentioned, it
also protects the input circuits against random electrostatic fluctuations
if nothing is attached, and it delays things just long enough that
if you attach +100V, it doesn't have to wait for a relay to kick in
before autorange can work.

-- 
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
p...@freebsd.org | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.


Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Brooke Clarke

Hi Tony:

Fluke makes some DMMs that have what they call V-Check where they put a 1,000 
Ohm resistor across the voltage input.
When testing lawn sprinkler valves, if you measure the voltage across the valve with a Hi-Z voltmeter it looks normal, 
but using the V-Check range on the DMM shows the voltage to be almost zero.
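
Brooke's sprinkler-valve observation is the same source-loading effect put to deliberate use; a minimal sketch (the 24 V supply and the 100 kohm fault resistance are my own illustrative assumptions, not from the post):

```python
# A supply feeding a load through a high-resistance fault (e.g. a
# corroded splice). A 10M-ohm meter barely loads the fault and reads
# near-full voltage; a 1 kohm V-Check burden reveals the truth.
V_SUPPLY = 24.0
R_FAULT = 100e3   # hypothetical corroded-connection resistance

def reading(r_meter):
    return V_SUPPLY * r_meter / (R_FAULT + r_meter)

hi_z = reading(10e6)    # looks like nearly the full 24 V
v_check = reading(1e3)  # almost zero, as Brooke describes
print(f"Hi-Z: {hi_z:.2f} V, V-Check: {v_check:.2f} V")
```

The Hi-Z reading looks normal even though the circuit could never actually drive the valve.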

http://www.prc68.com/I/DMM.shtml

Have Fun,

Brooke Clarke
http://www.PRC68.com
http://www.end2partygovernment.com/2012Issues.html

Tony wrote:
There is no suggestion in the specifications for the 34401A that the accuracy suffers by selecting 10G ohm input 
resistance on the .1 to 10V range so why would they make 10M ohm the default? I can think of very few cases where 
having the 10M ohm i/p resistor switched  in is better for accuracy than not.


On the other hand 10M is sufficiently low to produce significant errors on a 6 1/2 digit DVM for sources with 
resistances as low as 10 ohms. Measuring 1V divided by a 100k/100k ohm divider for example causes a .5% error - 
497.512mV instead of 500.000mV. That might not be a problem but I wouldn't be surprised if this catches a lot of 
people out (including me) when not pausing to do the mental arithmetic to estimate the error. It's just too easy to be 
seduced by all those digits into thinking you've made an accurate measurement even though you discarded those last 
three digits.


And if it's not a problem then you probably don't need an expensive 6 1/2 digit 
meter in the first place.

It's a small point I agree but it can get irritating to have to keep going into the measurement menus to change it 
when the meter is turned on when measuring high impedance sources (e.g. capacitor leakage testing).


It can't be to improve i/p protection as 10M is too high to make any significant difference to ESD and in any case 
there is plenty of other over-voltage protection. OK. it provides a path for the DC amplifier's input bias current, 
specified to be < 30pA at 25 degrees C, but I imagine that varies significantly from one meter to the next, and with 
temperature, so not useful for nulling out that error.


So why would they do this? Could it be psychological? By limiting the drift caused by the i/p bias current to 300uV 
max when the meter is left unconnected? A voltmeter with a rapidly drifting reading (several mV/s) when not connected 
to anything is a bit disconcerting and would probably lead to complaints that the meter is obviously faulty to users 
who are used to DVMs which read 0V when open circuit - because they have i/p resistance << 10G ohms and don't have the 
resolution to show the offset voltage caused by the i/p bias current.


Personally I'd have thought that the default should be the other way round - especially given that there is no 
indication on the front panel or display as to which i/p resistance is currently selected.


Any thoughts? What do other meters do?

Tony H





Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Joel Setton
I think the 10 Meg default value became a de facto standard at the time 
of VTVMs (vacuum-tube volt meters), as a convenient value which reduced 
input circuit loading while remaining compatible with the grid current 
of the input triode. Designers of early solid-state voltmeters merely 
decided not to change a good thing.

Just my $0.02 worth!

Joel Setton


On 10/04/2014 18:55, Steven J Banaska wrote:

As Tom said the 10M input impedance is used for the high voltage ranges
because it is a resistive divider (9.9M/100k) that can handle high voltages
without much drift. Caddock THV or HVD are fairly common in precision dmms.

Typically you will find a high impedance (10G) path that can be used for
the ranges 10V and lower, but the 10M divider can be left connected and
will work for any voltage range by changing which side you measure. As you
mentioned there can be an accuracy sacrifice when you have a high output
impedance from your source. I'm not sure why 10M is the default other than
it may extend the life of the relay that switches the 10M divider in or out.

Steve






Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Steven J Banaska
As Tom said the 10M input impedance is used for the high voltage ranges
because it is a resistive divider (9.9M/100k) that can handle high voltages
without much drift. Caddock THV or HVD are fairly common in precision DMMs.

Typically you will find a high impedance (10G) path that can be used for
the ranges 10V and lower, but the 10M divider can be left connected and
will work for any voltage range by changing which side you measure. As you
mentioned there can be an accuracy sacrifice when you have a high output
impedance from your source. I'm not sure why 10M is the default other than
it may extend the life of the relay that switches the 10M divider in or out.

Steve



On Thu, Apr 10, 2014 at 8:07 AM, Tom Miller wrote:

> Think "HV Probe". Many of the accurate ones want to see a 10 meg input.
>
> Also, some meters change input impedance depending on the selected range.
>
> T
>
> - Original Message - From: "Tony" 
> To: 
> Sent: Thursday, April 10, 2014 10:23 AM
> Subject: [volt-nuts] 34401A Why 10M ohm default i/p resistance?
>
>
>
>  There is no suggestion in the specifications for the 34401A that the
>> accuracy suffers by selecting 10G ohm input resistance on the .1 to 10V
>> range so why would they make 10M ohm the default? I can think of very few
>> cases where having the 10M ohm i/p resistor switched  in is better for
>> accuracy than not.
>>
>> On the other hand 10M is sufficiently low to produce significant errors
>> on a 6 1/2 digit DVM for sources with resistances as low as 10 ohms.
>> Measuring 1V divided by a 100k/100k ohm divider for example causes a .5%
>> error - 497.512mV instead of 500.000mV. That might not be a problem but I
>> wouldn't be surprised if this catches a lot of people out (including me)
>> when not pausing to do the mental arithmetic to estimate the error. It's
>> just too easy to be seduced by all those digits into thinking you've made
>> an accurate measurement even though you discarded those last three digits.
>>
>> And if it's not a problem then you probably don't need an expensive 6 1/2
>> digit meter in the first place.
>>
>> It's a small point I agree but it can get irritating to have to keep
>> going into the measurement menus to change it when the meter is turned on
>> when measuring high impedance sources (e.g. capacitor leakage testing).
>>
>> It can't be to improve i/p protection as 10M is too high to make any
>> significant difference to ESD and in any case there is plenty of other
>> over-voltage protection. OK. it provides a path for the  DC amplifier's
>> input bias current, specified to be < 30pA at 25 degrees C, but I imagine
>> that varies significantly from one meter to the next, and with temperature,
>> so not useful for nulling out that error.
>>
>> So why would they do this? Could it be psychological? By limiting the
>> drift caused by the i/p bias current to 300uV max when the meter is left
>> unconnected? A voltmeter with a rapidly drifting reading (several mV/s)
>> when not connected to anything is a bit disconcerting and would probably
>> lead to complaints that the meter is obviously faulty to users who are used
>> to DVMs which read 0V when open circuit - because they have i/p resistance
>> << 10G ohms and don't have the resolution to show the offset voltage caused
>> by the i/p bias current.
>>
>> Personally I'd have thought that the default should be the other way round
>> - especially given that there is no indication on the front panel or
>> display as to which i/p resistance is currently selected.
>>
>> Any thoughts? What do other meters do?
>>
>> Tony H


Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

2014-04-10 Thread Tom Miller

Think "HV Probe". Many of the accurate ones want to see a 10 meg input.

Also, some meters change input impedance depending on the selected range.

T

- Original Message - 
From: "Tony" 

To: 
Sent: Thursday, April 10, 2014 10:23 AM
Subject: [volt-nuts] 34401A Why 10M ohm default i/p resistance?


There is no suggestion in the specifications for the 34401A that the 
accuracy suffers by selecting 10G ohm input resistance on the .1 to 10V 
range so why would they make 10M ohm the default? I can think of very few 
cases where having the 10M ohm i/p resistor switched  in is better for 
accuracy than not.


On the other hand 10M is sufficiently low to produce significant errors on 
a 6 1/2 digit DVM for sources with resistances as low as 10 ohms. 
Measuring 1V divided by a 100k/100k ohm divider for example causes a .5% 
error - 497.512mV instead of 500.000mV. That might not be a problem but I 
wouldn't be surprised if this catches a lot of people out (including me) 
when not pausing to do the mental arithmetic to estimate the error. It's 
just too easy to be seduced by all those digits into thinking you've made 
an accurate measurement even though you discarded those last three digits.


And if it's not a problem then you probably don't need an expensive 6 1/2 
digit meter in the first place.


It's a small point I agree but it can get irritating to have to keep going 
into the measurement menus to change it when the meter is turned on when 
measuring high impedance sources (e.g. capacitor leakage testing).


It can't be to improve i/p protection as 10M is too high to make any 
significant difference to ESD and in any case there is plenty of other 
over-voltage protection. OK. it provides a path for the  DC amplifier's 
input bias current, specified to be < 30pA at 25 degrees C, but I imagine 
that varies significantly from one meter to the next, and with 
temperature, so not useful for nulling out that error.


So why would they do this? Could it be psychological? By limiting the 
drift caused by the i/p bias current to 300uV max when the meter is left 
unconnected? A voltmeter with a rapidly drifting reading (several mV/s) 
when not connected to anything is a bit disconcerting and would probably 
lead to complaints that the meter is obviously faulty to users who are 
used to DVMs which read 0V when open circuit - because they have i/p 
resistance << 10G ohms and don't have the resolution to show the offset 
voltage caused by the i/p bias current.


Personally I'd have thought that the default should be the other way 
round - especially given that there is no indication on the front panel or 
display as to which i/p resistance is currently selected.


Any thoughts? What do other meters do?

Tony H

