Part of the equation is certification. If the equipment needs to be certified
as within specification, it must be tested with equipment whose calibration has
been verified by an official standards laboratory. That way its calibration is
traceable to the international standards for those parameters. This is not
trivial: without that traceability you can't be sure your own standard meter
is correct, or even within its specification.
A voltage source used for calibrating a voltmeter must be stable, accurate, and
noise-free. The same is true for other 'live' calibrations.
Passive standards, such as resistors and other components, need similar
certification.
Having said all this, there are other considerations. Test methods are very
important. For instance, a resistor must be measured at a particular lead
length, temperature, and voltage stress, with due diligence regarding
dissimilar metals (thermal EMFs). Frequency can be calibrated, to a point,
against WWV radio transmissions. Digital readings always carry an uncertainty
of at least one count in the last digit. Capacitance and inductance
measurements need to account for losses (Q and D) and must be made in either
an equivalent-series or an equivalent-parallel model, depending on the
application. Nonlinear devices such as iron-core inductors have a far more
complex definition of their parameters, and it's not simple to set up a valid
test.
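To give a feel for the series/parallel bookkeeping, here is a rough Python
sketch of the textbook conversion for a lossy capacitor; the function name and
the example numbers are mine, made up purely for illustration:

    import math

    def cap_series_to_parallel(cs, rs, freq):
        # cs: series capacitance in farads, rs: ESR in ohms,
        # freq: test frequency in hertz
        w = 2 * math.pi * freq
        d = w * cs * rs              # dissipation factor D = tan(delta) = 1/Q
        cp = cs / (1 + d * d)        # equivalent parallel capacitance
        rp = rs * (1 + 1 / (d * d))  # equivalent parallel loss resistance
        return cp, rp, d

    # Made-up example: 100 nF with 2 ohm ESR, measured at 1 kHz.
    cp, rp, d = cap_series_to_parallel(100e-9, 2.0, 1e3)
    print("D  = %.6f" % d)
    print("Cp = %.4f nF" % (cp * 1e9))
    print("Rp = %.3f Mohm" % (rp / 1e6))

The same physical part comes out with very different-looking loss numbers in
the two models, which is why the test method has to say which one it means.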
So your boss has handed you an open can of worms.
Bob
On Wednesday, June 20, 2018, 12:54:28 PM PDT, Florian Teply
<[email protected]> wrote:
Dear fellow nuts,
just a few days after my last post to this list my boss made me
responsible for calibration of electrical measurement equipment in our
department. As funny a coincidence as that might be - I'd be pretty
surprised if he were lurking here - this brought up a few questions
where I could use some insight and comments from guys with more
experience in this than I have (which is essentially zero).
Now, as far as I understand, calibration at first sight is merely a
comparison between what the meter actually reads and what it is supposed
to read. As long as the difference between the two is smaller than what
the manufacturer specifies as the maximum error, everything is fine: put
a new sticker on the instrument and send it back to the owner.
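In code terms, my naive understanding of that check would be something like
the following Python sketch; the "% of reading + counts" spec form and all the
numbers are just assumptions for illustration, not from any real meter:

    def within_spec(reading, reference, pct_of_reading, counts, resolution):
        # reading: what the meter under test displays
        # reference: what was actually applied to it
        # pct_of_reading, counts: manufacturer accuracy spec for the range
        # resolution: value of one count (one digit) on that range
        allowed = abs(reference) * pct_of_reading / 100.0 + counts * resolution
        error = reading - reference
        return abs(error) <= allowed, error, allowed

    # Made-up example: 10 V applied, meter shows 10.0003 V, spec is
    # 0.0035 % of reading + 5 counts, one count = 10 uV on this range.
    ok, err, limit = within_spec(10.0003, 10.0000, 0.0035, 5, 10e-6)
    print("error %+.0f uV, limit +/-%.0f uV, pass: %s"
          % (err * 1e6, limit * 1e6, ok))

That much is easy enough to code up.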
Now, as usual, the devil is in the details: how do you establish what the
meter is supposed to read? I'd be pretty surprised if it were as easy as
taking two meters, measuring the same thing, say a voltage,
simultaneously, and comparing the readings. Could some of you guys shed
some more light on that?
The background to my questions is that I'm wondering whether it would be
feasible to do the calibration in house instead of sending equipment out
for calibration. I'm not so much looking for financial savings, as I doubt
we would ever get to the point where running our own calibration is cheaper
than contracting it out, even though we have on the order of fifty
multimeters and about as many voltage sources listed, and probably some
more sitting in cupboards without being listed. I'm rather looking for
convenience, as it is often difficult to arrange in advance when to
calibrate what, and to send everything in on time. With the capability in
house we could calibrate whenever we like and whenever equipment is not in
use, without having to ask a commercial calibration lab to handle a few
dozen more or less complex devices within a two-week maintenance break in
summer. And not needing to ship equipment out would be a plus as well, as
some of our guys don't like the idea of sending a few hundred kilo-euros'
worth of equipment around just to get a fancy new sticker on it...
Best regards,
Florian
_______________________________________________
volt-nuts mailing list -- [email protected]
To unsubscribe, go to https://lists.febo.com/cgi-bin/mailman/listinfo/volt-nuts
and follow the instructions there.