Further to Rich's comments...

>----------
>From:  Rich Nute[SMTP:ri...@sdd.hp.com]
>Sent:  Thursday, December 04, 1997 10:38 AM
>
>Hello Vagn:
>
>
>1.  Why do standards specify a minimum output current for
>    hi-pot testers?
>
<deletia>

>Another reason for a minimum current requirement is that some
>of the early hi-pot testers employed a "collapsing field"
>transformer.  This was for the safety of operating personnel.
>When the load required too much current, the transformer would
>limit the current to a "safe" level.  The problem with this
>construction is that as the load increased (e.g., by the Y
>capacitors), the voltage waveform would become highly distorted,
>with high peak voltage, but low rms voltage.  The output voltmeter
>indicated the rms voltage was decreasing rather than increasing. 
>So, the operator would continue to increase the voltage control, 
>thus increasing the peak voltage and thereby inducing a dielectric
>failure!  One way to avoid this construction was to specify a
>minimum current or a minimum VA. 
>
>Best regards,
>Rich
>-------------------------------------------------------------
> Richard Nute
>

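Rich's peak-versus-rms point is easy to see numerically. The sketch below
(Python/NumPy) compares a clean sine with an arbitrary sin**5 waveform
standing in for the distorted output; the 3000 V level and the waveform
shape are illustrative assumptions only, not a model of any particular
tester.

    import numpy as np

    t = np.linspace(0.0, 1.0, 10000, endpoint=False)   # one cycle, normalized

    # Clean sinusoid at a 3000 V peak (value chosen only for illustration).
    clean = 3000 * np.sin(2 * np.pi * t)

    # Crude stand-in for the distorted output: same peak, but the energy
    # is squeezed into narrow pulses, i.e. a high crest factor.
    distorted = 3000 * np.sin(2 * np.pi * t) ** 5

    for name, v in (("clean sine", clean), ("distorted", distorted)):
        peak = np.abs(v).max()
        rms = np.sqrt(np.mean(v ** 2))
        print(f"{name:10s}  peak = {peak:5.0f} V   rms = {rms:5.0f} V   "
              f"crest factor = {peak / rms:.2f}")

An rms-responding meter reads roughly 1500 V on the distorted waveform
against roughly 2100 V for the clean sine of identical peak, so raising
the control until the rms reading recovers drives the true peak well past
the intended stress, just as Rich describes.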
What Rich describes sounds like core saturation and loading effects in the
transformer.  An inadequately sized transformer can produce substantial
output distortion when the core flux density runs too high.  Also, since
these testers have traditionally been built around simple linear
transformers, they are subject to loading effects and cannot sustain or
achieve the required voltage as the load increases, even before the core
saturates.  This latter effect can come from winding wire of too small a
diameter, so that the voltage drop across the winding becomes excessive,
and/or from the core approaching saturation.
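A back-of-the-envelope sketch of that winding-drop effect (Python again;
made-up component values, ignoring leakage inductance and saturation):

    import numpy as np

    # All values assumed purely for illustration; real testers vary widely.
    V_oc = 3000.0   # open-circuit secondary voltage, V rms
    f    = 60.0     # test frequency, Hz
    R_w  = 50e3     # winding resistance referred to the secondary, ohms

    w = 2 * np.pi * f
    for C in (1e-9, 10e-9, 47e-9):              # increasing Y-capacitor load
        Z_c = 1 / (1j * w * C)                  # capacitive load impedance
        V_out = V_oc * abs(Z_c / (Z_c + R_w))   # simple voltage-divider model
        print(f"C = {C * 1e9:4.0f} nF  ->  V_out = {V_out:4.0f} V rms")

In this toy model the 47 nF load already pulls the output down by about
25%.  Leakage inductance and the onset of saturation are ignored here and
complicate the real behavior further.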

Regards,


Peter L. Tarver
Nortel
ptar...@nt.com
