> For two, most of the things that consume power are not in
> fact consuming exactly a fixed amount of power.  Light bulbs
> go dimmer if you reduce voltage; electrical motors will produce
> less power (torque X rpm) if voltage drops, etc.  Minor blips
> are happening all the time in major grids, and the voltage is
> continuously varying up and down slightly.  If we had to keep
> voltage exactly constant, a real AC power system would be
> nigh-on impossible to build.

        Part of the problem is that an increasing fraction of the grid will
actually draw more power as the voltage decreases. Switching power supplies
maintain a constant output power as long as their input voltage stays within
a reasonable window, and their efficiency is generally highest at their
design nominal voltage. So a decrease in voltage forces them to draw more
current for two reasons: more current is needed to deliver the same power at
the lower voltage, and the loss of efficiency away from nominal means they
need more input power on top of that.
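
        To put rough numbers on that, here's a small Python sketch. The
100 W output, the 120 V nominal, and the quadratic efficiency falloff are
all assumptions for illustration, not measurements of any real supply:

# Rough sketch of a constant-power load's input current as the line
# voltage sags. All figures are illustrative: 100 W output, 95% peak
# efficiency at a 120 V nominal, with efficiency assumed to fall off
# quadratically with the fractional deviation from nominal.

P_OUT = 100.0    # watts delivered to the load
V_NOM = 120.0    # design nominal input voltage
ETA_NOM = 0.95   # assumed efficiency at nominal voltage

def efficiency(v):
    # Assumed penalty: efficiency drops as the square of the
    # fractional deviation from nominal voltage.
    return ETA_NOM * (1.0 - 0.5 * ((v - V_NOM) / V_NOM) ** 2)

def input_current(v):
    # Constant output power: input power rises as efficiency falls,
    # and current rises further because the voltage itself is lower.
    p_in = P_OUT / efficiency(v)
    return p_in / v

for v in (120.0, 114.0, 108.0, 102.0):
    print(f"{v:5.1f} V -> {input_current(v):5.3f} A")

        In this toy model a 10% sag pushes the current up roughly 12%; most
of that is simply dividing the same power by a smaller voltage, with the
efficiency penalty adding the rest.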

        As more and more of the load becomes 'smart', the resiliency starts
to go out of the system. To some extent the same is true of things like
cooling systems: as the voltage drops, their duty cycle will increase,
though that problem manifests itself over a somewhat longer time scale.
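
        The same kind of back-of-the-envelope sketch works for a
thermostat-controlled cooler. The assumption here (again illustrative, not
measured) is that cooling capacity scales with the square of voltage, so
the duty cycle has to rise to remove the same heat load:

# Sketch of how a thermostat-controlled cooler's duty cycle grows as
# the voltage sags. Assumes cooling capacity scales with the square of
# voltage, as power into a fixed-impedance motor roughly would; a real
# compressor's behavior is more complex.

V_NOM = 120.0         # nominal line voltage
CAPACITY_NOM = 500.0  # watts of heat removed while running, at nominal
HEAT_LOAD = 300.0     # steady heat load the cooler must remove

def duty_cycle(v):
    capacity = CAPACITY_NOM * (v / V_NOM) ** 2
    return min(1.0, HEAT_LOAD / capacity)

for v in (120.0, 114.0, 108.0, 102.0):
    print(f"{v:5.1f} V -> duty cycle {duty_cycle(v):.0%}")

        Under those assumptions a 10% sag takes the duty cycle from 60% to
about 74%, so the average draw holds up even as the voltage falls.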

        And, of course, you can't keep the voltage constant. It's the differences
in voltage that make the current flow.

        DS

