At 03:07 AM 12/13/2012, Jeff Berkowitz wrote:

I am looking at 1 minute averages. This is very solid.

Okay. But this would not detect "invisible" excess input power due to high-frequency power-supply variations. Not at all.

This is what SRI did. They used a constant-current power supply, with high-bandwidth control. The supply, then, faced with transients in resistance, rapidly varies the voltage. So voltage is sampled at high frequency, and is averaged and reported periodically.

However, it's rather obvious that there must be some variation in current, or the supply would not "know" to alter the voltage. Supplies naturally produce constant voltage if they are beefy enough, which they usually are. In constant-current mode, internal feedback rapidly adjusts the voltage to hold the current constant.

What Britz studied was the effect of current noise. It was very low. If the current is tightly controlled, the power remains the product of the average voltage and the constant current. Thus the assumption being challenged was "constant current."
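The arithmetic here can be illustrated with a toy simulation. This is only a sketch under assumed numbers (a 0.5 A setpoint, a 10-ohm load with fast resistance transients, and a made-up regulation imperfection that couples current noise to voltage); it is not a model of any actual supply. It shows that the reported power, mean(V) times the current setpoint, matches the true mean of V times I only to the extent that the current really is constant:

```python
import random

random.seed(1)

I_SET = 0.5          # amps: the constant-current setpoint (assumed value)
N = 100_000          # high-frequency samples in one averaging window

true_power_sum = 0.0  # accumulates instantaneous V*I
v_sum = 0.0           # accumulates V, for the averaged-voltage report

for _ in range(N):
    # Hypothetical load resistance with fast transients (e.g. bubbling).
    r = 10.0 + random.gauss(0.0, 1.0)
    # Imperfect regulation (assumed): current dips slightly when resistance
    # spikes, so the current noise is *correlated* with the voltage.
    i = I_SET - 0.001 * (r - 10.0)
    v = i * r
    true_power_sum += v * i   # power sampled at high bandwidth
    v_sum += v

true_power = true_power_sum / N       # mean of V*I (what was actually drawn)
assumed_power = (v_sum / N) * I_SET   # mean(V) * setpoint (what gets reported)

print(f"true mean power:    {true_power:.6f} W")
print(f"assumed mean power: {assumed_power:.6f} W")
print(f"relative error:     {abs(true_power - assumed_power) / true_power:.2e}")
```

With regulation this tight the two figures agree to a few parts in ten thousand, which is why a well-documented supply like McKubre's is trusted; the error only becomes material if the current noise is both large and correlated with the voltage.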

As McKubre has written, these supplies -- at least the one he used, which was documented -- are very good.

To be sure, workers in the field have examined the current with high-bandwidth oscilloscopes. (This was not documented in the papers; one cannot possibly document *everything* in a normally published paper, but we asked.) They don't see the high-frequency noise that would cause a problem.

The researchers should nail this down and verify that the power supply is truly solid; otherwise high-frequency noise could indeed cause misreporting of input power.
