At 02:54 AM 12/6/2012, you wrote:
Push the noise down, or raise the signal high up -- this is the basic choice.
The first choice is passive, the second active.
Which one will lead to useful Cold Fusion?

Cart before the horse, Peter. The first issues are scientific, and exploring the parameter space is *more difficult* if, at the same time, high "signal" is required. "Pushing noise down" by careful experimental design can save a lot of money and time.

This is the reality, Peter: we know that the FPHE (Fleischmann-Pons Heat Effect) is real. We don't need massive results for that; the best and most conclusive work has been done with modest heat, then correlated with helium production. We can definitely use more accuracy in this, but the limits have been on helium capture/collection/measurement, not on heat measurement; the accuracy with heat is generally already adequate.

Sure, some people are going to work on increasing heat production, but increasing *absolute* heat production, we know, can easily be done for a reaction with known characteristics, simply by scaling up. However, there is a serious problem here.

If the exact conditions for heat production are not known, if they depend on very difficult-to-control conditions, such as the exact size and number of cracks in palladium deuteride, as appears to be the case with the FPHE, then a scaled-up experiment might suddenly produce far more heat than you expected. It's dangerous. Pons and Fleischmann scaled *down* for exactly this reason.

And running experiments by remote control behind blast barriers raises costs even further.

No, first things first. We need much more exploration of the parameter space. Once we know what conditions are effective for setting up the reactions, we can then start to scale up, but that's really the last step.

The main trend today is silent implicit desperation.

No. It's realism: until we know the *mechanism* for the FPHE, we need basic research, and that can be -- and should be -- small-scale. Small scale makes it possible to run many more variations on an experiment simultaneously, which most likely brings the discovery of optimal operating conditions sooner. Rossi allegedly ran a thousand experiments before he found his "secret sauce." While I have no idea whether he really found a secret sauce, that part of his story is plausible, at least.

As far as I can tell, we don't know, and have very little clue, what the ash from NiH reactions might be.

What we need for heat is enough heat to be satisfied that the reaction is real and the heat is not an artifact. Sure, eventually, we will want much more than that. We want enough heat that the reaction leaves behind enough ash to be detected. If the ash is deuterium, this isn't going to be easy, but running experiments longer is about as useful as running them hotter: detectable ash accumulates with total energy, which is power times time.
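A back-of-envelope sketch of that trade-off (the energy per ash atom below is a pure placeholder, since we don't know the NiH ash or its energetics; the arithmetic, not the number, is the point):

    # Back-of-envelope: detectable ash scales with total energy = power * time.
    # The 5 MeV per ash atom is a placeholder assumption, not a measured value.
    MEV_PER_ASH_ATOM = 5.0
    JOULES_PER_MEV = 1.602e-13
    power_watts = 1.0                # excess power
    run_seconds = 24 * 3600          # one day
    atoms = power_watts * run_seconds / (MEV_PER_ASH_ATOM * JOULES_PER_MEV)
    print(f"{atoms:.2e} ash atoms")  # ~1e17 atoms, roughly 0.2 micromole
    # Doubling the run time buys exactly as much ash as doubling the power.

So a watt held for a day, or half a watt held for two days, leaves the same amount of ash behind.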

First things first.

In a similar way, "reliability" is certainly desirable. However, if we don't have "reliability," if, say, half our experiments show nothing while the other half, seemingly the same, show significant heat, we are not stopped, and we need not -- and should not -- demand reliability before proceeding. Heat/helium was conclusively demonstrated with unreliable experiments; that is the power of correlation. The "dead cells" serve as controls, so that the hidden variable is all that varies, along with, of course, the output.
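To illustrate what the correlation buys (the numbers below are invented purely for illustration, not data from any actual experiment): even with half the cells dead, plotting helium against excess energy, cell by cell, turns the unreliability into a built-in control.

    # Illustration only: invented numbers, not experimental data.
    # "Dead" cells show no excess heat and no helium; they anchor the correlation.
    import statistics

    excess_energy_kj = [0.0, 0.1, 0.0, 45.0, 0.2, 62.0, 0.0, 38.0]   # per cell
    helium_ppb       = [0.1, 0.0, 0.2,  4.3, 0.1,  6.1, 0.0,  3.7]   # per cell

    r = statistics.correlation(excess_energy_kj, helium_ppb)
    print(f"heat/helium Pearson r = {r:.3f}")
    # A strong r across dead and live cells together is far harder to dismiss
    # as artifact than any single high-heat run, however large.

The dead cells are doing real work in that calculation; throw them away and you throw away the controls.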
