I have been working on an algorithm to detect excess heat in a slightly different way, but wanted to see if others could perhaps provide some insight.
Here are my thoughts:

1) Run an on/off heating and cooling cycle within a specified temperature range (e.g., 1000-1200 C): input power turns off at 1200 C and back on at 1000 C, and this repeats.

2) Compare this to a steady-state run at the average temperature of the cycling range (let's say 1100 C).

If you have theoretical excess heat running continuously at 20 W in both conditions, would you expect the average input power in #1 to be greater than in #2? I suspect the answer may be no, unless the excess heat is truly capable of self-sustaining the reaction for a period of time.

I would welcome any thoughts on this process. I suspect Dave could speak well to this from the perspective of his simulations.

Thanks, Jack
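To make the question concrete, here is a minimal sketch of the two runs using a lumped thermal mass with linear heat loss. Every parameter value (heat capacity, loss coefficient, heater power, temperature band) is hypothetical and chosen only for illustration, not taken from any real calorimeter; the constant 20 W excess term is applied identically in both cases.

```python
# Hypothetical lumped-mass model: C * dT/dt = P_in + P_excess - K*(T - T_amb).
# Compare the average input power of the on/off cycling run against the
# steady-state input needed to hold the mid-band temperature.

C = 50.0          # heat capacity, J/K (illustrative)
K = 0.5           # linear loss coefficient, W/K (illustrative)
T_AMB = 25.0      # ambient temperature, C
P_HEATER = 800.0  # heater power when on, W (illustrative)
P_EXCESS = 20.0   # hypothetical constant excess heat, W
T_LO, T_HI = 1000.0, 1200.0  # cycling band, C

def cycling_avg_input(t_total=10000.0, dt=0.05):
    """Euler-integrate the on/off run; return mean input power after warmup."""
    T = T_LO
    heater_on = True
    total_p, n = 0.0, 0
    steps = int(t_total / dt)
    for i in range(steps):
        # Hysteresis control: off above T_HI, on below T_LO, else hold state.
        if T >= T_HI:
            heater_on = False
        elif T <= T_LO:
            heater_on = True
        p_in = P_HEATER if heater_on else 0.0
        T += dt * (p_in + P_EXCESS - K * (T - T_AMB)) / C
        if i > steps // 4:  # discard the warmup transient
            total_p += p_in
            n += 1
    return total_p / n

# Steady state at the mid-band temperature: input power exactly balances
# losses minus the excess-heat contribution.
T_MID = 0.5 * (T_LO + T_HI)
steady_input = K * (T_MID - T_AMB) - P_EXCESS

avg_cycling = cycling_avg_input()
print(f"cycling avg input  : {avg_cycling:7.1f} W")
print(f"steady-state input : {steady_input:7.1f} W")
```

With losses that are linear in temperature, the two averages come out within a few percent of each other (the small residual is because the cycling run's time-average temperature sits slightly above the mid-band value), which is consistent with the suspicion that the answer is "no" unless the excess heat itself changes with the cycling. Strongly nonlinear losses (e.g., radiative) would shift the comparison.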