I do see that. In fact, I've often noted here in the past that any reactor producing substantially more heat than it requires can be made self-running with enough insulation and by feeding the output heat back to the reactor. And yes, it could be dangerous (but don't tell that to Robert Bryant).
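To make the feedback idea a bit more concrete, here is a rough back-of-the-envelope sketch. All the numbers and the function name are my own illustrative assumptions, not anyone's measured values: a reactor with output/input ratio COP can, in principle, close the loop only when the fraction of output heat you actually manage to recover and return exceeds 1/COP.

```python
# Rough back-of-envelope check for the "close the loop" idea.
# Numbers below are purely illustrative assumptions.

def self_sustains(cop: float, recovery_fraction: float) -> bool:
    """True if recycled heat alone could supply the reactor's input.

    cop               -- output heat / input heat of the reactor
    recovery_fraction -- fraction of output heat actually returned to the
                         reactor (insulation and heat-exchanger losses eat
                         the rest)
    """
    return cop * recovery_fraction >= 1.0

# A reactor barely over break-even can't close the loop unless
# heat recovery is nearly perfect:
print(self_sustains(cop=1.01, recovery_fraction=0.90))  # False
# A strong excess-heat ratio tolerates fairly sloppy insulation:
print(self_sustains(cop=3.0, recovery_fraction=0.40))   # True
```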
But the point is, you are not trying to measure an output of 101 watts while putting in 100 watts, which is how too many experiments have been reported and judged to be positive. You need an impressive output/input ratio to be reasonably sure that your signal is substantially over any possible noise, ground loops, induced EMF, and other errors and problems. Of course, good calibration also helps.
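To put rough numbers on "substantially over any possible noise" (the 2% error figure below is an assumption I picked for illustration, not a statement about anyone's actual instruments): a 101 W reading against 100 W in is indistinguishable from zero excess once realistic measurement error is allowed for, while a 2x output ratio sits far outside that band.

```python
# Illustrative signal-vs-error check for a calorimetry claim.
# The 2% relative error is an assumed combined measurement error.

def excess_is_credible(p_in: float, p_out: float, rel_error: float = 0.02) -> bool:
    """Crude test: is the claimed excess bigger than the worst-case
    combined error on the input and output power measurements?"""
    excess = p_out - p_in
    worst_case_error = rel_error * (p_in + p_out)
    return excess > worst_case_error

print(excess_is_credible(100.0, 101.0))  # False: 1 W excess vs ~4 W possible error
print(excess_is_credible(100.0, 200.0))  # True: 100 W excess vs ~6 W possible error
```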
ETA: Of course, if you used a liquid-cooled, temperature-controlled, forced-flow calorimeter like SGVIT demonstrated, you would have better control of the experiment's temperature and better safety. But I do understand that such a system is much more complicated to build and control, and if the air calorimeter works, you wouldn't need it.
Well, seven_of_twenty, you brought up an interesting number. Last time I checked, no hot fusion experiment had been able to generate even 1% of excess heat (equivalent to the 100 W input to 101 W output you mentioned). Yet if the NIF were to announce 1% excess heat, wouldn't it be greeted as a major milestone, a historic moment in mankind's development? Why, then, is a LENR excess heat of 1% despised?