Exactly - but we know more now than then - surely a bulk Pd-D experiment could be documented in detail with expected results.
It was documented, by Storms, Cravens and Fleischmann.
If the results are highly sample-dependent, a protocol for identifying a large source of good samples could be established.
See:
https://www.lenr-canr.org/acrobat/StormsEhowtoprodu.pdf
Such a source would allow replication and testing with the main loss of replicability removed.
Yes, that is correct. However, as I have repeatedly pointed out, several problems preclude this:
- Very few people are capable of doing it.
- It takes a year or two.
- It costs a lot of money to do.
- No one I know wants to do it.
I do not know what experiment the people at Google did. The Nature paper did not describe it in enough detail. They might have tried this experiment, but I do not know whether they followed the procedures described by Storms and Cravens, or how closely they might have followed them. Storms and Cravens never heard from the people at Google, so they do not know either.
I have a feeling the people at Google did not follow these protocols. When I asked them, they did not answer, but they seemed to have a low opinion of the protocols. I got the impression they did not think the protocols work, though they did not say that directly.