JedRothwell Verified User
  • Member since Oct 11th 2014

Posts by JedRothwell

    The D2 artifact issue is not addressed. It is stated as fact that this is not a problem, without evidence or rationale.

    Everyone knows that the mass spectrometer must have sufficient resolution to separate these two. Every researcher addresses this. If this particular paper does not, you can be sure others from these authors do. As Ed points out, you are saying that experts do not know the fundamentals, and they make mistakes that only an undergraduate would make.


    You here are criticising me with a straw man argument. I never said this paper claimed that. Rather, when I look at all of the LENR evidence, I look for coherence, or lack of coherence. I am stating that the tritium results are much too low to explain excess heat - which matters because if they were comparable then these two apparently different (unless we have LENR) observations would be coherent.

    Coherence is a matter of your opinion, based on your ideas about theory. Cold fusion cannot yet be explained by theory. It is an experimental claim. The tritium is real. That is an experimentally proven fact. You have found no reason to doubt the tritium, or for that matter the heat, or helium. A reason can only be an error in the experimental technique or the instruments. The fact that you find it incoherent is not a valid reason to dismiss experimentally proven facts. Your personal ideas about theory do not overrule instrument readings.


    The tritium does not have to "explain" anything. It exists. That is an irrefutable fact. You can confirm it in this paper, or in dozens of other papers. Tritium can only be the product of a nuclear reaction. Therefore, cold fusion is a nuclear reaction.


    What you, or I, or anyone says about theory or coherence can never be a reason to reject replicated, high sigma experimental results. That is fundamental to the scientific method.

    no mention of battery storage

    Good point. They should have mentioned it. California and other states are rapidly increasing battery storage. See:


    Solar and battery storage to make up 81% of new U.S. electric-generating capacity in 2024 - U.S. Energy Information Administration (EIA)


    Battery storage. We also expect battery storage to set a record for annual capacity additions in 2024. We expect U.S. battery storage capacity to nearly double in 2024 as developers report plans to add 14.3 GW of battery storage to the existing 15.5 GW this year. In 2023, 6.4 GW of new battery storage capacity was added to the U.S. grid, a 70% annual increase.


    Texas, with an expected 6.4 GW, and California, with an expected 5.2 GW, will account for 82% of the new U.S. battery storage capacity.

    WaPost:


    Rooftop solar panels are flooding California’s grid. That’s a problem.


    As electricity prices go negative, the Golden State is struggling to offload a glut of solar power


    Here is a shared link:


    https://wapo.st/3U3zAyg


    QUOTE:


    In sunny California, solar panels are everywhere. They sit in dry, desert landscapes in the Central Valley and are scattered over rooftops in Los Angeles’s urban center. By last count, the state had nearly 47 gigawatts of solar power installed — enough to power 13.9 million homes and provide over a quarter of the Golden State’s electricity.

    But now, the state and its grid operator are grappling with a strange reality: There is so much solar on the grid that, on sunny spring days when there’s not as much demand, electricity prices go negative. Gigawatts of solar are “curtailed” — essentially, thrown away.

    In response, California has cut back incentives for rooftop solar and slowed the pace of installing panels. But the diminishing economic returns may slow the development of solar in a state that has tried to move to renewable energy. And as other states build more and more solar plants of their own, they may soon face the same problems. . . .

    EIA graph in article:

    https://www.eia.gov/todayinenergy/detail.php?id=56880

    Taking the author's speculations at face value, we have 10^15 T atoms generated in 400 hours or so. Supposing 4 MeV for D+D -> T + H, we have 1.6*10^-19 * 4*10^6 * 10^15 = 640 J => 0.5 mW over the period. Much, much smaller than the claimed excess heat in other such experiments.
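    The quoted arithmetic can be verified directly. This is a minimal sketch using only the figures stated in the quote (10^15 atoms, 4 MeV per reaction, roughly 400 hours):

    ```python
    # Back-of-the-envelope check of the quoted tritium energy estimate.
    # All inputs are taken from the quote: 1e15 T atoms, 4 MeV per
    # D+D -> T + H reaction, produced over roughly 400 hours.
    EV_TO_J = 1.602e-19                # joules per electron-volt
    atoms = 1e15                       # tritium atoms produced
    energy_per_reaction_ev = 4e6       # 4 MeV, in eV
    seconds = 400 * 3600               # 400 hours, in seconds

    total_joules = atoms * energy_per_reaction_ev * EV_TO_J
    average_watts = total_joules / seconds

    print(f"{total_joules:.0f} J total, {average_watts * 1000:.2f} mW average")
    ```

    This reproduces the quoted figures: about 640 J total, or roughly half a milliwatt averaged over the run.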

    Anyone who has read the literature will know that the tritium is about a million times too small to explain the heat. No one has ever suggested that the tritium can explain the heat. THH's statement is either grossly ignorant, or it is trolling.

    Wolfe understood Kramer to be saying, in Wolfe’s words, “In short: frankly, these days, without a theory to go with it, I can’t see a painting.”

    That is perhaps an exaggeration.

    “. . . to lack a persuasive theory is to lack something crucial—the means by which our experience of individual works is joined to our understanding of the values they signify.”

    Perhaps what he had in mind is that without knowledge of the place, time and culture it can be difficult to know what some paintings signify. Especially the allegorical ones, such as Delacroix's "Liberty Leading the People" (1830) or Picasso's "Guernica." Imagine you are trying to explain Delacroix to a Japanese person in 1830 who has no knowledge of the French Revolution. What would he make of this? He would probably see that it depicts a war, but which side does this represent, and what is that half-naked woman doing? Is it supposed to be a joke? Or erotic? Is the artist in favor of these people or against them? It might not seem patriotic in the sense Delacroix had in mind. I know that Japanese people first exposed to western paintings in the Meiji era had very different ideas about what the paintings meant, whether they were beautiful or ugly, and what on earth all those naked people were doing. Naked people everywhere! -- as I recall Thomas Eakins wrote when he first went to European museums. He was not impressed.



    Just offhand, my guess is that a Japanese person in 1830 might think that woman is a demon or a badger transformed into a person, or a ghost, and she is up to no good. She is leading those people into some sort of trap. Half-naked beautiful women in Japanese folk tales often turn out to be demons leading men to their demise. If you see an allegorical Japanese painting from 1830 with a figure like this, beware of her. Here is a famous example, the Snow Woman, who is often portrayed more erotically than this. She entices lost men, and then freezes them to death. A European or an American who cannot read the caption, which is "yuki onna" (Snow Woman), might not see anything frightening about this image, but anyone who can read it and knows Japanese folklore will know She Is Out To Get You.


    Maybe nuclear is not the lost cause she thinks?

    I think it is a lost cause. It is obsolete, and too expensive. It is also dangerous, whereas there is no danger from wind and solar, which are presently the only kinds of new generators the power companies want.


    Solar and battery storage to make up 81% of new U.S. electric-generating capacity in 2024 - U.S. Energy Information Administration (EIA)


    I am not saying nuclear power cannot be fixed no matter what. It can probably be improved. It could be made safer and cheaper. But I think it is too late for that. Once a technology falls behind the competition, it can rarely be improved enough to compete again, because the rival technologies are also improving, and they are getting more money. You see that pattern often, especially in big-ticket technology such as railroads, ships, generators and so on. When steam locomotives were losing to Diesels in the 1930s, and even up to the 1950s, some people made steam turbine locomotives. It was a last-ditch effort to keep external combustion locomotive technology alive. Steam turbines were well developed for ships and generators, but this effort failed.


    Pebble bed nuclear reactors and other radical new designs might be competitive. I cannot judge.


    Here is a shared link to a long article about conventional nuclear energy:


    Opinion | Tripling the World’s Nuclear Energy Capacity Is a Fantasy (Gift Article)
    The nuclear industry has a long history of failing to deliver on its promises.
    www.nytimes.com


    QUOTE:


    . . . [P]ledging to triple nuclear capacity by 2050 is a little like promising to win the lottery. For the United States, it would mean adding an additional 200 gigawatts of nuclear operating capacity (almost double what the country has ever built) to the 100 gigawatts or so that now exists, generated by more than 90 commercial reactors that have been running an average of 42 years. Globally it would mean tripling the existing capacity built over the past 70 years in less than half that time in addition to replacing reactors that will shut down before 2050.

    The Energy Department estimates the total cost of such an effort in the United States at roughly $700 billion. But David Schlissel, a director at the Institute for Energy Economics and Financial Analysis, has calculated that the two new reactors at the Vogtle plant in Georgia — the only new reactors built in the United States in a generation — on average, cost $21.2 million per megawatt in today’s dollars — which translates to $21.2 billion per gigawatt. Using that figure as a yardstick, the cost of building 200 gigawatts of new capacity would be far higher: at least $4 trillion, or $6 trillion if you count the additional cost of replacing existing reactors as they age out.

    For much less money and in less time, the world can reduce greenhouse gas emissions through the use of renewables like solar, wind, hydropower and geothermal power, and by transmitting, storing and using electricity more efficiently. . . .
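    The quoted cost arithmetic checks out. This is a minimal sketch using only the figures stated in the quote ($21.2 million per MW from the Vogtle reactors, 200 GW of proposed new capacity):

    ```python
    # Checking the quoted nuclear cost arithmetic. All figures are from
    # the quoted article: $21.2 million per MW (Vogtle average) and
    # 200 GW of proposed new U.S. capacity.
    cost_per_mw = 21.2e6                  # dollars per megawatt
    cost_per_gw = cost_per_mw * 1000      # $21.2 billion per gigawatt
    new_capacity_gw = 200
    total_cost = cost_per_gw * new_capacity_gw

    print(f"${total_cost / 1e12:.2f} trillion")   # about $4.24 trillion
    ```

    That is where the article's "at least $4 trillion" figure comes from; the $6 trillion figure adds the cost of replacing existing reactors as they retire.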


    An AI does a much better job, in response to a query such as:


    "What models have been proposed to explain the role of high loading in bulk-Pd D2O cold fusion?"

    Okay, I think a dedicated bot with a database of cold fusion papers would do a better job. The general purpose ChatGPT bot does not do a good job addressing this question. It says:



    The phenomenon often referred to as "cold fusion" in the context of palladium (Pd) and heavy water (D2O) has been a subject of controversy and interest since it was first reported in 1989. Various models have been proposed to explain observations of anomalous heat and nuclear products in such systems, particularly under high loading conditions where the palladium lattice is highly saturated with deuterium. Here are some of the main models that have been proposed:

    1. Electron Screening Model: This model suggests that electrons in the palladium lattice reduce the Coulomb barrier between deuterium nuclei (deuterons), facilitating nuclear reactions at lower energies than would normally be required in thermonuclear fusion. The dense electron cloud surrounding the palladium lattice is thought to partially shield the positive charges of the deuterons, allowing them to come close enough to undergo fusion.

    2. Boson Condensation Model: In this model, deuterium loaded into a palladium lattice is proposed to form a Bose-Einstein condensate. This quantum state could potentially enable nuclear reactions at room temperatures due to quantum mechanical effects that allow overlapping wave functions of deuterons. The theory suggests that under certain conditions, a collective state can be achieved where multiple deuterons occupy the same quantum state, leading to an increased probability of tunneling through the Coulomb barrier.

    3. Quantum Mechanical Tunneling: This approach focuses on the quantum mechanical property of tunneling, where particles traverse energy barriers that they classically should not be able to overcome. High loading of deuterium in palladium increases the probability of deuterons being close enough to tunnel through the Coulomb barrier and fuse, despite the low kinetic energies involved at room temperature.

    4. Fracto-fusion Models: These models propose that micro-cracks and imperfections in the palladium lattice under high deuterium loading conditions could create localized sites where fusion is more likely to occur. The idea is that mechanical stress and the resulting fracturing could generate hot spots or unusual conditions conducive to fusion.

    5. Metal Hydride Models: In this scenario, the unique properties of palladium hydride and deuteride under high loading conditions alter the nuclear properties of deuterium atoms embedded within the lattice. These altered properties might include changes in the electron screening potential or modifications of the deuterium nuclear states, potentially leading to enhanced fusion rates.

    These models aim to address the key challenge in cold fusion: explaining how nuclear reactions typically requiring high energies and temperatures could occur at room temperature and without significant radiation, as traditional nuclear reactions would predict. Despite ongoing research, cold fusion remains a contentious topic in the scientific community, with many experiments failing to reliably reproduce the initial claims, leading to skepticism about the feasibility of these processes as described. The models are largely theoretical and have not been conclusively proven or widely accepted in the scientific community.


    Knowing what is known is important.

    Yup. And an AI can be a big help with this.


    However, this is not the problem. The problem starts when the behaviors are combined to create a model explaining their relationship to the fusion reaction.

    Today's AI itself cannot create a model. However, if an author somewhere proposed a model, you -- the reader -- probably did not hear about it, because there are so many authors and so many models in the literature. So, if you ask the AI to find a model with thus-and-such characteristics, that addresses this-and-that experimental data, the AI might find what you are looking for. It will find it much faster and more easily than you would by manually slogging through the papers.


    I might find it with my local computer text search program FileLocatorPro. It indexes every paper in the LENR-CANR.org collection plus the Storms collection papers. The problem is that it finds far too much. In the LENR document file, you put in a search for:


    model NEAR "high loading"


    (Meaning the word "model" near the words "high loading")
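    Proximity ("NEAR") search of this kind is simple to sketch. Here is a rough, hypothetical stand-in in Python; the 10-token window is an arbitrary assumption, not FileLocatorPro's actual default, and unlike a real search engine it matches whole tokens only (so "models" would not match "model"):

    ```python
    import re

    def near(text, word, phrase, window=10):
        """Rough, case-insensitive stand-in for a NEAR proximity search:
        True if `word` occurs within `window` tokens of `phrase`.
        The 10-token window is an arbitrary assumption."""
        tokens = re.findall(r"[a-z0-9]+", text.lower())
        p = phrase.lower().split()
        word_pos = [i for i, t in enumerate(tokens) if t == word.lower()]
        phrase_pos = [i for i in range(len(tokens) - len(p) + 1)
                      if tokens[i:i + len(p)] == p]
        return any(abs(a - b) <= window for a in word_pos for b in phrase_pos)

    # Example: does "model" appear near the phrase "high loading"?
    hit = near("the equation of state model applies at high loading",
               "model", "high loading")
    ```

    A real indexed search tool does the same thing over a pre-built word-position index rather than re-tokenizing each document.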


    It comes up with 21 papers, with quotes such as:


    "2. Equation of State Models for H2


    As discussed in Section 1, our understanding of the chemical potential of hydrogen in palladium hydride at high loading requires an understanding of the equation of state of gaseous and liquid hydrogen at high pressure. In connection with the discussion below, we recall the ideal gas law written as . . ."


    BiberianJPcondensed.pdf


    It gives too much unorganized information. An AI does a much better job, in response to a query such as:


    "What models have been proposed to explain the role of high loading in bulk-Pd D2O cold fusion?"

    Additionally, we applied unsupervised machine learning algorithms to the data to generate clusters of publications based on semantics and other features. Through interactive interfaces, we enable targeted investigation of specific reported phenomena.

    That sounds do-able. That is the kind of thing computers excel at. Especially AI computers.


    By moving beyond isolated results to higher level knowledge, this work aims to advance the field by providing revealing relationships in the collective evidence.

    Well . . . a generative ChatGPT style AI is like Hamlet's ghost. It can never tell you something you don't know already. Or I should say, it can only tell you something that someone, somewhere, knew and wrote down. It cannot have an original thought, or synthesize new knowledge. It is a superfast idiot savant assistant who actually understands nothing.


    Other AI models and future AI will generate original ideas and come up with connections that people did not.


    An AI can reveal relationships to you because you cannot possibly read all of the literature, or remember every detail, or see every connection that the authors and their words reveal. That is extremely valuable. For that matter, an old fashioned printed paper encyclopedia was valuable. A Google Search is valuable. An AI is even better because it addresses your question by finding what you want to know and excluding information you are not looking for. An encyclopedia may have many paragraphs describing aspects of your question that you don't want to read, and it may not have the part you are looking for.


    Our preliminary results and the first version of our tools will be useful for scientists already in the LENR field, as well as for those considering research on the topic.

    I hope so. Scientists apparently did not find my AI useful. Or perhaps they just did not think to use it.


    People sometimes fail to make use of new technology. They do not realize what advantages it has. They do not know how to use it.


    When the first public telegraph was installed between Washington DC and Baltimore in 1844, no one used it for weeks because no one understood the benefit of instant communication. Finally, people began using it for stock market quotes, where knowing something first is a big advantage. It became very popular. Within a few years, telegraph companies were big business.


    The U.S. Navy lost the Battle of Savo Island in part because officers and ship captains did not understand radar, and did not take advantage of it. Granted, it was imperfect and difficult to use. See:


    Reversal of Fortune
    How the bitter defeat at the Battle of Savo Island in 1942 inspired a tactical reset that led to U.S. Pacific victories in 1943.
    www.usni.org


    QUOTE:


    "The maritime environment in the South Pacific, which differed greatly from the North Atlantic, where initial SC radar testing had been accomplished, also significantly reduced detection ranges and sensitivity, unbeknownst to U.S. commanders. Lack of knowledge about radar meant a lack of trust among more senior naval officers throughout the subsequent battles around Guadalcanal. Even Commander Arleigh Burke, who would learn to capitalize on the radar advantage, hesitated in his first action against the enemy because of his mistrust of the information provided by his radar operator."

    The best flow calorimeter I have seen is the Jacques Ruer design used in J.P. Biberian's lab.

    That is impressive, especially for an air flow calorimeter.


    Any flow calorimeter will be well insulated. Thus, most of the heat is captured by the flowing fluid. The heat capacity of the fluid is well established, so you can easily estimate total heat capture based on first principles. It is easier to understand than isoperibolic or Seebeck calorimetry, and it is less dependent on calibrations. Mike McKubre has often pointed this out.
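    The first-principles estimate described above reduces to power = mass flow rate × specific heat × temperature rise. A minimal sketch, with illustrative made-up numbers rather than figures from any specific experiment:

    ```python
    def flow_power_watts(mass_flow_kg_s, specific_heat_j_per_kg_k, delta_t_k):
        """First-principles heat capture in a flow calorimeter:
        power carried away by the coolant is
        mass flow rate x specific heat x temperature rise."""
        return mass_flow_kg_s * specific_heat_j_per_kg_k * delta_t_k

    # Illustrative numbers only: 1 g/s of water (c_p ~ 4186 J/(kg*K))
    # warming by 2 K carries away about 8.4 W.
    captured = flow_power_watts(0.001, 4186.0, 2.0)
    ```

    Because the specific heat of water is so well established, the only quantities that need careful measurement are the flow rate and the temperature rise, which is why this method depends less on calibration than isoperibolic or Seebeck calorimetry.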


    For these reasons, any experiment with good flow calorimetry and significant excess heat, even 5% of input power, will exceed the upper limit for heat losses. Obviously, any experiment with no input power and detectable heat also exceeds these limits, with any kind of calorimetry. There are many experiments like that in the literature. THH could have found many on his own. Or he might have read and analyzed McKubre, the way he said he would a year ago. Of course he never will do that. Now that he has invented the imaginary problem of heat recovery he will go on claiming it is real, and he will never cite an actual example or read any of the papers I tell him show it is not real. He is a troll.


    Why not read the tritium paper that Jed already suggested?

    Well, he has said that the tritium is contamination, but in this case he is looking for evidence that the heat is an artifact of heat losses. It would be easiest to spot that with a flow calorimeter. There are many papers about flow calorimeters, such as McKubre's. I have often suggested THH look at McKubre, but he never has looked, and he never will.


    You can also prove this with any cell that has no input power, such as a cell in heat after death or a gas loaded cell.


    It is somewhat more difficult to prove this with an isoperibolic cell with input power, but you can with first-principles analyses such as those of F&P or Miles.

    Link to specific experiment please, and I will tell you what is your mistake. I will not spend a few hours searching many papers, so it will need to be precise.

    Take your pick. Most papers with significant excess heat far exceed heat losses. As I said, you can ignore the heat losses (assume 100% recovery) and there is still abundant excess heat.


    I will not spoon feed you the papers. You have shown dozens of times that you will not read them even if I do.


    You will not tell me what is my mistake. You have never done that. Not once. You will invent some preposterous hypothesis that anyone can see is wrong, and you will ignore overwhelming evidence that it is wrong, such as the fact that excess heat is correlated with helium and your pretend errors cannot possibly affect both a calorimeter and three helium detectors.

    But it is always a matter of subjective judgement how much weight you give to unusual non-mainstream interpretations as against the boring default view. Some people are more disposed to do this than others - and surely they would be more inclined to see LENR as real.

    You have that backward. As Martin Fleischmann said, we are painfully conventional people. We see that cold fusion is real because we believe in the laws of thermodynamics; we know that calorimeters invented 150 years ago work as described; we know that results of 300% excess heat cannot be explained away by heat loss errors of 1%; and we know that an error in a calorimeter cannot magically cause an error in helium detectors a thousand miles away. We know that widely replicated high sigma results are the standard of truth in science, whereas you frantically look for some other standard -- any other standard! -- rather than accept the textbook definition of the scientific method. Cold fusion being real is the boring, standard, default conclusion made by conventional experts who look at the facts, such as Gerischer, the Director of the Max Planck Institute for Physical Chemistry. He reviewed the evidence in 1991 and concluded “there [are] now undoubtedly overwhelming indications that nuclear processes take place in metal alloys.” (https://lenr-canr.org/acrobat/GerischerHiscoldfusi.pdf)


    YOU are the one with the overactive imagination. You connect invisible dots to come up with extravagancies and absurdities, as Franklin put it:


    "Perhaps the history of the errors of mankind, all things considered, is more valuable and interesting than that of their discoveries. Truth is uniform and narrow; it constantly exists, and does not seem to require so much an active energy, as a passive aptitude of soul in order to encounter it. But error is endlessly diversified; it has no reality, but is the pure and simple creation of the mind that invents it. In this field, the soul has room enough to expand herself, to display all her boundless faculties, and all her beautiful and interesting extravagancies and absurdities."


    -- Benjamin Franklin, Report of Dr. Benjamin Franklin, and Other Commissioners, Charged by the King of France, with the Examination of the Animal Magnetism, as Now Practiced in Paris (1784)