Posts by Zephir_AWT

    Although mainstream physicists maintain that everything around us is driven by thermodynamic laws and spontaneously increasing entropy (which defines the so-called entropic arrow of time), in the real world around us we can find many examples of apparent violations of the entropic arrow of time. In my theory, overunity arises during the thermalization of negentropic phenomena, such as overheating or oversaturation, which violate the entropic arrow of time by forming metastable states. The magnetization of ferromagnets (especially those in the form of thin needles or whiskers, where the ferromagnetic domains have no room to reorient) is one example of a metastable state. The formation of metastable undercooled atomic hydrogen in an electric arc, or the NMR excitation of carbon or iron nuclei in a magnetic field, are other examples of such negentropic phenomena, whose thermalization may be assisted by vacuum fluctuations.

    In my understanding this principle of energy production isn't even very new, because the energy released during nuclear fusion or fission also exploits the metastable states of matter left over from the non-equilibrium explosions of stars as supernovae. The atomic nuclei we are made of want to fuse or split, and only a small energy input (introduced during the implosion of a nuclear bomb, for example) is needed to release a much larger amount of energy on demand by making the process more reversible. The general way to make a metastable system reversible is to establish or enforce lower- or higher-dimensional routes for energy sharing and thermalization within it.

    If I recall correctly the recombination of atomic hydrogen is one of the processes able to make hydrogen atoms transition to the Hydrino form, releasing energy with a positive balance

    Yes, there is a long essay by Nicholas Moller on Langmuir's study of the atomic hydrogen torch. It appears likely that this was the first discovery of excess energy from a plasma discharge. But such excess energy has been observed in other systems, even those involving rare gases (Papp's engine), which are already monoatomic.

    traditional physicists will spend the rest of their lives juggling reaction formulas and reaction chains, inventing new particles just to maintain a zero energy balance.

    I can agree, but Randell Mills is no different in this respect. Occasionally these physicists were proven right; for example, the proposal and discovery of the neutrino was based solely on energy balance. Ironically, it is again the neutrinos which violate this balance in the case of solar neutrinos. It's worth noting that Langmuir's experiments were quite extensive and thorough, done in evacuated vessels and monitored with spectra. The formation of hydrino would hardly have evaded attention there. The factor speaking for a hidden particle is that the latent energy of hydrogen recombination can be transferred from place to place. At this moment the hydrino hypothesis is still in the game, but my growing suspicion is that we are actually facing some overunity effect here. My guess is that the probability of fusion is 20%, hydrino 30%, but overunity 50%.


    I imagine flu is also related to cold weather, like high wheat prices are.

    Historical sources noted that outbreaks of the Plague were caused by foul-smelling "mists". Those mists frequently appeared after unusually bright lights in the sky, i.e. auroras. IMO solar storms induce the nucleation of water vapor and the formation of smog, i.e. many tiny droplets, inside which viral particles can spread more easily. A high smog concentration also results in "foul-smelling mists", as medieval cities like London were already flooded with coal stoves.

    Attacks on LENR can be explained by many factors, from pathetic errors, experimental difficulties, the wrong category of experimenters used as references, patent wars, budget fights, theoretical groupthink, incoherent answers to theoretical questions, awful communication, community battles, the involvement of politicians, journalistic pride...

    These explanations may apply to the dismissal or ignorance of any other finding, but we can note that the intensity of denial is proportional to the number of people threatened by the finding. Cold fusion competes with research into many methods of energy production, conversion, transport and/or storage, which would be replaced once fusion succeeds, and energy research (from nuclear or wind plants to batteries) accounts for more than half of contemporary applied research. Paradoxically, the plurality of energy research represents the largest obstacle to introducing really effective solutions, because too many people in this research just want to keep their jobs.


    I think you might be mistaking principal quantum number n, which defines the excitation or the distance

    Yes, that's correct. The l quantum number increases the number of nodes of the quantum wave along the perimeter of the orbital, not its diameter. But as long as the Coulomb force constant doesn't change, this requires the diameter of the electron orbital to increase accordingly.


    Except that all cold fusion researchers exhibit that very same pattern, including the most trustworthy ones. Even Hagelstein & Swartz don't offer their NANOR commercially anymore.

    This is because the premature revelation of IP can be as damaging to its future market as losing the support of investors is in the case of delayed research.

    Therefore the people who have no working cold fusion technology yet behave in the same way as the successful ones, who hide their results for fear of competition.

    [Attached images: uyP0Mww.png, v2hmuyt.gif]

    Note that the Gartner hype curve pictured at the left is remarkably similar to the dark matter profile within galaxies, and IMO this is typical behavior for findings which utilize extra dimensions (more or less hidden connections) of established paradigms and thus behave like the dark matter of the mainstream.

    Findings and improvements which utilize already well-accepted and established paradigms exhibit much more convergent expectation curves, similar to the potential of a normal gravity field.

    For me the recipe for successful formation of a Rydberg condensate must be exactly the opposite: i.e. the simultaneous excitation of many atoms to the same energy level and quantum number, so that they can remain mutually entangled in such a way that the energy levels of their mutual interaction remain lower at all times than the steps between the ionization energy levels of the individual atoms. Every atom at a different energy level would act like a poison of the Rydberg state and destabilize the condensate in a wide neighborhood around it. This leads to the isothermal excitation of a very cold boson condensate established from its very beginning. Actually, the existing mainstream experiments mostly use a similar strategy, based on careful excitation of well-cooled boson condensates with chirped microwave pulses of gradually decreasing frequency. Such chirped pulses could be established inside plasmoids formed by laser pulses, which would behave like fast-expanding resonators.
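    As a side note, a pulse of gradually decreasing frequency (a linear "down-chirp") is straightforward to sketch numerically. The following Python sketch is illustrative only; the frequencies, duration and sample rate are arbitrary placeholders, not parameters of any actual experiment:

```python
import numpy as np

def down_chirp(f_start, f_end, duration, sample_rate):
    """Linear down-chirp: the instantaneous frequency sweeps from
    f_start down to f_end (in Hz) over `duration` seconds."""
    n = int(duration * sample_rate)
    t = np.arange(n) / sample_rate
    k = (f_end - f_start) / duration           # sweep rate, Hz per second
    # Phase of a linear chirp: 2*pi*(f_start*t + 0.5*k*t^2)
    phase = 2 * np.pi * (f_start * t + 0.5 * k * t**2)
    return t, np.cos(phase)

# Illustrative values only: sweep from 10 Hz down to 2 Hz over one second.
t, pulse = down_chirp(f_start=10.0, f_end=2.0, duration=1.0, sample_rate=1000.0)
```

    The same phase construction works at any frequency scale; only the sweep rate k changes.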

    The l quantum number generally increases the distance of the electron from the center of the atom, i.e. its excitation energy. The problem here is that the more distant the electron gets from the proton, the less stable its path becomes, because the steps between the ionization energy levels decrease with distance. Another problem is that such an electron becomes more susceptible to spontaneous interaction with the charge of other electrons or atomic nuclei, because it gets localized and the atom gains a magnetic moment. Therefore Rydberg atoms of higher quantum numbers are not capable of a stable life without a supporting microwave field, and practical attempts at Rydberg atom preparation always consist of an equilibrium between the pumping of atoms inside a well-tuned, stable microwave cavity and their spontaneous de-excitation.
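    For the hydrogen-like case this crowding of levels near ionization is quantitative. With the Rydberg energy R_H of roughly 13.6 eV, the textbook level formula and spacing are:

```latex
E_n = -\frac{R_\mathrm{H}}{n^2}, \qquad
\Delta E_n = E_{n+1} - E_n \approx \frac{2\,R_\mathrm{H}}{n^3},
```

    so the gap between adjacent levels shrinks as 1/n^3 and the upper states become ever easier to perturb.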


    Collisions with neighboring atoms and ions have no hardwired mechanism for increasing the l number and the distance of electrons from atomic nuclei, as these collisions may just as easily result in accelerated de-excitation. On the contrary, in serious, i.e. well-controlled experiments, the successful preparation of Rydberg states requires their purification, i.e. the fast removal, with a magnetic trap, of atoms in lower quantum states, which act as quenchers of Rydberg states within the condensate. Therefore I can see no apparent mechanism for the spontaneous stabilization of Rydberg atoms by surrounding matter once the supporting microwave field is switched off. If these observations are real, then there must be some additional physical trick or mechanism which I'm not aware of at this moment.


    Rydberg hydrogen is the name for excited hydrogen atoms, which can also form in a plasma

    The Rydberg matter which Prof. Holmlid is talking about consists of atoms with circular orbitals, similar to the hydrino or the Bohr model of the atom. They result from careful excitation of atoms in such a way that the electrons remain on the verge of full ionization.

    The formation of such atoms has been described many times; they just require rather complex magnetic trap devices and precisely tuned masers (i.e. sources of the microwave spectrum where these atoms absorb) in order to prepare them in high yield.

    These atoms are characterized by their large size (hundreds of nanometers), because the electrons revolve around the atomic nuclei at a very large distance, like planets in the solar system. Because they propagate slowly, their wave character is suppressed, and these electrons really revolve like particles (followed by their pilot wave of vacuum) rather than like a wave undulating across the whole atom.
    [Attached images: MZGySda.gif, BorromeanRings_500.gif]
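    The quoted size is consistent with the standard n-squared scaling of hydrogen-like orbits. A quick Bohr-model estimate (illustrative only):

```python
# Bohr-model estimate of a Rydberg atom's orbital radius: r_n ~ n^2 * a0.
A0_NM = 0.0529177  # Bohr radius in nanometers (CODATA value, rounded)

def rydberg_radius_nm(n):
    """Approximate orbital radius in nm for principal quantum number n."""
    return n * n * A0_NM

# An atom excited to n ~ 100 is already about half a micrometer across:
print(rydberg_radius_nm(100))  # ~ 529 nm
```

    So "hundreds of nanometers" corresponds to principal quantum numbers around 100.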

    These properties may look exotic to some, but they all follow from classical quantum mechanics, and therefore there is nothing very strange about them from this perspective.

    Therefore what I don't understand is why such atoms should form very compact, dense, stable matter, while they are instead expanded and very fragile, and why such atoms should form at all in crude experiments with laser beams.

    One clue may lie in the fact that the electrons within these atoms are poorly delocalized, so that these atoms exhibit a very strong EM moment. For this reason they should radiate strongly to the outside and be prone to fast decay.

    But the same behavior could lead to strong cohesion of the atoms due to London dispersion forces. Therefore it's possible that these atoms are present inside ball lightning, which is often conspicuous by its orange or red color.

    Due to the low energy of electron transitions within Rydberg atoms, these atoms should also radiate close to the infrared spectrum rather than with a black-body spectrum.

    But even then, both the density and the stability of Rydberg matter are nothing spectacular (ball lightning decays fast), so the Holmlid experiments would require additional physics for their full explanation.

    One clue may follow from the dense aether model, in which entanglement is of a scalar-wave shielding nature and low-dimensional artifacts should get entangled more strongly.

    From the nature of Rydberg atoms it follows that their electrons encircle them along circular paths rather than spheres; these atoms therefore tend to be very flat. Also, the dipole forces mentioned above would be enhanced in the plane of electron rotation.
    Another possibility is that the Rydberg orbitals of these atoms are intertwined in the form of Borromean rings or similar artifacts, which would increase their stability and density even more.

    Such atoms would get entangled within a flat lattice, which could therefore be more dense and compact than normal entanglement allows.
    This is the only possible option, because Rydberg atoms are otherwise sparser than normal atoms, so that they shouldn't form a very dense form of matter.


    We will have to disagree about evolution. And about Popper. If a theory consistently makes non-trivial predictions which are validated it has merit.

    Popper's methodology is completely symmetric on this point: it's just about falsification. The negative hypothesis, i.e. the dismissal of facts like evolution or cold fusion, is also a hypothesis, and as such is subject to its own doubt and falsification. In this respect creationism has no advantage over evolution.

    I'm myself a supporter of evolutionary theory, and I'm even proposing ways in which terrestrial life could emerge from purely physical systems. My ideas are already being gradually vindicated by mainstream science. So far so good. But did such an evolution start only on Earth, then? Why should it, if the universe is much older? IMO such a finding just supports Fred Hoyle's panspermia hypothesis, according to which various viruses rain down from space all the time. He tried to prove this theory with the coincidence of influenza waves and solar cycles pushing viral particles toward Earth. It's just another example of taboo and pluralistic ignorance in science, because no one has ever attempted to extend or even just replicate the research of Hoyle and Wickramasinghe. It was simply ignored.


    IMO terrestrial evolution can still be affected, for example, by viruses raining in from cosmic space, i.e. by a "creationist act" of the rest of the Universe, and I'm not even talking about the multiple pieces of evidence for possible visits of extraterrestrials from the opposite end of the evolutionary chain. Not to mention that Darwinian theory itself, in its consequences, leads to horizontal gene transfer and other features of Lamarckian evolution; every theory therefore has its dual side, which denies its own postulates.


    And again you leave out the possibility that replication attempts fail because the original result was erroneous.

    Failed or not, it must get published in the standard way, or nothing of the sort ever happened and pluralistic ignorance takes place. Work, finish, publish.

    I'm aware that mainstream journals avoid publishing negative results, but such an ignorant attitude is already an intrinsic part of the mechanisms of pluralistic ignorance.


    I'm not sure I understand how the rest of your post applies to LENR

    The ignorance of cold fusion has far more general roots than just LENR: it's an omnipresent mechanism which applies in nearly all areas of science. If you want to understand it, then analogies outside the scope of LENR research may be useful.

    But, surely, that could equally be just that the claims are too weak to pass peer-review?

    This consideration is already contained in pluralistic ignorance being subjectivist. The existence of a replication attempt indeed doesn't imply any conviction about the factual existence of the effect in question.
    Such a replication attempt can even be completely dismissive, i.e. with a negative result, but it must be done and published. No attempt means no actual interest in the subject.

    The actual conviction can come much later. After all, even now, 200 years after Darwin, about 60% of US citizens doubt evolution, and this is even quite a correct stance.

    You should never be sure of or satisfied with any theory or observation, as Popper's methodology teaches us. The methodology of science is about falsification, not confirmation of theories.

    But the absence of falsification cannot be substituted by anything in it, and this is exactly what can be interpreted as pluralistic ignorance.

    Why did scientists refuse to look through Galileo's telescope? Did they do so because his claims were considered too weak in his time?

    Nope, the causality is reversed: they refused to look for evidence because they just wanted it to remain weak. And this is indeed the difference.


    How can you distinguish between what you imagine - generally dismissive attitude for their replications - and real scientific uncertainty caused by results

    It's actually very simple: pluralistic ignorance can be objectively measured as the temporal delay between the announcement of a finding and its first published replication attempt. The disinterest of mainstream science can analogously be measured as the delay of the first peer-reviewed publication. By this metric, the verification of the heliocentric model was delayed by 160 years, the replication of overunity in an electrical circuit by 145 years (Cook, 1871), the cold fusion finding by 90 years (Paneth/Peters, 1926), the Woodward drive by 26 years and the EMDrive by 18 years. Please note that the finding of graphene, for example (which also wasn't expected in any way), was immediately replicated in hundreds of labs across the world, and after six years it was awarded the Nobel prize.
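    The metric itself is simple arithmetic. A minimal sketch, reusing only the years asserted above (which are the post's own claims, not independently verified here):

```python
# Replication-delay metric described above: years between the announcement
# of a finding and its first published replication attempt.
def replication_delay(announced_year, first_replication_year):
    return first_replication_year - announced_year

# Figures reproduce the years asserted in the post:
print(replication_delay(1871, 2016))  # Cook's circuit claim -> 145 years
print(replication_delay(1926, 2016))  # Paneth/Peters -> 90 years
```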

    BTW Dr. Michael Dittmar, researcher at CERN, talks about the energy crisis and Big Oil

    Mainstream physicists often accuse the Big Oil lobby of low investment in hot fusion, renewables or nuclear research, but nothing would make them interested in cold fusion anyway.

    Each party in the energy research club simply follows its own particular interests instead of the interests of the taxpayers who are paying for all this fun; this is the whole problem.


    I just said that a reference to lack of evidence cannot serve as evidence of the opposite, but rather as evidence of pluralistic ignorance. For example, the ENEA lab achieved 70% reliability of cold fusion with their palladium samples in 2009; by now it will probably be even higher.

    The yield in CPU production, for example, is generally lower than 70%, and yet nobody doubts that CPUs exist. Piantelli reported nearly 100% reliability of LENR with his technology (nickel whiskers). Szpak also described co-deposition as a very reliable system.

    Yet no one has attempted to replicate his iconic observation of cold fusion with a thermocamera; is that really the problem of Szpak, or even of LENR itself? Of course not; the problem is in the scientific community itself. Direct observation with a thermocamera is more apparent evidence of cold fusion than many people are willing to admit.

    The general understanding of scientists' motivation for dismissing LENR is not very deep even here, at the LENR forum. In my opinion this motivation is jealousy and fear of losing social credit and informational monopoly; all other well-meaning opinions are just evasions of this stance. Yes, LENR generally lacks a reference system, but only because of the generally dismissive attitude toward its replications. Fully working nuclear bombs were developed in just five years from the first observation of nuclear fission in a test tube, but that research was governmentally supported and got full financial and intellectual backing (for example, the US government supplied its strategic mintage reserves of silver for the calutron windings).

    This is what a real research effort means, and such an effort would soon yield tangible results too. Any other approach is just a silent delay of progress in an effort to prolong the existing status quo.


    TEs, that Rossi (Leonardo Corp, Leonardo Technologies Inc.) provided the DOE (Dept of Energy), were in fact *never* tested.

    This doesn't prove Rossi's guilt at all. Such testing is usually part of a contract, which was never signed. This is like saying that the 1 MW unit provided to IH was never tested; why should it have been, if it was completed only on IH's demand?

    These technologies aren't borrowed from stockpiles, but developed according to customer needs.

    LENR implies an alternative scientific paradigm that is just as persecuted as religious heresy and perversion have been in the worst historical examples of this reaction. It is almost as if the currently established scientific beliefs are religious in nature, and these divinely revealed beliefs must be protected with extreme zeal, where the ends justify the means, just as divine revelation must be protected from the mischief inspired by the devil. The opposition that Galileo Galilei, Nicolaus Copernicus and Charles Robert Darwin faced is born anew in the religious frenzy that LENR inspires, with as much venom and misdirection as ever in the dawning of this new LENR age.

    Nuff said. [Attempt at doxxing a forum member removed.] I just find it ironic when people who lie about their identity get impertinent enough to accuse others of fraud and lies.

    My stance is that public critics of publicly known persons should also remain publicly known themselves, or their opinion has no merit on ethical grounds.

    Please respect other forum members' desire for anonymity. Eric


    The reason the NANOR does not work for this purpose is that it is very difficult to do reliable calorimetry on such small qtys of power

    The idea of NANOR is to do the calibration in situ with current pulses, which change its heat production in a defined way, and which is claimed to be the most reliable approach.


    The topic we were addressing: is there a "lab rat" experiment that can be used to gradually characterize LENR?

    Nanortech uses exactly the same term on its web page. This page also says that "Nanortech anticipates it will be setting up a pre-order list by Fall 2016. Unfortunately, there is not at present the capacity to make these components generally available in the short term".

    The persistent problem with cold fusion research is the low quality of replications.

    That is to say, the researchers are overly inventive, and even in the case of replications they attempt to simplify or modify the original protocol.

    The specific situation with cold fusion, which has strong economic incentives, is that every slightly useful piece of know-how gets classified by the researchers themselves, which indeed doesn't help the situation.

    Both of these factors make cold fusion research look even less reproducible than it really is.


    Rather like democracy, the forum discussion process is not perfect, but right now - like democracy- it's the best bad model we have

    I didn't comment on the discussion forum as such, but rather on its linear structure. For example, at reddit the threads have a tree-like structure, which has the advantage that various frog-and-mouse battles between small groups of people soon disappear from the sight of most forum users, because the number of nested levels rendered on the page by default is limited.
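    A minimal sketch of such depth-limited rendering (the post structure and field names here are hypothetical, not reddit's actual data model):

```python
# Depth-limited rendering of a threaded (tree-like) forum, as described
# above: replies nested deeper than max_depth are collapsed from view.
def render(post, depth=0, max_depth=3):
    """Return the visible lines for a post tree; deep subthreads collapse."""
    lines = ["  " * depth + post["text"]]
    replies = post.get("replies", [])
    if depth + 1 >= max_depth and replies:
        lines.append("  " * (depth + 1) + f"[{len(replies)} more replies hidden]")
    else:
        for r in replies:
            lines.extend(render(r, depth + 1, max_depth))
    return lines

thread = {"text": "OP", "replies": [
    {"text": "reply", "replies": [
        {"text": "nitpick", "replies": [{"text": "flame", "replies": []}]}]}]}
print("\n".join(render(thread)))
```

    With max_depth=3, the fourth-level "flame" never reaches the default view; casual readers see only the top of the thread.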

    That is to say, a linear forum isn't a good engine for brainstorming once more people get involved at the same moment.

    Of course, in the case of lengthy threads, the problem of navigating the history and repeating the past reemerges even in a branched forum, once it gets paged due to its size.

    Another particular problem of this forum is its rather bad full-text search. For example, some time ago I posted a list of links here; now I'm unable to find it on this forum.

    The forum has no feature for exporting the complete list of posts, so I'm forced to link it from another site or from my private archive of posts.


    Not really on topic

    On the contrary, it looks like a perfect improvement of the navigation structure.

    It's indeed possible: the surface tension of small grains exerts a high internal pressure on the lattice, which decreases the melting point of the metal.
    On the other hand, the saturation with hydrogen and the internal pressure of the lattice on its clusters should also get higher for the same reason.
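    The grain-size effect mentioned here matches the textbook Laplace and Gibbs-Thomson relations. For a spherical grain of radius r with solid-liquid surface energy sigma, molar volume V_m, bulk melting point T_m and molar heat of fusion Delta H_f, a standard estimate is:

```latex
\Delta P = \frac{2\sigma}{r}, \qquad
\Delta T_m \approx \frac{2\,\sigma\, T_m\, V_m}{\Delta H_f\, r},
```

    i.e. both the internal pressure and the melting-point depression grow as the grain radius shrinks.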


    Claims of having a lab rat experiment are very different than actually having one. Can you refer us to various groups that have made use of NANOR's kit to gradually characterize LENR? (Preview of your answer, should you answer directly: you can't.)

    Such an objection smells of pluralistic ignorance, which is the red herring behind the century-long dismissal of cold fusion as such by mainstream physics.

    The proponents of this physics claim that cold fusion doesn't exist, because nobody has succeeded with it. ...?!? Well, OK, maybe he succeeded, but it was a long time ago and no one has replicated it yet. Well, maybe it has already been replicated, but it wasn't published in a serious scientific journal! Well, shit, maybe some journals about it still exist, but they're not peer-reviewed...!! ...Jeez, some of them are possibly reviewed, but they're of low impact and they haven't passed scrutiny yet; that's it!!!

    And so on; at the very end, every piece of evidence for cold fusion gets diluted by the unwillingness of physicists to engage in serious replications, and the vicious circle of pluralistic ignorance closes: cold fusion doesn't exist because, you know, no one is willing to try it. But the objective truth cannot depend on our subjective attitude.

    In the dense aether model, the principles of fusion and energy formation are similar at all distance scales, as a geometric effect of the leveling of space-time curvature. From this general perspective, LENR theory is similar to recent models of energy leaking from black holes via low-dimensional jets and wormholes. It's important to realize that the visible matter is metastable: it was formed during the explosions of supernovae in a similar way as black holes were formed during the collapse of quasars. We live and float as a thin layer of chemical elements on a molten iron ball: the elements heavier and lighter than iron/nickel shouldn't be there; they remained only because these explosions weren't fully equilibrial, and they just wait for the opportunity to finish this transformation. Atomic nuclei of different sizes therefore have a tendency to merge and fuse, which is delayed by the surface tension of the strong positive space-time curvature at their surface. Their coalescence would require the temporary formation of thin, narrow necks of the opposite space-time curvature, in a similar way as during the merging of mercury droplets shaken inside a test tube; but in essence it's a negentropic effect resulting from a metastable system.

    I even presume that the overunity effects observed within ferromagnetic systems and elsewhere have a similar origin: the observable matter is metastable in contact with the vacuum, and it tends to equalize its energy content with it. Due to the surface tension of matter, this equalization can run only at the places where this surface gets broken by hyperdimensional interactions, in a similar way as a black hole evaporates only at the places where the event horizon gets scratched. At the very end, we are generating energy just by accelerating the evaporation of matter temporarily formed during gravitational collapse back into radiation, by punching wormholes of negative entropy and space-time curvature into it.


    Cold fusion has many possible explanations, only some of which involve fusion

    There are multiple nuclear reactions; for this reason I'm focused just on the molten-lithium/deuteron system, which is free of lattice artifacts and various complexities and which definitely produces helium. Within metal lattices, the 1D effects at the nuclear level can be complemented with other 1D artifacts at higher fractal levels: molecular orbitals, nanocracks, whiskers, laser beams and so on; but low dimensionality is always the key to my approach to LENR catalysis. Fusion can be understood in a general sense, as heat energy formed by combining more/smaller atomic nuclei into fewer or larger ones (i.e. the opposite of fission). It may not involve the strong nuclear interaction.


    Your reply to my points is general

    This is given simply by the broadness and complexity of the subject. The development of LENR theory is like herding cats: I'm struggling to find common ground which everyone could agree on. For the same reason I cannot agree with various AxilAxil claims that nanocracks, surface plasmons, superconductivity, dense hydrogen, Bose condensates, Rydberg matter, monopoles, etc. are THE key to LENR, whereas in particular LENR systems these concepts may be heavily modified or even completely missing. I prefer to explain how these particular insights may be linked with a more general principle, the dimensionality of the system, because this may be the only remaining connecting point of the multiple LENRs. Once we get more detailed, we will lose the general grasp of the subject.

    During research of complex emergent (i.e. hyperdimensional) phenomena, the coherent linear character of discussion forums may be suboptimal, because many useful insights quickly get lost in their history. My experience with overunity forums is that only the first few pages are really contributory; the rest of the thread is merely a reiteration of its beginning. The tree-like organization of discussion threads, with word-map visualization, may be more effective here.

    It's worth noting that the Thermacore replications could be extended with the introduction of all the tricks known from palladium electrolytic systems: the co-deposition of nickel with hydrogen, and the application of a magnetic field and an HF AC field. And nothing prohibits us from also reducing palladium and lithium ions within the system together with hydrogen (which is originally based on a potassium carbonate electrolyte only). Therefore there are still a lot of rather simple and cheap avenues for improvement.

    Another modification may include the use of a molten eutectic solution of lithium hydride instead of aqueous solutions.

    You seem to be arguing that MeV energy photons will somehow be channeled and thermalized by scattering off of electrons in condensed matter, despite the copious evidence that the opposite of this is what generally happens, namely that even heavy metals are largely transparent to photons in this range of energy.

    The copious evidence of cold fusion is exactly the opposite: not only does the Coulomb barrier get broken, but a symmetric process also happens: the resulting energy gets miraculously dissolved within the atomic lattice. We can observe similar behavior in many 2D surface catalysts of 3D reactions: not only is the activation energy lowered, but the energy of the reaction also gets diluted; for example, during the burning of hydrogen on a catalyst, the achieved temperature remains significantly lower than inside a hydrogen flame. Therefore I consider an analogous, just even more pronounced, behavior for a 1D catalytic system, where the resulting energy gets channeled along 1D structures, which exhibit an even higher surface/volume ratio for thermalization than 2D surface catalysts. This situation must happen, or we would have no cold fusion controversy to solve (compare Huizenga's "three miracles of cold fusion").

    Metals are transparent to gamma-ray photons because the atomic nuclei are smaller and of higher energy density than these photons, whereas the de Broglie wavelength and energy density of the electron orbitals are much lower instead. Artifacts of similar energy density and size interact with each other most effectively. But during cold fusion collisions, a temporary phase of averaged energy density may be formed, which would be effective precisely for the absorption of these photons. The formation of this phase would also be important for explaining the breaking of the Coulomb barrier, so we would have two problems solved at once. In addition, the situation with gamma rays during cold fusion is different, as these photons originate from the centers of atoms, not from outside them. Their energy density would be lowered from the very beginning of their spreading, because they would be formed within already very dense nuclear matter entangled with the electron orbital matter wave. The assumption of superradiance in Hagelstein's theory and the formation of heavy electrons and slow neutrons in the Widom-Larsen theory have their origin right here.


    The QM argument against Hydrinos goes back to other difficulties.

    IMO the truth is somewhere in the middle: the s orbital is neither a thin hollow sphere (as R. Mills draws it) nor a fuzzy sphere with the highest probability density at its center, but something in between: a sphere with a fuzzy hole at its center. When two or more such orbitals collide, a hollow cylinder or waveguide will be formed.

    [Attached images: fo1ZPtH.gif, jTwtyvk.gif, jrPnVU9.gif]
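    For reference, both pictures can be read off from the textbook hydrogen 1s state: the probability density |psi|^2 is largest at the nucleus, while the radial probability density vanishes there and peaks at the Bohr radius a_0:

```latex
\psi_{1s}(r) = \frac{1}{\sqrt{\pi a_0^3}}\, e^{-r/a_0}, \qquad
P(r) = 4\pi r^2\, |\psi_{1s}(r)|^2 = \frac{4 r^2}{a_0^3}\, e^{-2r/a_0},
\qquad P(0) = 0,\quad P_{\max}\ \text{at}\ r = a_0 .
```

    Which of the two densities one plots is what makes the orbital look either centrally peaked or shell-like.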


    there is no "lab rat" experiment that can be easily replicated, with results well above the noise floor

    This is misinformation; such experiments are already there: for example, Prof. Hagelstein and Swartz from MIT sell the NANOR® kit, which makes it possible to start cold fusion experiments with 100% reliability. And there are other simple systems which already provide very good reproducibility (the co-deposition of palladium, for example).