Fact Check, debunking obviously false information

  • I disagree that a magical and unexplained force acting at a distance (Newtonian gravitation) is a physical cause.


    Maybe you got one apple too many on your head...


    A physical cause is by definition something you can measure in reality. As such, gravity is very real. QM is just an engineering framework for mapping potentials. Based on a very good base measurement, you can make an approximate calculation of how unperturbed potentials are distributed. QM alone has no value at all; like sheet music, you need an instrument (the base measurement) to get something useful out of it.


    Unluckily, QM only works for conservative particles/forces due to its restricted math. QM already fails for hydrogen, as it does not model the 1/r³ magnetic "electron potential interaction" that behind the scenes changes the Coulomb potential function.


    QM/QFT never worked for anything except when tons of fudge factors were used. Its famous predictions were based on a few excellent measurements, on which 1 bit × 1 bit × 1 bit permutation rules were applied...


    We still wait for THH to present us the marvelous QM/QFT formula for the up quark, which exists about 10²⁴ times in one gram of hydrogen... Maybe he could start with the down quark, whose mass is known with a little more than 2 bits of precision, which means you can choose one apple out of five...



  • I don't think you appreciate that the tick times - the migration of the electron - reveal that the electron is real before a measurement is made. It is a brutal blow, and you seem not to realize it. When I talked to Brett about this, he expounded that the end result of these conversations is to ignore the data or say it is "faked."


    Of course, if you have an explanation we will listen - but I doubt it.

  • An electron revolving around an atomic nucleus is attracted to the nucleus by the Coulomb force. To separate it from the nucleus and expand the atom, one must supply energy, and the resulting state - an expanded electron orbital - will thus exhibit excited quantum states and higher quantum numbers than the ground state. The expanded orbital gets increasingly spherical and round - it leads to the so-called Rydberg orbital, resembling a thin spherical shell.


    For an electron trapped inside a liquid-helium bubble the situation is exactly the opposite. The surface tension of the bubble exerts strong pressure on the electron, and the electron is forced to resist this pressure by bouncing, i.e. vibrations across the bubble cavity. The smaller the bubble is, the higher its surface tension gets, the more its electron gets squeezed, and the higher the quantum states and node numbers its wave function gets. And the bubble also gets increasingly round and spherical - except that it gets smaller in the process. It resembles Mills' hydrino model - but it is a fully classical quantum system described by a normal wave function - and it also consumes energy for its formation rather than producing it.


    Therefore it's completely normal and fully compliant with quantum wave mechanics that electron bubbles in helium shrink with increasing quantum numbers. Both Mills and those who tried to explain this dependence classically completely missed the opposite geometry of the whole system. But this case also illustrates that when you're expert enough in math, you'll invent seemingly working math for whatever nonsense is thinkable: even for a physical system based on a purely inverted perspective. The contemporaries of Ptolemy and his epicycle model could serve as an iconic example of confused science, Euler also had no problem formulating a bulletproof theory of a hollow Earth, and contemporary cosmologists could tell similar stories about their "expanding Universe" model. Autistic, formally thinking people are particularly susceptible to the inverted-perspective illusion - with which schizophrenics can rarely be fooled.


    Mathematicians often exhibit autistic traits (one can even spot them by the shape of their head 1, 2, 3, 4, 5, 6...), so that they will not only develop formal models more easily - they will also more often confuse them.


    Edited for inappropriate language. Shane

  • Quantum field theory and QM are fun/work for a select few.

    QM texts are said to be convergent and QFT texts are divergent.

    Maybe this indicates paradigm juvenility (thanks B2)...

    or paradigm maturity... or senility?


    As to divergence... QFT has lots of scope for young ideas,

    according to this Gross-chaired discussion in Brussels.

    https://livestream.com/streami…/8742238/videos/193714289

    One rhetorical Q/A from David Gross... for the under-35s... TM 4.27


    Question: What would you like theory to achieve in the next 30-40 yrs?

    Answer: I mean I would like to have been able to calculate the masses of the hadrons.


    The mass of the proton... the mass of the neutron? 2019? QM? QFT?

    Perhaps there are non-QFT theories that can calculate in 2019?


    I am perhaps not as romantic or as alive as Goethe was when he wrote:

    "All theory is grey, my friend, but forever green is the tree of life."


  • Quote

    Perhaps there are non-QFT theories that can calculate in 2019?


    There are many of them in fact - Heim's theory, Kornowski's theory or Nigel B. Cook's theory (apparently the best one). The way in which they're consistently ignored also illustrates the strictly occupationally driven progress of contemporary science: all new ideas and findings are considered and accepted only when they don't threaten existing grants, jobs and their social credit. The densely overcrowded scientific community doesn't make any exceptions from this rule, to the point where we could speak about the definition of scientific ethics.



  • Quote

    This concept could yield chips that continuously produce electrical power with an areal power density that is 3x greater than raw solar irradiation on Earth.


    The rectification of thermal fluctuations, which are present experimentally and analytically, can generate work when combined with an increase in the entropy of quantum information arising from spin transport onto the PM center due to its spin fluctuations. HOW could it do that without violating thermodynamics? XXXXXXXX physicists (who delayed overunity research for decades) are welcome to join the discussion... The ignorance of their censorship might seem innocent at first, but it delayed the progress of human civilization by at least one century, as everyone's fun comes at the price of the rest.


    If you stop using derogatory language, we won't have to "censor". Now please... Shane

  • Unzicker and Gross have a nice discussion Aug, 2016


    External content: www.youtube.com

  • Alexander Unzicker is a German physicist researching "Einstein's (?)" concept of a variable speed of light. In his book, "The Higgs Fake", he states:


    It (the book) is written for the young scholar who wants to dig into the big questions of physics, rather than dealing with a blend of mythology and technology.


    Why do we measure two different values for the radius of the proton? Two Higgs masses? Two values of the Hubble constant? Two lifetimes of the neutron? How the Standard Model actually "calculates" things is illustrated by the recent discussion about the proton radius discrepancy. The radius of the proton is determined with dozens of QED corrections, many of which were chosen arbitrarily, and their experimental errors are still quite large. This is the epicycle approach on steroids - but it's tolerated or even promoted, because just this approach promises as many jobs for its proponents as possible.



    Of course, a more intelligent and far less grant- and job-demanding approach is also possible. We don't need Standard Model parasites for anything useful:


    How Cook calculates the proton and neutron mass. Also the muon mass (with at least 6 effective decimal places of precision). Not bad at all - so why should we pay for theorists who cannot calculate it?

    Self consistent automatic QFT predictions of particle masses

    Derivation of the mass of the proton from Machian principles.

    Derivation of α and particle mass by Randell Mills (via 1, 2)


  • Quantum field theory and QM are fun/work for a select few.

    QM texts are said to be convergent and QFT texts are divergent.


    QFT is the (much more difficult) formulation of quantum mechanics for fields. Why more difficult?


    One way to look at this is that the classical -> QM step turns a particle into a probability distribution with phase and wave-like characteristics.


    That is tough for people who don't want the universe to be different from what we expect, but still manageable. And the maths from it is straightforward, although obviously much more complex than Newtonian (or similar) many body dynamics, because analytically defined forces are replaced by integrals, and a finite number of coordinates is replaced by a continuous wave function.


    So QM would seem to allow many more degrees of freedom although, ironically, what it actually does in any finite system is to reduce the degrees of freedom because those complex wave functions are uniquely determined, up to phase, and the wave boundary conditions force quantisation on what would otherwise be unconstrained position, momentum, etc.
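
    To make that quantisation-from-boundary-conditions point concrete, here is a minimal sketch (my illustration, not part of the original post) of the textbook 1D infinite square well: requiring the wave function to vanish at the walls of an assumed 1 nm wide box leaves only the discrete energies E_n = n²·π²·ħ²/(2·m·L²).

```python
# Minimal sketch: boundary conditions force quantisation in the 1D infinite
# square well. The 1 nm box width is an assumed value for illustration only.
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e  = 9.1093837015e-31  # electron mass, kg
L    = 1e-9              # box width, m (assumed)

def energy_level(n: int) -> float:
    """Energy of the n-th bound state, E_n = (n*pi*hbar)^2 / (2*m*L^2), in joules."""
    return (n * math.pi * hbar) ** 2 / (2 * m_e * L ** 2)

for n in range(1, 4):
    E = energy_level(n)
    print(f"n={n}: E = {E:.3e} J = {E / 1.602176634e-19:.3f} eV")
```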


    That deals with particles, but leaves hanging the question of those fields - representing action at a distance. It is a pretty obvious step to see that fields carry energy and therefore should perhaps be quantised too. A host of detailed experimental results pushed this: the Lamb shift, hyperfine structure, and the departure of g from the value predicted without QFT.


    Actually, though, I'd like - with the enormous benefit of hindsight - to offer an even more compelling reason for wanting QFT. If you quantise fields, it turns out that the very complicated math you need to calculate results can be summarised in beautifully compact and simple Feynman diagrams. And those Feynman diagrams have an even more beautiful and intuitive physical meaning as virtual particle exchanges.


    Intuitively that enormous number of extra degrees of freedom in fields, when quantised, can be visualised as the exchange of particles where the exchange carries momentum and hence conveys force. And finally, particles themselves can be seen as excited states of the underlying quantised fields.


    In 1928, Dirac wrote down a wave equation that described relativistic electrons — the Dirac equation. It had the following important consequences: the spin of an electron is 1/2; the electron g-factor is 2; it led to the correct Sommerfeld formula for the fine structure of the hydrogen atom; and it could be used to derive the Klein-Nishina formula for relativistic Compton scattering. Although the results were fruitful, the theory also apparently implied the existence of negative energy states, which would cause atoms to be unstable, since they could always decay to lower energy states by the emission of radiation.[6]:71–72

    The prevailing view at the time was that the world was composed of two very different ingredients: material particles (such as electrons) and quantum fields (such as photons). Material particles were considered to be eternal, with their physical state described by the probabilities of finding each particle in any given region of space or range of velocities. On the other hand, photons were considered merely the excited states of the underlying quantized electromagnetic field, and could be freely created or destroyed. It was between 1928 and 1930 that Jordan, Eugene Wigner, Heisenberg, Pauli, and Enrico Fermi discovered that material particles could also be seen as excited states of quantum fields. Just as photons are excited states of the quantized electromagnetic field, so each type of particle had its corresponding quantum field: an electron field, a proton field, etc. Given enough energy, it would now be possible to create material particles. Building on this idea, Fermi proposed in 1932 an explanation for β decay known as Fermi's interaction. Atomic nuclei do not contain electrons per se, but in the process of decay, an electron is created out of the surrounding electron field, analogous to the photon created from the surrounding electromagnetic field in the radiative decay of an excited atom.[3]:22-23

    It was realized in 1929 by Dirac and others that negative energy states implied by the Dirac equation could be removed by assuming the existence of particles with the same mass as electrons but opposite electric charge. This not only ensured the stability of atoms, but it was also the first proposal of the existence of antimatter. Indeed, the evidence for positrons was discovered in 1932 by Carl David Anderson in cosmic rays. With enough energy, such as by absorbing a photon, an electron-positron pair could be created, a process called pair production; the reverse process, annihilation, could also occur with the emission of a photon. This showed that particle numbers need not be fixed during an interaction. Historically, however, positrons were at first thought of as "holes" in an infinite electron sea, rather than a new kind of particle, and this theory was referred to as the Dirac hole theory.[6]:72[3]:23 QFT naturally incorporated antiparticles in its formalism.[3]:24


    It is easy to look at details here and find the analogies weird and unappealing. And QFT took a long time to develop because when you try to do it the maths just does not work. You get those infinities, which can be removed by the "trick" of renormalisation or by the mathematically sound and insightful ideas of effective field theory and regularisation. EFT is a very physically intuitive approach which admits limited knowledge, and supposes that current theory is the low-energy approximation of some more complex high-energy theory. Regularisation then removes the singularities that would actually be removed by the more complicated and detailed description. I remember in the 1970s being profoundly unhappy about renormalisation as then taught, even though it worked!


    QFT is not a single theory - it encompasses whatever particles and symmetry groups you like to throw at it. So it is the (quantum) foundation for all of the feasible modern ideas about subatomic physics. Its development, together with the discovery of the Standard Model particles, has been the most profound achievement of physics because it so precisely interlocks experimental and theoretical results. Without experiment, we would not know which of the many variants of theory are possible. For example, we would certainly still be liking the idea of supersymmetry - so neat - if it had not been for the LHC. In fact that result (now nearly established) is the key new-physics outcome from the LHC, since supersymmetry was previously the preferred way to go. It is negative, but cutting off that possibility will allow others to develop.


    On the theoretical side, the vector bosons, predicted by electroweak unification with precise properties, would never have been found without the Standard Model, because we would not know where in that mass of data to look.


    Against this I have no sympathy for semi-classical theories that are not rooted in QFT. I have some sympathy for different models of the bits of the Standard Model that are least well explored experimentally - quarks and gluons - and the Higgs. Quarks and gluons are not so well explored because color confinement means quarks cannot be pulled apart. They - up and down quarks, which are relatively stable - can be observed in a nucleus from scattering data but not seen outside it. That of course leaves open some more complex model for quarks and their interactions, but it must still predict point-like up and down quarks, since these are observed. The Higgs particle (and the underlying Higgs field) is not well understood. Given that this field gives all other particles their masses through coupling, there may be very important new physics here - and there is certainly some new physics.


    The numerological coincidences tabulated by Cook (table; see Zephyr's post above for a copy of it) are fun because they fill this gap. It is possible that rest masses of particles can be revealed by some new-physics mechanism - just as it is also possible that they will end up (unsatisfactorily) being the random results of broken symmetry from some high-energy new physics.


    Cook's paper has some way out ideas, which I won't comment on, and is experimentally rooted in a set of particle mass approximations. Humans are good at making patterns out of random data, so we need guidance from statistics to work out whether the tabulated values are close enough to the experimental values for this to be more than data selection.


    To answer this we need to ask - OK - what is selected. From the table we see a key equation:

    M ≈ 35·n·(N+1) MeV (derived as M = m_e·n·(N+1)/(2α)).


    Here n = number of elementary particles (leptons) constituting the particle. N = a magic number chosen to make the mass fit.


    The table shows the leptons (3), the mesons (3 classes), and the baryons (4 classes).

    The electron (one of the leptons) does not fit this numerological formula and so instead of 10 close fits, we have 9 correct close fits, and one miss.

    The close fits vary from approximately +3% to -3% (a 6% range). The choice of N gives us the ability to match - more so at high N, where the heaviest particle, N = 50, would automatically match to within +1% to -1% just by choosing N.
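
    As a rough sanity check of how tight those fits are, here is a small sketch (mine, not Cook's own code); the formula is the one above, the (n, N) assignments are assumptions made for illustration, and the measured masses are the familiar PDG values in MeV.

```python
# Illustrative check of the rule M = m_e * n * (N + 1) / (2 * alpha) ~ 35*n*(N+1) MeV.
# The (n, N) assignments below are assumed for illustration, not taken from Cook's table.
M_E   = 0.51099895     # electron mass, MeV
ALPHA = 1 / 137.036    # fine-structure constant

def rule_mass(n: int, N: int) -> float:
    """Mass predicted by the tabulated rule, in MeV."""
    return M_E * n * (N + 1) / (2 * ALPHA)

candidates = [
    ("muon",    1, 2, 105.66),   # lepton,  assumed n=1, N=2
    ("pion+",   2, 1, 139.57),   # meson,   assumed n=2, N=1
    ("proton",  3, 8, 938.27),   # baryon,  assumed n=3, N=8
    ("neutron", 3, 8, 939.57),
]

for name, n, N, measured in candidates:
    predicted = rule_mass(n, N)
    err = 100 * (predicted - measured) / measured
    print(f"{name:8s} predicted {predicted:7.1f} MeV, measured {measured:7.2f} MeV ({err:+.1f}%)")
```

    With these assumed assignments the rule lands within about 1% of the measured values, which is the sort of closeness the table claims; the statistical question is whether the freedom in choosing N makes that unimpressive.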


    A lot more analysis could be done, but based just on the bare bones of this table, it does not seem great. However there may well be other symmetries exposed here - and perhaps ones predicted by SM.


    For example, the division of all particles into n = 1,2,3 would lead to rather obvious double or triple ratios. Unfortunately this does not work because each row (constant N) has at most one column (n).


    I'm also not certain that all particles are fitted into this table. Clearly the vector bosons are left out, and neutrinos. That leaves this as possible cherry-picking.


    Hadron masses can be correlated in a kind of periodic table summarized by the expression M = m·n·(N + 1)/(2α) = 35·n·(N + 1) MeV, where m is the mass of an electron, α = 1/137.036, n is the number of particles in the isolatable particle (n = 2 quarks for mesons, and n = 3 quarks for baryons), and N is the number of massive field quanta (Z bosons formed by annihilation of charged virtual fermions) which give the particle its mass. The particle is a lepton or a pair or triplet of quarks surrounded by shells of massive field quanta which couple to the charges and give them mass; then the number of massive particles which have a highly stable structure might be expected to correspond to well-known "magic numbers" of closed nucleon shells in nuclear physics: N = 1, 2, 8 and 50 have relative stability.


    So we have here:


    A claim that the values of N are the only ones possible for stability, because N = 1, 2, 8, 50 have relative stability as nucleon shells in nuclear physics.

    My googling shows 2, 8, 20, 28, 50, 82, and 126 as magic number shells. What happened to 20 and 28?

    It is also not clear how this number, related to nucleons, can appear in this equation. You would need something behaving like nuclear shells - Cook claims crystallised virtual particle pairs, which I sort of get - magically scaled so that the total mass is multiplied by the number of constituent leptons, which I don't quite get.


    Well, none of this can be ruled out (by me, anyway) since we do not understand the Higgs, but it is not strong, and also not properly explanatory. No probabilities.


    I quite like Cook's ideas in this paper. They are, at least superficially, rooted in QFT. They do not contradict the SM, and therefore are not obviously unrealistic, and they try to explain things not otherwise explained. However, the "extra sauce" they give - the particle masses - is, judging from the analysis above, relatively weak evidence. So keep an eye on these ideas, some part of which might have some merit, but I don't see any good experimental evidence favouring them yet, nor do I see any predictions that separate them from the SM other than the undiscovered particles - which appears a minus, since they should surely have been found, being relatively low energy. I should also say that someone with a deeper understanding of particle physics than me might well dislike these ideas more.


    THH

  • Re Zephyr edited "mathematicians develop formal models easier, but often confuse them"


    Well: non-mathematicians perhaps don't confuse formal models. It does not much matter, since they most certainly cannot understand them when, as with most of physics, they are mathematical!

  • PS - earlier JohnDuffield seemed to think I was avoiding his references. If anyone else thinks I'm bottling out of replying directly to specific technical arguments here that are claimed to be contrary to the Standard Model (rather than compatible extensions), please bring them to my attention.


    It would be nice to find something different from the SM that is as explanatory of known physics and makes new interesting predictions. I've not seen it here.

  • Zephyr: from the research paper related to your link


    To complement this heat description of our work, we briefly note in the Supplementary Note 10 that the rectification of thermal fluctuations, which are present experimentally and analytically, can generate work when combined with an increase in the entropy of quantum information 11 arising from spin transport onto the PM center due to its spin fluctuations. A quantum thermodynamical theory along a similar spintronic path has been proposed 8, while classical electronic implementations using capacitively coupled quantum dots have been demonstrated at low temperature 3,5,63.


    So to answer your question: correct, still no breaking of 2LoT but you need to include the entropy of quantum information in the budget.

    from [11] above: Work and information processing in a solvable model of Maxwell’s demon


    A system in thermal equilibrium undergoes random microscopic fluctuations, and it is tempting to speculate that an ingeniously designed device could deliver useful work by rectifying these fluctuations. The suspicion that this would violate the second law of thermodynamics has inspired nearly 150 years of provocative thought experiments (1-5), leading to discussions of the thermodynamic implications of information processing (6-12). Although both Maxwell (1) and Szilard (3) famously took the rectifying agent to be an intelligent being, later analyses have explored the feasibility of a fully mechanical "demon". There has emerged a kind of consensus, based largely on the works of Landauer (6) and Bennett (7, 8), and independently Penrose (13), according to which a mechanical demon can indeed deliver work by rectifying fluctuations, but in doing so it gathers information that must be written to physical memory. The eventual erasure of this information carries a thermodynamic cost, no less than k_B T ln 2 per bit (Landauer's principle), which eliminates any gains obtained from the rectification of fluctuations.

    TANSTAAFL
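
    For scale, here is a quick numeric illustration (mine, not from the quoted paper) of that Landauer bound: erasing one bit costs at least k_B·T·ln 2, a tiny but strictly non-zero amount of energy, which is what closes the demon's loophole.

```python
# Landauer bound: minimum erasure cost per bit is k_B * T * ln(2).
# Room temperature (300 K) is an assumed value for illustration.
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T   = 300.0          # assumed temperature, K

cost_per_bit = K_B * T * math.log(2)   # joules per erased bit
print(f"Landauer cost at {T:.0f} K: {cost_per_bit:.3e} J per bit")
print(f"  = {cost_per_bit / 1.602176634e-19 * 1000:.2f} meV per bit")
print(f"Erasing 1 GB (8e9 bits): {cost_per_bit * 8e9:.3e} J")
```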

  • Quote

    include the entropy of quantum information in the budget.


    Classical thermodynamics doesn't recognize anything like "quantum information" or "quantum entropy" - just "information".

    Apparently quantum mechanics doesn't fully play along with the classical laws of thermodynamics. Maybe it's time to reformulate them, as already happened in the case of the uncertainty principle.


    We can imagine space-time like a water surface: every piece of information always comes in two parallel ways, in the form of longitudinal (underwater) waves and surface waves, which are of a transverse nature. In the real case, the surface waves are always of mixed character (which physicists call Rayleigh or Love waves, depending on whether the longitudinal or transverse character of the wave prevails). Despite the weakness of the underwater waves, this results in a quantum uncertainty of every piece of information, which comes to an observer at the water surface in two independent ways: via surface and underwater waves.


    Only a point-like observer can see all things from the extrinsic (outer) perspective only. Because every real observer is of finite size and suffers from quantum delocalization, he can observe the same effect or artifact both from the inside and from the outside perspective. The mixture of both these perspectives results in an intrinsic uncertainty of every observation of reality. Note that quantum mechanics is strictly based on the extrinsic perspective, in a similar way to how relativity is based on the intrinsic one. But the pilot wave extends every observable object and event both in time and in space and makes its consequences a bit more classical and intrinsic.


    Special relativity has the opposite problem once intrinsic observations become just a bit more extrinsic and scalar-component dependent. Evanescent (hyperdimensional) waves in Maxwell's theory lead to superluminal effects (Günter Nimtz and the Scharnhorst effect).

  • How Cook derives his N number is described, for example, here. From particle-collider experiments, we know there must be a 3-fold microstructure within the proton. Gell-Mann used that 3-fold structure discovery to construct a model of quarks of 3 different flavors to describe it. Of course this is just a qualified guess, but these numbers are small natural integers. They're in no way a less qualified guess than, for example, the number of extra dimensions considered by string theorists.



  • Zephyr: Why do we measure two different values for the radius of the proton?


    Because you can measure either the electric effect or the magnetic effect.


    In the proton we have the 3D/4D rotational mass structure that primarily couples at the magnetic radius (the 4D radius), whereas the whole proton structure couples at the 5D (5 rotations) charge radius - which is not the radius of the magnetic moment.


    The measured 3D,t charge radius is the magnetic moment radius (as calculated from the moment by SO(4) physics).


    The confusion in current temple physics stems from the fact that after 90 years of QED/QFT these folks still have no clue about the internal proton/neutron structure. Thus it's up to their fantasy what they measure.


    Unluckily the quark (model) went into a cake that is still in the oven and needs some more centuries to bake...

  • Classical thermodynamics doesn't recognize anything like "quantum information" or "quantum entropy" - just "information".

    Apparently quantum mechanics doesn't fully play along with the classical laws of thermodynamics.


    Exactly: 2LoT is about (in the most general sense) information. Classical entropy is one example of this. Information - as stored in a memory - is another. It is beautiful and a very precise explanation of why all the proposed Maxwell's demons do not work. And not a surprise, because 2LoT is really just saying that the universe goes from more ordered to less ordered states, which is probabilistically so likely as to be effectively certain.
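
    To make the "probabilistically so likely" part concrete, here is a tiny counting sketch (my illustration, under assumed particle counts): for N two-state particles, the evenly mixed macrostate has vastly more microstates than the fully ordered one, and the ratio explodes as N grows.

```python
# Counting microstates: an "all in one state" macrostate has exactly 1 microstate,
# while the 50/50 macrostate has C(N, N/2) of them. Particle counts are illustrative.
from math import comb

for N in (10, 100, 1000):
    ordered = 1                  # fully ordered: a single microstate
    mixed   = comb(N, N // 2)    # evenly mixed: the most probable macrostate
    print(f"N={N:5d}: mixed/ordered microstate ratio = {float(mixed) / ordered:.3e}")
```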


    There is no requirement for anyone to understand this: but anyone who works through the details - and I'll happily argue these - will end up understanding it.


    THH

  • Quote

    because 2LoT is really just saying that the universe goes from more ordered to less ordered states,


    Why do crystals form spontaneously during cooling, then? Why do massive objects condense from random clouds into spherical objects?

    It's nice to believe in laws, but even better is to open one's eyes and not live in illusions.


  • From my link posted before https://www.sciencealert.com/a…-closer-to-a-solid-answer


    More details: https://www.quantamagazine.org…e-and-hope-dies-20190911/


    Randolf Pohl of the Max Planck Institute of Quantum Optics and collaborators had measured the proton using special hydrogen atoms in which the electron that normally orbits the proton was replaced by a muon, a particle that’s identical to the electron but 207 times heavier. Pohl’s team found the muon-orbited protons to be 0.84 femtometers in radius — 4% smaller than those in regular hydrogen, according to the average of more than two dozen earlier measurements.

    If the discrepancy was real, meaning protons really shrink in the presence of muons, this would imply unknown physical interactions between protons and muons — a fundamental discovery. Hundreds of papers speculating about the possibility have been written in the near-decade since.

    But hopes that the “proton radius puzzle” would upend particle physics and reveal new laws of nature have now been dashed by a new measurement reported on Sept. 6 in Science.


    After Pohl’s muonic hydrogen result nine years ago, a team of physicists led by Eric Hessels of York University in Toronto set out to remeasure the proton in regular, “electronic” hydrogen. Finally, the results are in: Hessels and company have pegged the proton’s radius at 0.833 femtometers, give or take 0.01, a measurement exactly consistent with Pohl’s value. Both measurements are more precise than earlier attempts, and they suggest that the proton does not change size depending on context; rather, the old measurements using electronic hydrogen were wrong.

    Pohl, who first heard about Hessels’ preliminary finding at a workshop in the summer of 2018, called it “a fantastic result,” albeit one that “points to the most mundane explanation” of the proton radius puzzle.

    Similarly, Hessels said he and his colleagues were very pleased that their measurement “agreed with the very accurate measurement in muonic hydrogen,” even if the result is somewhat bittersweet. “We know that we don’t understand all the laws of physics yet,” he said, “so we have to chase down all of these things that might give us hints.”


    The harsh reality is that the proton radius is extremely hard to measure, making such a measurement error-prone. It’s especially tough in the typical case where a proton is orbited by an electron, as in a regular hydrogen atom. Numerous groups have attempted this measurement over many decades; their average value for the proton radius is just shy of 0.88 femtometers. But Pohl’s group, seeking greater precision, set out in 1998 to measure the proton radius in “muonic hydrogen,” since the muon’s heft makes the proton’s size easier to probe. Twelve years later, the scientists reported in Nature a value for the proton radius that was far more precise than any single previous measurement using regular hydrogen, but which, at 0.84 femtometers, fell stunningly short of the average.

    The question is: Were all the measurements using regular hydrogen simply off — all accidentally too large? When I first corresponded with Pohl in 2013, the year he and his colleagues reported an updated muonic hydrogen measurement in Science, he emailed me a plot showing how, historically, measurements of physical constants have often drifted dramatically as techniques change and improve before converging on their correct values. “Quite instructive, no?” Pohl wrote. He was keeping things in perspective.


    Bottom line:

    • When muonic hydrogen measurements appeared to show a 5-sigma difference from electronic hydrogen measurements of the proton size, that was considered a big deal: the SM says that muonic and electronic hydrogen must have protons of the same size.
    • Now that more accurate electronic hydrogen experiments also show a smaller size, consistent with muonic hydrogen, we know that the original, less precise measurements were simply off (a rough numerical sketch follows below).
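
    A back-of-envelope sketch (mine, using only the numbers quoted above) of how far apart the old and new values sit; the uncertainty assigned to the old electronic-hydrogen average is an assumption for illustration.

```python
# Rough comparison of proton-radius values from the article quoted above.
# The +/-0.01 fm error on the old electronic-hydrogen average is ASSUMED here.
old_avg, old_err = 0.88, 0.01    # fm; "just shy of 0.88 fm", error assumed
new_eH,  new_err = 0.833, 0.010  # fm; Hessels et al., electronic hydrogen
muonic           = 0.84          # fm; Pohl's muonic hydrogen result

sigma_gap = abs(old_avg - new_eH) / (old_err**2 + new_err**2) ** 0.5
print(f"Old average vs new electronic H: {sigma_gap:.1f} sigma apart")
print(f"New electronic H vs muonic H: {abs(new_eH - muonic):.3f} fm difference, "
      f"i.e. {abs(new_eH - muonic) / new_err:.1f} of the quoted uncertainty")
```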
