Posts by Jarek

    I am not aware of any evidence for violation of energy or momentum conservation - they are at the heart of the Lagrangian mechanics we successfully use from QFT to GRT.

    Regarding virtual particles - they are used in Feynman diagrams in the perturbative approximation of QFT. Assuming QFT is fundamental, perturbative QFT/Feynman diagrams are still an effective picture - a practical approximation ... leading to countless divergences, usually removed by hand.

    But it is an extremely universal practical tool, defining objects as point particles through their interactions - a very general algebra of particle-like objects ... like an algebra properly concluding that "apple + apple = two apples" without any insight into what an apple is ... and it can also handle non-point objects like fields, by approximating them with a series of virtual particles.

    The basic example is the Coulomb interaction, e.g. proton - electron, which in perturbative (approximation of) QFT is handled with a series of point-like photons, instead of a continuous EM field.

    This brought the dangerous common misconception that the EM field is always quantized, while it is just a continuous field, whose optical photons are quantized due to discrete atomic energy levels ... but e.g. a linear antenna produces cylindrically symmetric EM radiation - whose energy density drops like 1/r to 0 - it cannot be quantized into individual discrete photons localizing finite portions of energy.

    We have lots of quasi-particles, especially in solid state physics - starting with phonons: classically just Fourier/normal modes of the lattice, but perturbative QFT treats them as real particles ... point-like.
    There is also virtual pair creation - while we imagine pair creation as a zero-one process, it is in fact continuous - the field can perform a tiny step toward pair creation, represented as real (virtual) pair creation in perturbative QFT. The continuity of this process is nicely seen when using topological charge as the charge:

    A particle like the electron is more than just a wave packet - it is, among other things, a stable localized (nearly singular) configuration of electric and magnetic field:


    It doesn't lose these localized properties when approaching a proton to form an atom - becoming the huge probability cloud of a quantum orbital - this is a proper but only effective description, averaging over some hidden dynamics.

    They can perform real acrobatics on the magnetic dipoles of these electrons, like Larmor precession or even spin echo:…on_paramagnetic_resonance


    The coupled wave created by the internal clock of the electron (de Broglie's, zitterbewegung, experimental confirmation: ) has to become a standing wave to minimize the energy of the atom - described by Schrödinger's equation, giving the quantization condition. It is nicely seen in Couder's walking droplet quantization, with nice videos.

    If LENR is possible, there needs to be a non-thermal way to overcome the huge Coulomb barrier between nuclei - the only mechanism I could imagine is a localized electron staying between the two nuclei due to attraction - screening their repulsion.

    QM is profoundly different from local models. You cannot get out of this, and it comes from experiment not theory.

    Of course the idea of locality, which we are fixated on, does not apply naturally in a quantum domain. That has profound consequences for the structure of spacetime - it is just that we have as yet not properly worked out the connections!

    You are saying that when a proton and an electron are far apart they are "classical" corpuscular ... but when they meet they become "quantum" wave-like ... so at which moment/distance does this switch happen?

    Where exactly is the classical-quantum boundary?

    Do we really need such a switch? - maybe they are both at the same time. Like in the popular Couder's walking droplets with wave-particle duality (Veritasium video with 2.5M views, great webpage with materials and videos, a lecture by Couder, my slides also with other hydrodynamical analogues: Casimir, Aharonov-Bohm). Among others, they claim to recreate:

    1. Interference in particle statistics of the double-slit experiment (PRL 2006) - the corpuscle travels one path, but its "pilot wave" travels all paths - affecting the trajectory of the corpuscle (measured by detectors).
    2. Unpredictable tunneling (PRL 2009) due to a complicated state of the field ("memory") depending on the history - they observe an exponential drop of the probability of crossing a barrier with its width.
    3. Landau orbit quantization (PNAS 2010) - using rotation and the Coriolis force as analogs of magnetic field and Lorentz force (Michael Berry 1980). The intuition is that the clock has to find a resonance with the field to make it a standing wave (e.g. described by Schrödinger's equation).
    4. Zeeman-like level splitting (PRL 2012) - quantized orbits split proportionally to the applied rotation speed (with sign).
    5. Double quantization in harmonic potential (Nature 2014) - separately of both radius (instead of the standard: energy) and angular momentum. E.g. the n=2 state switches between an m=2 oval and an m=0 lemniscate of zero angular momentum.
    6. Recreating an eigenstate from statistics of a walker's trajectories (PRE 2013).

    This way QM is just one of two perspectives/descriptions of the same system, as we already had for coupled oscillators, or their lattices: crystals, which can be described classically or through normal/Fourier modes - treated as real particles in QFT ...
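    These two equivalent descriptions can be illustrated with a minimal sketch (the spring constants below are purely illustrative, not tied to any specific material): two coupled oscillators are described either by their positions, or by the normal modes obtained from diagonalizing the stiffness matrix - the same step that turns lattice vibrations into particle-like phonons.

```python
# Two coupled oscillators: classical positions vs. normal modes.
# Diagonalizing the stiffness matrix yields independent "particle-like"
# modes - the step that turns lattice vibrations into phonons.
import numpy as np

k, kc = 1.0, 0.2                  # spring constant and coupling (illustrative)
K = np.array([[k + kc, -kc],
              [-kc, k + kc]])     # coupled stiffness matrix
w2, modes = np.linalg.eigh(K)     # squared frequencies and normal-mode vectors
freqs = np.sqrt(w2)               # two decoupled mode frequencies
```

    The symmetric mode (both masses moving together) keeps the bare frequency, while the antisymmetric one is stiffened by the coupling.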


    The main problem with discussing the dynamics of electrons below the probability clouds is the general belief that violation of Bell inequalities forbids using such local and realistic models.

    While the original Bell inequality might leave some hope for violation, here is one which seems completely impossible to violate - for three binary variables A, B, C:

    Pr(A=B) + Pr(A=C) + Pr(B=C) >= 1

    It has an obvious intuitive proof: drawing three coins, at least two of them need to give the same value.

    Alternatively, choosing any probability distribution pABC among the 2^3 = 8 possibilities, we have:

    Pr(A=B) = p000 + p001 + p110 + p111 ...

    Pr(A=B) + Pr(A=C) + Pr(B=C) = 1 + 2 p000 + 2 p111
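    This identity, and hence the inequality, can be verified with a brute-force sketch over random distributions (the enumeration below is purely illustrative):

```python
# Check: for ANY distribution p_ABC over the 8 outcomes of three binary
# variables, Pr(A=B) + Pr(A=C) + Pr(B=C) = 1 + 2*p000 + 2*p111 >= 1.
import itertools
import random

def agreement_sum(p):
    """Sum of the three pairwise-agreement probabilities."""
    pr_ab = sum(q for (a, b, c), q in p.items() if a == b)
    pr_ac = sum(q for (a, b, c), q in p.items() if a == c)
    pr_bc = sum(q for (a, b, c), q in p.items() if b == c)
    return pr_ab + pr_ac + pr_bc

outcomes = list(itertools.product((0, 1), repeat=3))
for _ in range(10000):
    w = [random.random() for _ in outcomes]
    total = sum(w)
    p = {abc: wi / total for abc, wi in zip(outcomes, w)}
    s = agreement_sum(p)
    # the identity from the post, and the resulting inequality
    assert abs(s - (1 + 2 * p[(0, 0, 0)] + 2 * p[(1, 1, 1)])) < 1e-9
    assert s >= 1 - 1e-9
```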

    ... however, it is violated in QM, see e.g. page 9 here:…ill/ph229/notes/chap4.pdf

    If we want to understand why our physics violates Bell inequalities, the above one seems the best to work on - the simplest, with an absolutely obvious proof.

    QM uses the Born rule for this violation:

    1) Intuitively: the probability of a union of disjoint events is the sum of their probabilities: pAB? = pAB0 + pAB1, leading to the above inequality.

    2) Born rule: the probability of a union of disjoint events is proportional to the square of the sum of their amplitudes: pAB? ~ (psiAB0 + psiAB1)^2

    The Born rule allows this inequality to be violated down to 3/5 < 1, using psi000 = psi111 = 0 and psi001 = psi010 = psi011 = psi100 = psi101 = psi110 > 0.
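    The 3/5 figure can be reproduced directly - a small sketch of the amplitude assignment above, with all six nonzero amplitudes set to 1:

```python
# Born-rule violation of Pr(A=B)+Pr(A=C)+Pr(B=C) >= 1 down to 3/5:
# the probability that a measured pair agrees is proportional to the
# squared SUM of amplitudes over the unmeasured variable: pAB? ~ (psiAB0+psiAB1)^2.
import itertools

psi = {abc: 0.0 if abc in ((0, 0, 0), (1, 1, 1)) else 1.0
       for abc in itertools.product((0, 1), repeat=3)}

def born_pr_equal(pair):
    """Probability that the two measured variables (positions in `pair`) agree."""
    free = ({0, 1, 2} - set(pair)).pop()     # the unmeasured position
    probs = {}
    for xy in itertools.product((0, 1), repeat=2):
        abc = [0, 0, 0]
        amp = 0.0
        for z in (0, 1):                     # sum amplitudes over the free variable
            abc[pair[0]], abc[pair[1]], abc[free] = xy[0], xy[1], z
            amp += psi[tuple(abc)]
        probs[xy] = amp ** 2                 # Born rule: square the summed amplitude
    norm = sum(probs.values())
    return (probs[(0, 0)] + probs[(1, 1)]) / norm

total = sum(born_pr_equal(p) for p in ((0, 1), (0, 2), (1, 2)))
print(total)  # ~0.6 = 3/5 < 1, violating the inequality
```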

    I have just refreshed my paper, adding section III about violation of this inequality using an ensemble of trajectories: proper statistical physics shouldn't see particles as just points, but rather as their trajectories, e.g. to consider a Boltzmann ensemble - as in Feynman's Euclidean path integrals, or their thermodynamical analogue: MERW (Maximal Entropy Random Walk: ).

    For example, looking at the [0,1] infinite potential well, a standard random walk predicts a uniform probability density rho = 1, while QM and the uniform ensemble of trajectories predict a different rho ~ sin^2 with localization, and the square, as in the Born rule, has a clear interpretation:
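    A sketch of this claim, discretizing the [0,1] well as a path graph (the N = 50 resolution is an arbitrary choice): for MERW the stationary density is the squared dominant eigenvector of the adjacency matrix, which here is sin-shaped - so rho ~ sin^2 as in the quantum ground state, instead of the uniform density of a standard random walk.

```python
# MERW stationary density on a discretized [0,1] infinite potential well.
import numpy as np

N = 50
A = np.zeros((N, N))
for i in range(N - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0   # path graph: nearest-neighbor hops only

vals, vecs = np.linalg.eigh(A)
psi = np.abs(vecs[:, -1])             # dominant (Perron) eigenvector
rho = psi ** 2 / np.sum(psi ** 2)     # MERW stationary density: square, as in Born rule

expected = np.sin(np.pi * np.arange(1, N + 1) / (N + 1)) ** 2
expected /= expected.sum()
assert np.allclose(rho, expected)     # sin^2 localization, not uniform
```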


    Considering ensembles (uniform, Boltzmann) of paths also allows violating Bell in a similar way as QM (through the Born rule) - this is a realistic model, and in fact required if we e.g. think of general relativity: where we need to consider the entire spacetime, particles are their paths.

    It is not local in the "evolving 3D" picture, but it is local in the 4D spacetime/Einstein's block universe view - where particles are their trajectories, and ensembles of such objects are what we should consider.

    511 keV is just the rest mass of the electron - required e.g. to build it from photons (EM waves) during pair creation.

    Hence, energy conservation doesn't allow the energy of the electric field of the electron to exceed 511 keV.

    However, the naive E ~ 1/r^2 assumption for a point charge gives infinite energy if integrating from r = 0.

    We would get 511 keV from the energy of the E ~ 1/r^2 electric field if integrating from r ~ 1.4 fm.

    Hence, energy conservation alone requires some femtometer-scale modification of the E ~ 1/r^2 electric field around the electron.
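    The 1.4 fm figure can be checked directly: the Coulomb field energy outside radius r0 is U = e^2/(8 pi eps0 r0), and setting U equal to the 511 keV rest energy gives r0 of about 1.4 fm (half the classical electron radius). A quick sketch using CODATA constants:

```python
# Radius at which the integrated E ~ 1/r^2 field energy reaches 511 keV.
# Energy density eps0*E^2/2 integrated over r > r0 gives U = e^2/(8*pi*eps0*r0).
from scipy.constants import e, epsilon_0, m_e, c, pi

U_rest = m_e * c ** 2                        # electron rest energy (511 keV, in joules)
r0 = e ** 2 / (8 * pi * epsilon_0 * U_rest)  # solve U(r0) = U_rest for r0
print(r0 * 1e15)                             # ~1.41 (femtometers)
```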

    Is there experimental evidence forbidding deformation/regularization at this scale? (it doesn't require e.g. 3 smaller fermions or an electric dipole)

    Exactly, as this Feynman lecture says: "the dependence on E is uniquely determined by dimensional analysis", getting sigma ~ 1/E^2.

    This is the line I was referring to.

    We are interested in the size of a non-Lorentz-contracted electron, so we need to extrapolate this sigma ~ 1/E^2 line to the energy of a non-Lorentz-contracted electron, getting sigma ~ 100 mb, or a ~2 fm size.

    I assume you mean electron size at the lowest possible speed?

    By resting I just meant not Lorentz contracted (gamma = 1, v = 0), which affects the size we are interested in.

    @ THHuxleynew,

    Regarding requirements for electron size, the energy of the E ~ 1/r^2 electric field is infinite if integrating from r = 0.

    In pair creation the field of the electron-positron pair is created from 2 x 511 keV of EM radiation - the energy of the electric field cannot exceed 511 keV - for this purpose we would need to integrate from r ~ 1.4 fm instead of 0.

    So energy conservation requires deforming the electric field at the femtometer scale - no 3 smaller fermions as Dehmelt writes, no electric dipole, just a deformation of the E ~ 1/r^2 electric field so as not to exceed 511 keV of energy.

    Regarding collision evidence, I have put a plot of the GeV-scale electron-positron collision cross-section in my previous post.

    Interpreting it, we need to keep in mind that there is enormous Lorentz contraction there, e.g. gamma ~ 1000 for 1 GeV.

    We are not interested in the size of a 1000-fold contracted electron, but of a resting electron (gamma = 1).

    Extrapolating the line (no resonances) in that plot to a gamma = 1 resting electron, you get a ~100 mb cross-section, corresponding to r ~ 2 fm.
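    The conversion from the extrapolated ~100 mb to the ~2 fm radius is just a naive hard-sphere reading of the cross-section, sigma = pi * r^2 (a sketch, not a rigorous interpretation):

```python
# Naive geometric radius from the ~100 mb extrapolated cross-section.
import math

sigma = 100e-3 * 1e-28          # 100 millibarn in m^2 (1 barn = 1e-28 m^2)
r = math.sqrt(sigma / math.pi)  # hard-sphere reading: sigma = pi * r^2
print(r * 1e15)                 # ~1.8 (femtometers), i.e. the ~2 fm scale
```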

    See the discussion exactly about this:…ies-for-size-of-electron/


    In the double-slit experiment we have material built of ~10^-10 m size atoms - huge compared to e.g. the ~10^-15 m scale deformation of the electric field required not to exceed 511 keV.
    Maybe the behavior of positronium would allow for some bounds on the size of the electron?

    All we can say is that no evidence of finite size has yet been seen, and that bounds the size to 10^-19 m or so.

    Please elaborate - I couldn't find any details in this link ... nor previously, while looking through the literature for a few days ... or trying to discuss it in a few forums.

    Sure, Wikipedia points to Dehmelt's 1988 paper with e.g. 10^-22 m (…031-8949/1988/T22/016/pdf ) - figure on the left below.

    As we can see, he took two particles composed of 3 fermions (proton and triton) and fitted a parabola to these two points (!) - to get r = 0 at g = 0 for an electron built of three smaller fermions ... which allowed him to conclude these tiny sizes ... but it's a "proof" by assuming the thesis.

    Additionally, while the g-factor is said to be 1 classically, that assumes equal density of mass and charge - without this assumption we can get any g by changing the mass/charge distribution - see the formula in one of these links with longer explanations, e.g.…ies-for-size-of-electron/


    On the right we can see the cross-section for electron-positron collisions.

    Naively we would like to interpret the cross-section as the area of the particle - the question is: the cross-section at which energy should we use?

    As there is Lorentz contraction affecting the collision, and we are interested in the size of a resting electron, we should extrapolate the line without resonances (sigma ~ 1/E^2) to a gamma = 1 resting electron ... this way we get r ~ 2 fm for the electron.

    Can you defend Dehmelt's fitting of a parabola to two points?

    Or maybe you know some other experimental evidence for e.g. this 10^-19 m radius?

    It is crucial to understand the size of the electron here - while "everybody knows that the electron is point-like", I have spent a few days trying to find experimental evidence bounding its size ... and found literally nothing.

    The usually referenced paper is Dehmelt's, extrapolating from the g-factor by fitting a parabola to two points (seriously!), both composed of 3 fermions (proton and triton), to conclude for an electron composed of 3 fermions.

    The cross-section for electron-positron collisions can be naively interpreted as the area of the particle. The question is: the cross-section at which energy should we use? As we are interested in the size of a resting electron, to remove the Lorentz contraction we should extrapolate to gamma = 1, but this suggests a huge ~2 fm radius.

    Some discussions with images and formulas:…ies-for-size-of-electron/


    Is there a single experiment really bounding the size of the electron?

    The most important counterargument against considering the dynamics of electrons, whose assistance seems crucial for fusion, is the Bell theorem.

    I have recently rewritten my paper about the connection between QM and MERW ( ) - repairing diffusion to be in agreement with the (Jaynes) entropy maximization principle also repairs the disagreements between predictions of stochastic models and QM, like the stationary probability distribution being exactly as in the quantum ground state.

    Completely rewritten fresh version:

    MERW also turns out to have the Born rule: amplitudes describe the probability distribution toward both the past and the future half-spacetime; to randomly get some value we need to "draw it" from both time directions, hence probability is the (normalized) square of the amplitude.

    In this fresh paper I yesterday added a simple derivation of Bell inequalities and their violation for the Born rule (also in MERW) - this picture contains a complete proof of Bell's theorem:


    Top: assuming some probability distribution among the 8 possibilities, we always get the above inequality.

    Bottom: an example of its violation using just the quantum Born rule: probability is the normalized square of the amplitude.

    As the LENR community seems more open and farsighted, I have decided to try to bring attention to this generally ignored off-topic subject, which potentially has enormous consequences: applying physics' P-symmetry to biology - synthesizing living mirror versions of organisms:

    Recently a race has started for the synthesis of such a mirror cell: built with mirror versions of molecules (called enantiomers) - in 2016 Chinese researchers synthesized a mirror polymerase, see e.g. Nature's "Mirror-image enzyme copies looking-glass DNA, Synthetic polymerase is a small step along the way to mirrored life forms":…looking-glass-dna-1.19918

    The direct economic motivation is that such organisms would allow mass production of mirror versions of standard biomolecules (like new drugs) - known examples are e.g. aptamers, or L-glucose (a perfect sweetener). Further possible applications come from the fact that such organisms would be incompatible with most of our pathogens; for example, a mirror-human would be immune to most diseases.

    However, here is a 2010 WIRED article, "Mirror-Image Cells Could Transform Science — or Kill Us All":

    Yes, there are also extreme potential dangers, starting with the toxicity of the metabolites of such mirror cells. But the "KILL US ALL" part comes from the potential for ecological disaster, e.g. from introducing mirror (photosynthesizing) cyanobacteria into our environment - due to the lack of standard natural enemies, they would probably dominate our ecosystem, which, according to the WIRED article, could eradicate our standard life in a few centuries.

    The first such mirror cell could even be 3D printed (frozen) with some AFM. And it might turn out to be extremely dangerous already, especially if it adapts to digest our D-glucose, for example settling in the colon and producing toxic metabolites.

    Therefore, this topic requires public attention and discussion - before (not after) we find out that mirror cells are already living, e.g. in some Chinese lab, in 10-20 years.

    Wyttenbach, the shape of this high-energy tail seems crucial, especially for scattering experiments ... but maybe also for the probability of fusion.

    Oks traces the start of this dispute about the tail's shape to Gryzinski's papers of the 1960s - showing disagreement of the quantum considerations especially for plasma and scattering scenarios, repaired by him using a classical approximation with radial ("free-fall") trajectories of the electron.

    In the 2001 paper Oks was able to repair the quantum considerations by considering a wavefunction singular at the center - with the electron affecting the nucleus, which again seems essential for the probability of fusion.

    I have recently found a 2001 Eugene Oks paper with a few dozen references claiming that there is a real problem with the standard quantum description of the simplest case: the ground state hydrogen atom:…8/0953-4075/34/11/315/pdf

    So the question is the high-energy tail in the momentum distribution (HTMD): the standard quantum derivation leads to a 1/p^6 tail.

    The paper claims that experiments rather suggest a 1/p^4 type of tail:

    "The point we are trying to make is that the above fundamental dispute still remains unresolved: the experiments seem to favour a HTMD of ∼1/p^k , where k is at least 1.5 times smaller than in the quantum HTMD"

    I have tried to discuss this fundamental issue on physicsforums, but there is no reply:…istribution-1-p-6.919070/

    ps. A gentle introduction to Maximal Entropy Random Walk -

    showing why standard diffusion models are only an approximation (of the Jaynes maximum uncertainty principle required by statistical physics models), and that, when doing diffusion right, there is no longer a disagreement with the thermodynamical predictions of QM (Anderson localization).

    Is there a mathematical proof that the electron shouldn't radiate in the quantum description of the atom?
    We rather have experimental proofs that the electron is extremely tiny (<10^-21 m in a Penning trap) - it cannot be objectively smeared into the 10^-10 m quantum cloud ... so there is hidden dynamics there, and so acceleration ...
    We just don't understand photons and their production, including synchrotron radiation.

    Regarding e-capture, I didn't mean capture by a proton, but by nuclei having abundant protons, which reduce their energy by capturing an electron - see the Wikipedia article with examples:
    Or internal conversion, also requiring the electron to get to ~10^-15 m distance:

    update: physicsforums discussion…-orbit-corrections.899922

    Wyttenbach, we already had this discussion - an ultimate argument against spherical or circular trajectories is, for example, electron capture ( ): the nucleus can capture an electron from an orbital, which requires it getting to ~10^-15 m distance ... in contrast, in the Bohr model it is at 10^-10 m - there is no chance for electron capture.
    You need nearly-zero angular momentum ("free-falling") trajectories for electron capture; there are also dozens of other arguments and papers showing the agreement of such a free-fall atomic model with many experiments, like various scattering scenarios, calculating the diamagnetic coefficient, the Stark effect ... see these peer-reviewed papers:

    Wyttenbach, we don't understand photons and their formation, including synchrotron radiation.

    First of all, the electron cannot fall onto e.g. a proton, as p + e -> n would require investing huge energy (782 keV).
    The standard explanation for the lack of synchrotron radiation is forming a standing wave (resonance with the field, described by QM) due to the Bohr-Sommerfeld condition: closed trajectories, with the electron's clock (de Broglie's, zitterbewegung) performing an integer number of ticks during a single closed orbit, like in the quantization for Couder's droplets:

    Without finding such a standing wave (resonance with the field), there would be additional fluctuations - energy to release.
    Ground state hydrogen is just the lowest-energy dynamical equilibrium state of the p + e system. It cannot be them joining together, as that would need huge energy.
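    The standing-wave condition can be made quantitative with a textbook sketch (the standard Bohr-Sommerfeld derivation, not specific to the free-fall picture discussed here): requiring an integer number of de Broglie wavelengths per closed orbit, n*lambda = 2*pi*r, together with the Coulomb force balance m*v^2/r = e^2/(4*pi*eps0*r^2), reproduces the atomic 10^-10 m scale.

```python
# Bohr-Sommerfeld standing-wave condition: n*lambda = 2*pi*r with
# lambda = h/(m*v) gives m*v*r = n*hbar; eliminating v via the Coulomb
# force balance yields the Bohr radii.
from scipy.constants import e, epsilon_0, m_e, hbar, pi

def bohr_radius(n):
    return 4 * pi * epsilon_0 * hbar ** 2 * n ** 2 / (m_e * e ** 2)

print(bohr_radius(1) * 1e10)  # ~0.53 (angstroms): the 10^-10 m atomic scale
```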