The church of SM physics

  • someone independent runs the analysis or selects the sample data periods if feasible.

    I forwarded the report to Steve Sesselmann here in Sydney... he may have a few weeks or months to spare for free. Not soon, though.



    I doubt whether Irina or Alla are still in the research or gamma business...or whether they are into detailed stats. I think they did some gamma work with tungsten and deuterium back in the day...with love from Russia.

    Gamma Emission Evaluation in Tungsten Irradiated By Low Energy Deuterium Ions, p. 258.

    https://www.lenr-canr.org/acrobat/ISCMNSproceeding.pdf

  • But if the background does fluctuate there could be a serious problem for your analysis because the background-subtraction process used when analyzing your test signals could fail. Any difference between the background you used for subtraction and the true background that was present during the experiment could appear as a false signal that would then go into your analysis system to be interpreted as gamma radiation emanating from your fuel.

    Of course you failed to read/understand the paper. It is written that we used two different backgrounds, a lower and a higher one. In fact we compared with the highest signal of both together...


    The findings published are breakthrough science! Thus we did it very carefully!

  • Of course you failed to read/understand the paper. It is written that we used two different backgrounds, a lower and a higher one. In fact we compared with the highest signal of both together...

    I am reading your ResearchGate manuscript and it is true that I am failing to understand some parts. It isn't an easy read, partly because in some places you describe things very briefly, in an opaque manner, and without explanatory figures.


    I noticed that you have 2 different sets of background measurements: a "uni directional" one taken somewhere "behind" the experiment, and another one "in view" of the experiment (in the same position as during acquisition of spectra from the active fuel?). The first one has higher average count than the second. But it is unclear what you are doing with these 2 backgrounds.


    You say "All results here are double checked against two different types of background to avoid temporary transparency effects with false positive/negative signals." But I am unsure what this means. Are you saying that you used first one background and then the other for background subtraction and got the same results both times? And what are you now saying about "both together"? I don't understand because this is not clearly explained.


    Also, talking about things that I read in your manuscript but don't understand, what does this mean .... "Long run backgrounds cannot be use to discriminate lines as possible bursts are hidden." What bursts? Bursts of gamma activity from the active fuel pellets? How would this interfere with a background which should be measured when active fuel is not around at all?

  • The findings published are breakthrough science! Thus we did it very carefully!

    It is not enough to take care in your research, you must also convince people that you took care. That means full explanations of materials and methods.


    This manuscript is in need of revision. Its explanation of the experiment -- how the fuel is prepared, the physical configuration of the measurement equipment, how the backgrounds were taken (and why they don't show reported peaks after 10 minutes) -- is incomplete. Also, basic checks as to whether measured spectra and the peaks found in them are real or artefactual are missing. What happens when all the background subtraction procedures are carried out on a known signal? Do you get what you expect? Does the peak-finding algorithm find peaks in signals that don't really have peaks (i.e., in background subtracted background, as suggested by Paradigmnoia)? What happens if you let the peak-identifying algorithm loose on these data armed with a different set of predictions (i.e., surrogate predictions that are not based on any reality)? Will it find peaks anyway?


    The procedures here are under-reported. More figures would help.
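
    One of those null checks is easy to sketch. Below is a minimal Monte Carlo (my own illustration; the mean background rate, bin count, and 3-sigma threshold are assumptions of mine, not values from the manuscript) that subtracts one simulated background run from another and counts how many bins a naive threshold detector would flag as "peaks" even though no source is present:

```python
import math
import random

random.seed(1)

def sample_poisson(lam):
    """Knuth's method; adequate for the small per-bin means at issue here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

MEAN_BG = 20.0   # assumed mean background counts per bin per 10 minutes (my guess)
NBINS = 566      # roughly the channel count discussed for these spectra

run_a = [sample_poisson(MEAN_BG) for _ in range(NBINS)]  # first background run
run_b = [sample_poisson(MEAN_BG) for _ in range(NBINS)]  # second run, still no source
diff = [a - b for a, b in zip(run_a, run_b)]             # background minus background

# The difference of two Poisson(20) bins has variance 40, i.e. an sd of about 6.3,
# so a one-sided 3-sigma cut should fire on roughly 0.1% of bins purely by chance.
sd = math.sqrt(2 * MEAN_BG)
false_peaks = sum(1 for d in diff if d > 3 * sd)
print(f"{false_peaks} of {NBINS} bins exceed 3 sigma with no source present")
```

    A peak finder that groups adjacent bins or uses a looser cut will fire more often than this, which is why running the full pipeline on background-subtracted background is such an informative null test.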

  • I did manually check/analyze all lines of the transmutation chain.


    You cannot conclude anything from a printed spectrum. Usually you show about 5 to 10 lines, but here we have > 300!

    You cannot print 1500 lines (buckets) on a sheet of paper...


    For normal fuels we just check the magnetic lines and follow up lines. This is enough for seeing what happens.


    As said, I hope you know what a signal-to-noise ratio of 3:1 means....

  • Wyttenbach


    Showing 5 to 10 lines would be just fine! Maybe just show the region covering the 10 lowest lines you claim to locate in the table of Figure 3 in your manuscript (both background and signal). That would only need 500 bins or so. I have still not been able to spot most of these lines in your Figure 1. Nor have I seen the "most active" 234Th line you say is at 63.3 keV in the background. It is uncomfortable for the reader when claims in one part of a manuscript are not backed up by the evidence presented in other parts.

  • I have still not been able to spot most of these lines in your Figure 1

    Bruce... a bit repetitive?

    You can't get much detail from a screenshot... which is Fig 1.

    In contrast, the 'histogram file' is effectively thousands of screenshots, and it's the 'delta' that counts for "active" vs BG.

    One needs the histogram results to pick out lines and frequencies... not one screenshot of a spectrum with poor resolution. As stated here in the text:

    "here more than 300 lines are active....so only an inspection of the histogram file can finally tell the truth"

    Let me repeat: Only an inspection... in case you miss it again.


    I looked at the gamma spectrum from Alla and Irina... Gennady... for D ions plus W.


    They said the peak lines were 63.8 keV, 50.63 keV and 73.6 keV.

    There appear to be many more than a few lines... which is which?

    Here is their screenshot... what's happening at 57-58 keV?

    Any ideas from your BG?

    Perhaps they used a crude histogram in 2007?

    Certainly the analysis was not based on one screenshot.


    https://www.lenr-canr.org/acrobat/ISCMNSproceeding.pdf

  • what's happening at 57-58 keV?

    It could be K-alpha... W has one near 57-58 keV.

    In fact, around the Ag-W range there are quite a few below 300 keV.

    Perhaps these contribute to 'noise', or they may be involved with LENR?


    K-alpha energies (eV):

    Zr  15774.914(54)
    Nb  16615.16(33)
    Mo  17479.372(10)
    Tc  18367.2(12) *
    Ru  19279.16(18)
    Rh  20216.12(20)
    Pd  21177.08(17)
    Ag  22162.917(30)
    Cd  23173.98(20)
    In  24209.75(22)
    Sn  25271.36(23)
    Sb  26358.86(25)
    Te  27472.57(27)
    I   28612.32(49)
    Xe  29778.78(10)
    Cs  30973.13(46)
    Ba  32193.262(70)
    La  33442.12(27)
    Ce  34720.00(29)
    Pr  36026.71(31)
    Nd  37360.739(70)
    Pm  38725.11(72)
    Sm  40118.481(60)
    Eu  41542.63(41)
    Gd  42996.72(44)
    Tb  44482.75(47)
    Dy  45998.94(51)
    Ho  47547.10(77)
    Er  49127.24(12)
    Tm  50741.475(92)
    Yb  52389.48(66)
    Lu  54070.39(70)
    Hf  55790.8(11) #
    Ta  57533.2(16)
    W   59318.847(50)
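
    As a rough sanity check on values like these, Moseley's law estimates the K-alpha energy from the atomic number alone. The snippet below is my own illustration (a textbook approximation, not part of the tabulation); note that it drifts low for heavy elements such as W, where screening and relativistic corrections matter:

```python
def moseley_kalpha_ev(z):
    # Moseley's law for K-alpha: E ~ 13.6 eV * (3/4) * (Z - 1)^2
    return 13.6 * 0.75 * (z - 1) ** 2

# Compare against three of the tabulated values (symbol, Z, measured K-alpha1 in eV)
for sym, z, measured in [("Zr", 40, 15774.914),
                         ("Ag", 47, 22162.917),
                         ("W", 74, 59318.847)]:
    est = moseley_kalpha_ev(z)
    print(f"{sym}: Moseley {est / 1000:.1f} keV vs tabulated {measured / 1000:.1f} keV")
```

    The tabulated K-alpha1 for W is 59.3 keV; its K-alpha2 partner sits near 58 keV, which is presumably the line relevant to the 57-58 keV question above.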
  • I am still trying to puzzle out why the properties of the background spectrum shown in Figure 2 of Wyttenbach's ResearchGate manuscript are so different from those claimed in the text (i.e., claimed peaks are not visible).


    To address this, Wyttenbach at one point says that his plots don't correspond 1:1 to what is contained in the histogram files, partly because ....

    The screen we used is about 1000 pixels for about 4000 channels.

    I have realized, however, that this argument is not relevant to Figure 2. In the screenshot of Figure 2, one can actually count screen pixels. It turns out that there are 25 pixels per square. That means, sure enough, that there are just about 1000 pixels across the screen, just as Wyttenbach says. But in Figure 2 Wyttenbach has chosen to show only part of the 4000 channels available. I calculate that the 0-360 keV region shown contains only about 566 channels (channels appear to span 0.6 keV from 0-300 keV and 0.9 keV above that). This means that we are seeing almost 2 pixels per channel, so there is room for a 1:1 portrayal of the values in the histogram files.
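
    The channel arithmetic is simple enough to write out (bin widths of 0.6 keV below 300 keV and 0.9 keV above are my estimates from the figure, as stated in the text):

```python
# Channels visible in the 0-360 keV window of Figure 2
channels_low = 300 / 0.6           # 0-300 keV at 0.6 keV per channel -> 500
channels_high = (360 - 300) / 0.9  # 300-360 keV at 0.9 keV per channel -> ~67
channels = channels_low + channels_high

pixels = 1000                      # screen width reported for the software
print(f"~{channels:.0f} channels, {pixels / channels:.2f} pixels per channel")
```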


    So, I repeat, where are the background lines mentioned in the text? Is there extra smoothing applied to the histogram values before they are shown in Figure 2? Wyttenbach mentions peak shaping in the Theremino software, but this is a step applied to the fluctuating voltage signal from the detector before it is binned by the multichannel analyzer, so it shouldn't affect the relationship between the information in the histogram files and what we are being shown. Is there another level of analysis, then? If there is, it should be described more carefully.


    It continues to be a concern that what Wyttenbach claims in the text part of his manuscript differs from the evidence he presents in his figures. It is a problem that could be partly alleviated by releasing the histogram files.

  • Wyttenbach


    In Section 2 of your ResearchGate manuscript, you say ...


    "We did use a bin resolution of 600 eV for the first 200keV what should be enough to classify the lines of interest. All spectra did run for 10 minutes.


    Here more than 300 lines are active - high above the background - at the “same” time and some are overlapping or pretty close."


    If by "Here" you mean the first 200 keV of the spectrum, I don't understand how this makes sense. If your resolution is 600 eV then you only have 333 channels available over the first 200 keV. So how do you empirically define 300 lines over these 333 channels? Is the 300 a predicted number only?

  • I continue to try to understand some of the results presented by Wyttenbach in his most recent ResearchGate posting (A new experimental path to nucleosynthesis).


    I have had trouble understanding the nature of the gamma spectra displayed in this work. In particular, in thinking about things, I don't understand why the noise in some spectra seems so low. Figure 1, for instance, shows background-subtracted spectra at 250C (top, below) and 380C (bottom, below). Both spectra are taken over 10 minutes.



    What we see here is a remarkable similarity between the 2 spectra. They are not identical -- after all, they are taken at different times and different temperatures -- but for the most part the bumps and valleys in the top spectrum also occur in the bottom one. The similarity does not just consist in the 2 spectra showing mainly the same peaks (which is OK), but in the peaks being the same height in both spectra. Indeed, if you superimpose these traces they fit on top of each other very closely at many, many points.


    My puzzlement here is that the event count in each bin of a spectrum should be roughly Poisson distributed (due to the random timing of gamma events). Following from this, the standard deviation of counts in each bin should be about the square root of the count number itself. Thus if we were to acquire the same spectrum again (over a similar interval) the count number in each bin should differ slightly from the first values. And, if we acquired and reacquired the same spectrum over and over, we should find that the count in any bin should vary with the square root of the mean count as just noted.


    Now, look at the red vertical line in the top spectrum (at about 49 keV). It happens to lie near the peak of a feature that is also present in the lower spectrum. In the top spectrum, the bin count at 49 keV is (estimating by eye) 19. In the bottom spectrum the bin count is about 18. These count values are close. Too close by my reckoning! I calculate that the standard deviation of counts at this energy should be about 5.5 (because total counts at this energy [background + signal] is 30 counts and the square root of 30 is about 5.5). In light of this, the 1-count difference between the upper and lower spectra at this energy is statistically unlikely.
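
    A quick simulation puts a number on this (the mean of 30 counts per bin is my reading of the figure; the Poisson sampler is a standard textbook method):

```python
import math
import random

random.seed(0)

def sample_poisson(lam):
    """Knuth's method; fine for means around 30."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# Two independent acquisitions of a bin whose true mean is 30 counts:
TRIALS = 50_000
close = sum(1 for _ in range(TRIALS)
            if abs(sample_poisson(30) - sample_poisson(30)) <= 1)
print(f"P(counts agree to within 1) ~ {close / TRIALS:.3f}")
```

    Agreement this close comes out at roughly 15% for any single bin, so one bin alone is only mildly surprising; the force of the argument is that the agreement holds at essentially every energy simultaneously, which for independent Poisson bins is astronomically unlikely.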


    You can repeat this observation for any energy shown in Figure 1 and get the same result. The difference between the 2 spectra at most energies is many times smaller than expected. So this is not just a statistical fluke that happens to occur at 49 keV. It is at all energies. And I don't understand why this is. Wyttenbach says a (undescribed) smoothing algorithm has been used before displaying the spectra, but it is not immediately apparent to me how this could account for such a radical reduction in variation between them.


    Does anyone know what might be going on?

  • My puzzlement here is that the event count in each bin of a spectrum should be roughly Poisson distributed (due to the random timing of gamma events).

    Typical beginner's error. A stable physical process delivers a stable gamma count. Why do you believe we can measure half-lives??

    10 keV = 17 bins, and a peak displayed as such in a spectrum spans at least three bins because the software does so. You just waste time with an interpolated curve that hides the reality...

    I give detailed peak counts in the table, including background....

  • Typical beginner's error. A stable physical process delivers a stable gamma count. Why do you believe we can measure half-lives??

    We differ on the most fundamental issues of the physics here.


    The radioactive decay processes I know about are probabilistic. To answer your question ... I believe we can measure half-lives as bulk phenomena, but on a microscopic scale the probabilistic nature of the process results in Poisson noise as I outlined.


    Thorium-234 has a half-life of 24 days. Suppose you have 2 atoms of thorium-234 sitting in a box. Are you somehow arguing that those 2 atoms will sit there until the 24 days are up and then one of them will decay? That is where you seem to be heading.
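
    The probabilistic picture is easy to make concrete. This is my own illustration, using the textbook exponential-decay model: draw random decay times for two 234Th atoms with a 24-day half-life and see how often both are still present after one half-life.

```python
import math
import random

random.seed(3)

HALF_LIFE = 24.0                       # days, 234Th
DECAY_CONST = math.log(2) / HALF_LIFE  # decay constant, per day

TRIALS = 100_000
both_survive = 0
for _ in range(TRIALS):
    t1 = random.expovariate(DECAY_CONST)  # random decay time of atom 1
    t2 = random.expovariate(DECAY_CONST)  # random decay time of atom 2
    if min(t1, t2) > HALF_LIFE:
        both_survive += 1

# For two independent atoms, P(neither decays within one half-life) = 0.5 * 0.5 = 0.25
print(f"both atoms still present after 24 days: {both_survive / TRIALS:.3f}")
```

    Each atom decays at a random moment; the half-life describes only the ensemble average, which is exactly why bin counts of a few dozen events carry Poisson scatter.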

  • To answer your question ... I believe we can measure half-lives as bulk phenomena, but on a microscopic scale the probabilistic nature of the process results in Poisson noise as I outlined.

    I doubt you ever entered a physics lab. Nor do I believe you have the faintest idea of how to use math in a lab....


    A gamma spectrum has nothing to do with noise. Only a part of the background is real noise... A calibration source typically has some 1000 Bq.

  • A gamma spectrum has nothing to do with noise. Only a part of the background is real noise... A calibration source typically has some 1000 Bq.

    A calibration source is 1000 Bq, but your detector is seeing nothing like this rate of gamma events. The bin counts you report are only on the order of 0-40 counts per 10 minutes and are thus susceptible to Poisson noise.


    For the low number of counts you report, Poisson noise is a major factor in the signal. If the counts were higher, the impact of this type of noise would be much smaller. The signal-to-noise ratio for Poisson noise = sqrt(n) where n is the bin count. In other words, the signal-to-noise ratio improves as the square root of the number of counts per bin. See https://radio.astro.gla.ac.uk/old_OA_course/pw/qf3.pdf
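
    To make the scaling concrete (the bin counts below are chosen by me to bracket the 0-40 range reported, plus one high value for contrast):

```python
import math

# Poisson noise: standard deviation is sqrt(n), so SNR = n / sqrt(n) = sqrt(n)
for n in (10, 30, 40, 1000):
    noise = math.sqrt(n)
    print(f"n = {n:4d}: noise ~ {noise:5.1f} counts, SNR ~ {n / noise:5.1f}")
```

    At 30 counts the SNR is only about 5.5; counts in the thousands per bin would be needed before Poisson scatter becomes negligible.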

  • A Fourier analysis without any previous algorithmic filtering of the data would resolve this discrepancy. We used FFT in biophysics to study single-channel voltage fluctuations on a then state-of-the-art (1980) PDP-11 computer system. The problem with modern computing is that the signal can become distorted by algorithmic noise, leading to cherry-picking and thus to spurious results. The answer, I think, is to compare FFT spectra of the active system with those of the control, which Wyttenbach has faithfully tried to transpose. :)
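
    A minimal sketch of the idea (my own illustration: a naive pure-Python DFT, with a 5-bin moving average standing in for whatever smoothing the acquisition software might apply). Smoothing suppresses high-frequency power, so comparing the power spectra of raw and processed traces, or of active vs control runs, can expose algorithmic filtering:

```python
import cmath
import math
import random

random.seed(2)

def dft_power(xs):
    """Naive DFT power spectrum, positive frequencies only (DC excluded)."""
    n = len(xs)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * t / n)
                    for t, x in enumerate(xs))) ** 2
            for k in range(1, n // 2)]

# A noisy trace (normal approximation to Poisson(20) bin counts) ...
raw = [random.gauss(20, math.sqrt(20)) for _ in range(128)]
# ... and the same trace after a 5-bin moving average (edges use shorter windows)
smooth = [sum(raw[max(0, i - 2):i + 3]) / len(raw[max(0, i - 2):i + 3])
          for i in range(len(raw))]

p_raw, p_smooth = dft_power(raw), dft_power(smooth)
half = len(p_raw) // 2   # look only at the upper half of the frequency range
ratio = sum(p_smooth[half:]) / sum(p_raw[half:])
print(f"high-frequency power after smoothing: {ratio:.2f} of the raw power")
```

    A trace that has been smoothed retains only a small fraction of its high-frequency Poisson fluctuation, which is one way the unexpectedly quiet spectra discussed above could be diagnosed from the data alone.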