The church of SM physics

  • If the spectra are background-subtracted, everything you see may be the effect of a (single) unsmoothed background, which would correlate with both. Smoothing the background can, of course, itself lead to artifacts from the smoothed background noise.


    That would mean poor processing: to background-subtract you should take a number of runs of the background (with the same integration time as the live traces) and average them, so that background noise is low.
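
    A minimal sketch of that averaging-and-subtraction step, assuming the spectra are stored as per-channel count arrays (all names and numbers below are illustrative, not the authors' actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# K hypothetical background runs, each with the same integration time and
# channel binning as the live trace (1024 channels, ~50 counts/channel).
K = 10
background_runs = rng.poisson(lam=50.0, size=(K, 1024))
live = rng.poisson(lam=55.0, size=1024)  # live trace: background + signal

# Averaging K runs reduces the background-estimate noise by a factor sqrt(K),
# so the subtracted spectrum is dominated by the live trace's own noise.
mean_background = background_runs.mean(axis=0)
net = live - mean_background
```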


    Because, as many have said, interpreting these spectra is complex and can be done wrong quite easily, I think it is helpful to have complete transparency: raw data published somewhere, and a clear description of the processing done and the methodology used to derive any processed data a paper refers to. Good peer review would probably force much of this. Or an author who understood why it was important would do it anyway.

  • That would mean poor processing: to background-subtract you should take a number of runs of the background (with the same integration time as the live traces) and average them, so that background noise is low.

    Exactly this (averaging backgrounds) is not serious science. Here we used two different backgrounds (solid angles), with and without the experiment, and we always use the maximum of both to discriminate a signal. Averaging gives you many more signals, albeit some of them can be artifacts, i.e., random peaks.

  • Exactly this (averaging backgrounds) is not serious science. Here we used two different backgrounds (solid angles), with and without the experiment, and we always use the maximum of both to discriminate a signal. Averaging gives you many more signals, albeit some of them can be artifacts, i.e., random peaks.

    Background averaging is just fine if done properly. The random peaks you mention are exactly the sort of thing that averaging should reduce.
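
    A quick numerical illustration of that point, using synthetic Poisson backgrounds (two runs, many channels; all numbers illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
runs = rng.poisson(lam=100.0, size=(2, 100_000))  # two background runs

# Per-channel scatter: one run vs. the average of the two.
print(runs[0].std())            # ~10.0 (sqrt(100)): random peaks survive
print(runs.mean(axis=0).std())  # ~7.1  (sqrt(100/2)): averaging damps them

# The channel-wise maximum of the two runs is biased high, so it raises
# the discrimination threshold instead of reducing the noise.
print(runs.max(axis=0).mean())  # ~105.6, above the true background of 100
```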


    There is something about how you are taking and processing your backgrounds and your other spectra that just isn't making sense. I think that there is something about your procedure you haven't explained and that is why people aren't understanding what you are saying.


    In particular, I don't understand the relationship between your backgrounds and the active fuel in your setup. Why do you say ... "long run backgrounds cannot be used to discriminate lines as possible bursts are hidden"? What bursts? Bursts of what?

  • If you do short measurement runs then these are prone to background peaks. Better you know them!

    I don't understand how this responds to any of the observations I just made. Nor do I understand how it fits with anything you have said earlier.


    I'm not even sure what you mean here by a background peak. Can you point out for me what you regard as a background peak in your Figure 2?

  • I don't understand how this responds to any of the observations I just made. Nor do I understand how it fits with anything you have said earlier.


    I'm not even sure what you mean here by a background peak. Can you point out for me what you regard as a background peak in your Figure 2?

    I'm not sure where it occurs, but the K-40 peak is a normal natural-background hump at 1460.8 keV. Depending on what one is doing, if a residual lump remains there after background subtraction, then the background signal is still present.
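
    For what it's worth, such a residual lump can be tested rather than eyeballed. A sketch, assuming a linear energy calibration and per-channel count arrays (the window width and all names are hypothetical):

```python
import numpy as np

# Hypothetical calibration: 1024 channels spanning 0-3000 keV.
energies = np.linspace(0.0, 3000.0, 1024)
k40_window = (energies > 1440.0) & (energies < 1480.0)  # around 1460.8 keV

def k40_residual(live, background):
    """Net counts left near the K-40 line after background subtraction,
    with a rough Poisson error bar on that residual."""
    excess = (live[k40_window] - background[k40_window]).sum()
    sigma = np.sqrt(live[k40_window].sum() + background[k40_window].sum())
    return excess, sigma  # an excess beyond ~3*sigma suggests the
                          # background was not fully removed
```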

  • Can you point out for me what you regard as a background peak in your Figure 2? Or post the raw data?

    BACKGROUND

    As pointed out in earlier posts by me AND IN THE PAPER: not much can be seen in screenshots.


    The analysis of the whole data set, which is equivalent to >100,000 screenshots, is necessary.

    Here more than 300 lines are active, high above the background, at the “same” time, and some are overlapping or pretty close. So only an inspection of the histogram file can finally tell the truth.


    If the entity BruceH wants to pursue serious analysis rather than a feigned, dilettante interest, then I suggest it supply name, rank, and serial number on ResearchGate, where the paper is also available:

    https://www.researchgate.net/publication/356972251_A_new_experimental_path_to_nucleosynthesis


    At first sight our measurements looked like chaos, and only the painful work of going down to the histogram/channel level allowed a useful interpretation.

    As a consequence we had to develop a new analyzing method that could deal with a broad range and large number (>300) of different lines above background. Doing this manually is possible for a single spectrum and some key lines, but for hundreds (spectra and lines) we had to develop new software.

    Finally, one year after the first breakthrough measurement, everything was in place.

    In a highly active fuel, up to 80 lines are more than a factor of two above background.

    The strongest lines are more than tenfold above it.
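
    A minimal sketch of the kind of threshold scan described here (a line counted as active when its channels sit a fixed factor above the background estimate; names and thresholds are hypothetical, and the authors' real method includes peak grouping and overlap handling this omits):

```python
import numpy as np

def active_lines(spectrum, background, factor=2.0, min_counts=10):
    """Return channel indices where the spectrum exceeds `factor` times
    the background estimate (the factor-of-two criterion above)."""
    spectrum = np.asarray(spectrum, dtype=float)
    background = np.asarray(background, dtype=float)
    hot = (spectrum >= factor * background) & (spectrum >= min_counts)
    return np.flatnonzero(hot)

# Usage: channels = active_lines(histogram, mean_background, factor=2.0)
# The ">10-fold" lines would be active_lines(..., factor=10.0).
```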

  • Wyttenbach

    What do the numbers on the vertical axis mean in the spectra in Figures 1 and 2? Here, for instance, is Figure 1:

    [Figure 1 of the ResearchGate manuscript]

    I had initially thought that the numbers were event counts (over your 10-minute sampling time) but, as I explained earlier, the lack of Poisson noise indicates that this is not so. Now having looked at the Theremino software and its documentation I think that you must have had the "Integr. time" button pressed while you took the spectra. If this is true then the numbers do not represent actual counts. Instead, they represent quantities calculated from some sort of averaging procedure that is engaged when the "Integr. time" button is pressed.


    Do you know what the numbers mean?

  • Did you have the "Integr. time" button in the Theremino control dialog box pressed when you took the data for Figures 1 and 2?

    Wyttenbach

    I see that in response to my question you have inserted a laughing face icon.

    Does this mean that you intend not to answer my question?


    Figures 1 and 2 of your ResearchGate manuscript definitely do not show event counts along their vertical axes. If these really were counts, the Poisson noise would be a much larger contribution to the spectra than is seen. I would like to find out what is being portrayed here. Please take this opportunity to answer reasonably.
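
    The check behind this claim can be made explicit. If two spectra are independent raw event counts, the per-bin differences scaled by their Poisson errors should scatter with a standard deviation near 1; a sketch (the arrays are hypothetical per-bin values read off the figures):

```python
import numpy as np

def poisson_z_spread(spec_a, spec_b):
    """For independent raw counts of the same source, (a - b)/sqrt(a + b)
    should be roughly unit-variance. A spread far below 1 means the
    plotted values cannot be raw event counts."""
    a = np.asarray(spec_a, dtype=float)
    b = np.asarray(spec_b, dtype=float)
    mask = (a + b) > 0
    z = (a[mask] - b[mask]) / np.sqrt(a[mask] + b[mask])
    return z.std()
```

    If the two samples genuinely differed, the spread should come out above 1, which makes a value well below 1 even harder to square with raw counts.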

  • Once more: this is a delta spectrum, modified by the Theremino parameters for nearby bucket contributions.


    A real spectrum is a set of histogram values. In the tables I form the deltas from the buckets.


    The delta spectrum is here to show which regions contain the most signal, and that the reaction globally looks stable over a range of temperatures. But not the values for certain isotopes; those are the key values we won't show....

  • Once more: this is a delta spectrum, modified by the Theremino parameters for nearby bucket contributions.


    A real spectrum is a set of histogram values. In the tables I form the deltas from the buckets.

    Thank you for your answer. I find, however, that it does not clear up my confusion.


    I mentioned in a previous post (here) that the two spectra in your Figure 1 (derived from activity at two different temperatures) are too similar. They are so similar that they cannot simply be showing background-subtracted event counts, as you claim. Instead, I think that these are somehow showing averaged counts. If these were straightforward event counts, the probabilistic nature of radioactive decay means that the standard deviation around each bin count would be equal to the square root of the bin count itself. Instead, the two spectra you show in Figure 1 differ only by fractions of a standard deviation at most energy levels. Modifications "... by the Theremino parameters for nearby bucket contributions" would not explain this near similarity between spectra from completely different samples.


    From experimenting with the Theremino software, I believe that the spectra you are showing in Figures 1 and 2 of your ResearchGate manuscript were gathered while a "progressive integration" algorithm was engaged (i.e., the "Integr. time" button was pressed). Is this so? If this algorithm is engaged while data are being taken, the values in both the data histograms and the screen image are not event counts.
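
    I don't know the exact algorithm Theremino uses, but a "progressive integration" control plausibly behaves like an exponentially weighted running average over successive short spectra. A sketch of why that suppresses the Poisson scatter (the form of the smoothing is my assumption, not documented fact):

```python
import numpy as np

def progressive_integration(frames, tau=10.0):
    """Exponentially weighted running average over successive spectra.
    NOTE: this is a guess at the kind of smoothing an "Integr. time"
    setting applies; Theremino's actual algorithm may differ."""
    alpha = 1.0 / tau
    avg = np.zeros_like(frames[0], dtype=float)
    for frame in frames:
        avg += alpha * (frame - avg)  # each new frame nudges the average
    return avg  # smooth, low-variance values; no longer event counts
```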

  • As said, you never did measure a spectrum or a background. It is never random....

    This is a technical feature of radioactive decay that you appear not to understand. I am surprised. Please ask your friends on this point, or consult a textbook.


    While the average rate of radioactive decay is a stable physical property of nuclei, the actual timing of decay events is a random process. One therefore expects the arrival of events to be a Poisson process and, for low event counts (of the type you show), Poisson noise will be dominant. This noise, however, becomes a smaller and smaller part of the signal as the counts in each bin increase. That is why long-duration samples are desirable -- they increase the bin counts and so reduce the relative impact of Poisson noise. If you are prevented from taking long-duration samples because your conditions are not stationary, then averaging of noncontiguous samples to reduce Poisson noise is appropriate, as THHuxleynew pointed out.

  • Did you ever hear about the detection angle? How many decays will you need to get a stable count for 1/1000 of the full solid angle? How random will this final number be?


    As said before: go to a lab first and learn how to do it....
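
    For the record, the arithmetic behind these questions is straightforward. A detector subtending 1/1000 of the full solid angle records, on average, one event per ~1000 decays, so collecting N counts requires roughly 1000 * N decays. The relative fluctuation of those N counts is still sqrt(N)/N = 1/sqrt(N): a bin with 100 detected counts (from ~100,000 decays) scatters by about 10%. The geometric efficiency sets the rate; it does not remove the Poisson character of the detected counts.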

  • I see that in response to my question you have inserted a laughing face icon.


    Tiresome, BruceH... Perhaps it has a reading problem:

    "

    BACKGROUND

    as pointed out in earlier posts by me AND IN THE PAPER.. not much can be seen in screen shots


    the analysis of the whole data set which is equivalent to >100,000 screenshots is necessary..

    Here more than 300 lines are active - high above the background - at the “same” time and some are overlapping or pretty close. So only an inspection of the histogram file finally can tell the truth.


    If the entity BruceH wants to pursue serious analysis rather than feigned? dilettante? interest

    t than I suggest it supply name rank serial no.on researchgate where the paper is also available..

    https://www.researchgate.net/publication/356972251_A_new_experimental_path_to_nucleosynthesis


    At first sight our measurements did look like chaos and only the painful work of going down to histogram/channel level allowed a useful interpretation.

    As a consequence we had to develop a new analyzing method that could deal with broad range/ large number (> 300) of different lines above background. Doing this manually is possible for a single spectrum and some key lines, but for hundreds (spectra & lines) we had to develop new software.

    Finally one year after the first break-through measurement everything was in place.

    In a highly active fuel up to 80 lines are more than a factor of two above background.

    The strongest lines more than 10 fold
