RobertBryant Member
  • Male
  • 68
  • from Sydney, Australia
  • Member since May 10th 2015

Posts by RobertBryant

    But ITER HF research can be justified on grounds of scientific & technological content



    What specific grounds and evidence justify the exorbitant hundred-billion-dollar ITER gravy-train expenditure,

    compared with LENR investment such as Industrial Heat's thousand-dollar expenditure on Hagelstein and Lu's latest research?


    [Embedded video: www.youtube.com]

    PS - I'm also all for those (Hagelstein etc) who are trying to find LENR theories. That is a worthy endeavor too, and good science.


    I guess THHnew is not settled on whether Hagelstein's endeavor is good science or fringe science,

    judging by this cursory critique of Hagelstein's (and Siyuan Lu's) latest 2018 Co-57 work

    (funded partially by Industrial Heat; also patent applied for):


    The second one (Hagelstein and Lu)

    is a failure to discover data that would validate a novel theoretical mechanism,

    together with some slightly (low-level) anomalous data that was not previously predicted,

    and for which a few candidate explanations are given.

    However, the data is weak, the explanations are speculative,

    and other (also speculative) explanations for this type of small departure from an exact exponential might exist.
    So I don't see the second as LENR: unless you count the primary negative LENR result.

    And its relatively weak publication is explained by the fact that it is a negative result on a fringe theory.

    THH, in "News about Woodford and Industrial Heat"

    https://iscmns.org/2018/11/jcmnsv27/

    Observation of Non-exponential Decay in X-ray and γ Emission Lines from Co-57

    Florian Metzler, Peter Hagelstein and Siyuan Lu, 2018
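
    To make concrete what a "small departure from an exact exponential" would look like in such data, here is a minimal Python sketch using synthetic counts only (not the Metzler, Hagelstein and Lu dataset): fit a straight line to log counts versus time and inspect the residuals; a pure exponential leaves only counting noise, while a non-exponential component shows up as structure in the residuals.

    # Sketch: testing decay counts for departure from a pure exponential.
    # Synthetic, illustrative numbers only; not data from the Co-57 paper.
    import numpy as np

    half_life_days = 271.8                    # approximate Co-57 half-life
    lam = np.log(2) / half_life_days          # decay constant, 1/day

    t = np.linspace(0, 400, 60)               # measurement times, days
    rng = np.random.default_rng(0)
    counts = rng.poisson(1e6 * np.exp(-lam * t))   # ideal decay plus counting noise

    # Fit ln(counts) = intercept + slope * t; the slope estimates -lambda
    slope, intercept = np.polyfit(t, np.log(counts), 1)
    residuals = np.log(counts) - (intercept + slope * t)

    print(f"fitted half-life: {np.log(2) / -slope:.1f} days")
    print(f"residual RMS (log scale): {residuals.std():.2e}")
    # A pure exponential leaves residuals consistent with counting noise;
    # systematic structure in the residuals would signal non-exponential decay.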

    [Embedded video: www.youtube.com]


    "This then became the beginning of the calculus which was to

    grow into the Standard Model (Weinberg, 1995).

    After the war, quantum electrodynamics was developed to explain

    two experiments, one by Lamb and one by Kusch,

    which found small deviations from non-relativistic quantum predictions.


    To do this they invented the single virtual photon,

    so they did not have to deal with potentials,
    which they did not understand.

    This was a problem that went back to the 18th century that had to be resolved.

    The Standard Model followed a similarly damaged path

    using a linear relationship with a Lagrangian instead....."

    I've never liked whingers.

    Seems to be a personal comment unrelated to QED.

    A typical THH critique.

    every single product of modern industrial society is immensely complicated by the standards of the past.

    The R20 could have been built with existing technology sixty years ago.

    The first R20-derived commercial reactor will be a lot less complicated than the

    first nuclear reactor.

    The world's first "commercial nuclear power station", Calder Hall at Windscale, England,

    was opened in 1956 with an initial capacity of 50 MW per reactor (200 MW total).

    The fact that a 3 kW / 300 W-input generator made by hand-burnishing of mesh is only available

    in 2019 is a testimony to the intensity of the 30 years of active suppression

    of LENR R&D.

    Imagine if the thermionic valve manufacturers had suppressed transistor R&D in the fifties

    for thirty years.

    also courtesy of John Wallace

    "

    Currently cold fusion replication

    now number in the thousands but the original discovery

    in 1989 (Fleischmann and Pons, 1989) was only one of a

    number different process which include fission from fracture

    (Carpinteri et al., 2015) and dust enhanced fission

    and fusion in microwave cavities (Egely, 2016), with recent

    work (Mizuno and Rothwell, 2019) removing some of

    the material road blocks to engineering application "



    https://castinganalysis.com/files/DoE_comment_rev1.pdf

    Jed tells us the values are recorded from an anemometer.


    The power of the blower, V x I, is used as a measure of flow speed to get the air mass flow rate.


    Obviously one does not want to do anemometer traverses all day long, every minute of the day.

    These day-long ... week-long reactor monitorings are like watching paint dry.


    The V x I readings are correlated with the anemometer traverses,

    and this correlation changes if there are slight modifications to the airbox/blower configuration.


    There is no magic physics involved... just fluid mechanics 101 and an x-y correlation (sketched below).
    The endless hoopla about this matter reveals much about the peanut crowd.

    Let's see what howls from the gallery this will generate.
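
    For the record, the x-y correlation being described here takes only a few lines. A minimal Python sketch with made-up calibration numbers, not Mizuno's actual data: pair the anemometer-traverse mean velocities with the blower's V x I readings, fit a least-squares line, and use the fit to convert logged blower power into an estimated air mass flow rate.

    # Sketch of the blower-power vs. anemometer correlation described above.
    # All numbers are hypothetical placeholders, not measured values.
    import numpy as np

    # Calibration points: blower electrical power V*I (W) vs. traverse-mean air speed (m/s)
    blower_power_w = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
    mean_speed_ms  = np.array([2.1, 2.9, 3.6, 4.2, 4.7])

    # Least-squares straight line: speed ~ a * power + b
    a, b = np.polyfit(blower_power_w, mean_speed_ms, 1)

    def mass_flow_rate(power_w, duct_area_m2=0.0034, rho_air=1.18):
        """Convert a logged V*I reading (W) to air mass flow rate (kg/s) via the fitted line."""
        speed = a * power_w + b                  # m/s, from the calibration correlation
        return rho_air * duct_area_m2 * speed    # kg/s

    print(f"fit: speed = {a:.3f} * P + {b:.3f}")
    print(f"mass flow at P = 4.5 W: {mass_flow_rate(4.5) * 1000:.2f} g/s")

    If the airbox or blower configuration changes, the calibration points are simply re-measured and the line re-fitted; that is the whole of the "correlation" at issue.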





    The Standard Model origin (in a letter to the DOE stalwarts)


    courtesy of John Wallace


    "This then became the beginning of the calculus which was to

    grow into the Standard Model (Weinberg, 1995).

    After the war, quantum electrodynamics was developed to explain

    two experiments, one by Lamb and one by Kusch,

    which found small deviations from non-relativistic quantum predictions.


    To do this they invented the single virtual photon,

    so they did not have to deal with potentials,
    which they did not understand.

    This was a problem that went back to the 18th century that had to be resolved.

    The Standard Model followed a similarly damaged path

    using a linear relationship with a Lagrangian instead....."


    https://castinganalysis.com/files/DoE_comment_rev1.pdf

    The argument that point temperature measurements are a lousy way to estimate heat flow still remains for many reasons

    Air calorimetry:

    heat flow rate = mass flow rate x delta T x heat capacity

    If the mass flow rate is uniform across the flow, as it is in the turbulent flow out of the blower,

    and if the temperatures are not varying wildly every minute,

    then point measurements at one place (the RTD in the outlet duct), every 15 seconds, are quite sufficient

    to calculate delta T.


    Mizuno estimates the mass flow rate from an accurate calibration of the blower fan.
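
    As a worked illustration of that formula, here is a minimal Python sketch; the flow rate and temperatures below are placeholder values, not measurements from the R20 runs.

    # Air calorimetry: heat flow rate = mass flow rate x cp x delta T.
    # Illustrative placeholder numbers only, not measured R20 data.
    cp_air = 1005.0      # J/(kg*K), specific heat of air near room temperature
    m_dot  = 0.0205      # kg/s, air mass flow rate from the blower calibration
    t_in   = 22.0        # deg C, inlet (room) air temperature
    t_out  = 37.0        # deg C, outlet duct RTD temperature

    q_watts = m_dot * cp_air * (t_out - t_in)
    print(f"heat carried out by the air stream: {q_watts:.0f} W")   # ~309 W with these numbers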

    I still think that this along with his other "issues" -- such as the heat reaching the blower -- are bullshit, intended only to muddy up the discussion


    Tondemonai (not at all!).


    THHnew has a sacred and unpaid duty to ferret out significant errors.

    Plaudits to THHnew.


    Btw THHnew... have you calculated the surface temperature of a cooktop element of area 0.028 m², putting out 488 W,

    by Stefan's law, using an emissivity of 0.8 or so?


    https://www.engineeringtoolbox…-heat-transfer-d_431.html
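
    For reference, that Stefan's-law estimate is a one-liner. A Python sketch assuming a grey radiator facing roughly 20 C surroundings, with convection neglected (which would lower the real surface temperature):

    # Stefan-Boltzmann estimate of the cooktop element surface temperature.
    # Radiation-only balance to ~20 C surroundings; convection is neglected.
    SIGMA = 5.670e-8         # W/(m^2*K^4), Stefan-Boltzmann constant
    power_w     = 488.0      # radiated power, W
    area_m2     = 0.028      # element surface area, m^2
    emissivity  = 0.8
    t_ambient_k = 293.0      # ~20 deg C surroundings

    t_surface_k = (power_w / (emissivity * SIGMA * area_m2) + t_ambient_k**4) ** 0.25
    print(f"surface temperature: {t_surface_k:.0f} K ({t_surface_k - 273.15:.0f} C)")
    # ~790 K, i.e. roughly 520 C, under these radiation-only assumptions.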


    and do you still intimate that the velocity traverse method used in pipe ducts for the last eight decades or more

    for process monitoring and regulatory requirements

    can be in error by 20%?
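
    The traverse method itself is simple to state. A Python sketch with hypothetical velocity readings, using the equal-area rule for a circular duct: split the cross-section into equal-area annuli, sample the velocity at the centroid radius of each, and take the plain average as the duct mean velocity, which a single centre-line reading would overestimate.

    # Equal-area traverse for a circular duct: duct-mean velocity from point readings.
    # The velocities below are hypothetical; the equal-area rule itself is standard practice.
    import math

    duct_diameter_m = 0.066          # hypothetical duct bore
    n_annuli = 5

    # Measurement radii: centroid radius of each of n equal-area annuli
    R = duct_diameter_m / 2
    radii_m = [R * math.sqrt((i + 0.5) / n_annuli) for i in range(n_annuli)]

    # Hypothetical anemometer readings (m/s) at those radii, slower toward the wall
    v_points = [4.6, 4.5, 4.3, 4.0, 3.4]

    v_mean = sum(v_points) / len(v_points)   # equal areas, so a simple average
    v_centreline = 4.7                       # hypothetical single centre-line reading
    error_pct = 100 * (v_centreline - v_mean) / v_mean
    print(f"traverse mean velocity: {v_mean:.2f} m/s at radii {[round(r, 4) for r in radii_m]}")
    print(f"a single centre-line reading would overestimate the mean by {error_pct:.0f}%")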

    Air flow calorimeters are very difficult

    For the replication... use air flow calorimetry, for x thousand $.

    Look for delta temperatures in the airflow of 15 or 20 °C for the active reactor.

    Compare these with the calibration reactor (see the sketch below).
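
    A minimal Python sketch of that comparison, with placeholder numbers rather than measured data: at the same input power, the apparent excess heat is estimated from how much larger the active reactor's delta T is than the calibration reactor's.

    # Active vs. calibration comparison at the same input power.
    # Placeholder numbers; only the structure of the comparison matters here.
    cp_air = 1005.0          # J/(kg*K)
    m_dot  = 0.0205          # kg/s, same blower setting for both runs

    dT_calibration = 6.0     # deg C rise for the resistance-heated calibration run
    dT_active      = 18.0    # deg C rise for the active reactor at the same input power

    excess_w = m_dot * cp_air * (dT_active - dT_calibration)
    print(f"apparent excess heat: {excess_w:.0f} W")     # ~247 W with these numbers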


    I am sure GoogleX has a nice water calorimeter they have spent x million $ to set up... at UBC?

    Perhaps they could adapt this to the R20 for another x million $?