Volunteers needed to DeepDive into LENR (cheap, non-Edisonian science)

  • DeepDive, like Watson, is a system for extracting value from dark data. Like dark matter, dark data is the great mass of data buried in text, tables, figures, and images, which lacks structure and so is essentially unprocessable. DeepDive is used to extract sophisticated relationships between entities and to make inferences about facts involving those entities. I believe that the scientific knowledge to prove, explain, or disprove LENR is available on the web (patents, forums, papers sometimes unrelated to the field) and accessible, but not readable, because the information is scattered and sometimes trolled in never-ending debates or interpretations.
    DeepDive is a trained system that uses machine learning to cope with various forms of noise and imprecision. It might help this community solve some of the hard problems in LENR (finding a theory, recipes, or applications). I understand that not everyone here is able to experiment, for various reasons, and I know that a lot of people are impatiently waiting for recipes, but I think that everyone, sceptics and believers alike, can collaborate to set up teams to feed a DeepDive system and perhaps find answers. In my opinion, excess heat is not the only answer; if we could use a cheap tool to understand the phenomenon, it might open many doors.
    I don't know if it will help answer all the questions, but I think machine learning will help narrow down the variables. A knowledge base built with DeepDive can then be fed into other predictive algorithms to build models (a rough sketch of what such a pipeline could look like is at the end of this post).


    You can find DeepDive here: http://deepdive.stanford.edu/
    Examples of DeepDive applications:
    MEMEX - Supporting the fight against human trafficking, which was recently featured on Forbes and is now actively used by law enforcement agencies.
    PaleoDeepDive - A knowledge base for paleobiologists with quality higher than that of human volunteers.
    GeoDeepDive - Extracting dark data from geology journal articles.
    Wisci - Enriching Wikipedia with structured data.


    (Embedded YouTube video)
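
    To make this concrete, here is a minimal sketch of the kind of pipeline I have in mind: scan a folder of plain-text papers, pull out candidate (material, temperature) mentions, and write them to a simple table that other predictive models could consume later. The file names, keyword list, and pattern are placeholders of my own; a real DeepDive application is driven by DDlog and SQL rather than a script like this.

    Code

    # Rough "dark data" pass over LENR papers: pull out candidate
    # (material, temperature) mentions and store them as rows a later model can use.
    # The file glob, the keyword list, and the regex are illustrative assumptions.
    import csv
    import glob
    import re

    MATERIALS = ["nickel", "palladium", "titanium", "deuterium", "hydrogen"]
    TEMP_PATTERN = re.compile(r"(\d{2,4})\s*(?:deg\s*C|°C|C\b)")

    def extract_candidates(text):
        """Yield (material, temperature_C, sentence) triples found in one document."""
        for sentence in re.split(r"(?<=[.!?])\s+", text):
            lowered = sentence.lower()
            temps = TEMP_PATTERN.findall(sentence)
            for material in MATERIALS:
                if material in lowered and temps:
                    yield material, temps[0], sentence.strip()

    def build_knowledge_base(paper_glob, out_csv):
        """Scan plain-text papers and dump candidate facts into a CSV 'knowledge base'."""
        with open(out_csv, "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["material", "temperature_C", "evidence_sentence"])
            for path in glob.glob(paper_glob):
                with open(path, errors="ignore") as f:
                    for row in extract_candidates(f.read()):
                        writer.writerow(row)

    if __name__ == "__main__":
        # "papers/*.txt" is a placeholder for whatever corpus volunteers collect.
        build_knowledge_base("papers/*.txt", "lenr_candidates.csv")

    The resulting CSV would play the role of the knowledge base; anything from simple statistics to a trained model could then be run over it.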

  • DeepDive, like Watson, is a system for extracting value from dark data. Like dark matter, dark data is the great mass of data buried in text, tables, figures, and images, which lacks structure and so is essentially unprocessable. DeepDive is used to extract sophisticated relationships between entities and to make inferences about facts involving those entities. I believe that the scientific knowledge to prove, explain, or disprove LENR is available on the web (patents, forums, papers sometimes unrelated to the field) and accessible, but not readable, because the information is scattered and sometimes trolled in never-ending debates or interpretations.


    This idea is only partially viable. Maybe you could extract all the parameters of all the experiments done so far and provide a nice overview.


    But the main caveat is the lack of a first-order theory for nuclear physics. Thus everybody uses their own approximations, gauges, etc.


    This is why such an attempt will end up as a pile of unsortable data that nobody can correlate. To date there is no published protocol tight enough to allow replication of even a single experiment! mfp is trying to fill this gap, but is still only trying...

  • LENR is simple if we look at the cogent data. Look at the subatomic particles that the LENR reaction produces. When we see that strange matter is produced in D mesons, there can be only one reason, and then we have our answer.

  • I agree with you that LENR is simple, but as you said:

    Quote

    Look at the subatomic particles that the LENR reaction produces

    how many people here can show us those particles on their bench?
    You have been asking for some time now for a cloud chamber experiment, but I haven't seen anyone deliver what should be a simple experiment.
    Again, I am making no claims, but I believe that once people clean up the noise, the information they seek will be clear and many unnecessary brute-force steps will be avoided.
    The ongoing debates are fun/entertaining, but a bit noisy.
    I guess for a lot of people it's a hobby, so I understand.

  • Quote

    Again, I am making no claims, but I believe that once people clean up the noise, the information they seek will be clear and many unnecessary brute-force steps will be avoided.



    No cleanup is required. What is required is a cleanout of the excess-heat experimental paradigm. Follow the particles; that is where the road to LENR lies.

  • One of the biggest problems we face is a lack of standardization in the description of relationships, concepts, etc. In my brief Federal career I pushed for the development and use of ontologies for science and engineering, but it fell on deaf ears. However, the standardization of semantic meanings would go a long way toward identifying patterns of relations among data across experiments.
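
    As a toy illustration of what such standardization would buy: if every experiment is described with the same predicate names, a single query can pull out the same pattern across all of them. The vocabulary below (lenr:Experiment, lenr:cathodeMaterial, lenr:loadingRatio) is invented for this example, and the sketch uses the rdflib library.

    Code

    # Toy illustration of a shared vocabulary for describing experiments.
    # rdflib is a real Python library, but the namespace and predicate names
    # used here are made up for the example.
    from rdflib import Graph, Literal, Namespace, RDF

    LENR = Namespace("http://example.org/lenr#")  # placeholder namespace

    g = Graph()
    g.bind("lenr", LENR)

    # Two experiments from different groups, described with the same predicates.
    for exp_id, material, loading in [("exp1", "palladium", 0.85), ("exp2", "nickel", 0.60)]:
        exp = LENR[exp_id]
        g.add((exp, RDF.type, LENR.Experiment))
        g.add((exp, LENR.cathodeMaterial, Literal(material)))
        g.add((exp, LENR.loadingRatio, Literal(loading)))

    # Because the predicate names are shared, one query spans every experiment.
    results = g.query("""
        PREFIX lenr: <http://example.org/lenr#>
        SELECT ?exp ?material ?loading WHERE {
            ?exp a lenr:Experiment ;
                 lenr:cathodeMaterial ?material ;
                 lenr:loadingRatio ?loading .
        }
    """)
    for row in results:
        print(row.exp, row.material, row.loading)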

    • Official Post

    One of the biggest problems we face is a lack of standardization in the description of relationships, concepts, etc. In my brief Federal career I pushed for the development and use of ontologies for science and engineering, but it fell on deaf ears. However, the standardization of semantic meanings would go a long way toward identifying patterns of relations among data across experiments.


    And (bless you) so would writing in plain English. ;)

  • And (bless you) so would writing in plain English. ;)


    I try my best to use the terms that I see commonly used in science. But instead of being encouraged in this effort to conform to common scientific usage, I am lambasted for speaking in word salad.


    Clearly, each scientific specialty is like a foreign country with its own language and jargon. It takes a long time living in those far-off lands to become familiar with the customs and languages used there. Translation into common parlance is very difficult. For example, explaining quantum optics in terms that a six-year-old can understand takes a world-class communicator.

  • One of the biggest problems we face is a lack of standardization in the description of relationships, concepts, etc. In my brief Federal career I pushed for the development and use of ontologies for science and engineering, but it fell on deaf ears. However, the standardization of semantic meanings would go a long way toward identifying patterns of relations among data across experiments.


    It's no longer a problem; take a look at the latest NLP tools available. I have been finding unpublished scientific gems for a while now by creating semantic fingerprints (sparse distributed representations). The fingerprints allow direct semantic comparison of the meanings of any two words, showing thousands of semantic relations, and the approach is language- and domain-independent (a rough sketch of the comparison is at the end of this post).


    " For example, explaining quantum optics in terms that a six year old can understand takes a world class communicator" we have built world class interpreters beside google sadly only few are using most of them.

  • Using a bit of machine learning, I have been able to identify and synthesize a cheap plasmonic photocatalyst for hydrogen dissociation and loading on nickel with over-the-counter 1 W lasers, and it is far more efficient than the kilowatts of heat being dumped by replicators. I can even place it in a cloud chamber. I am now trying to figure out what an NAE is.

  • Of course there is no secret; I just used a shortcut, and everything is already published in papers. I will upload what I have (PDF and STL files). The photocatalyst I am using is aluminum nanoparticles, but there are many others; just a bit of tuning during synthesis is required.
