Diadon Acs Experimenter/Researcher
  • Male
  • from Pacific NW, United States
  • Member since Oct 16th 2020

Posts by Diadon Acs

    Been a while, but I finally got back around to this. I've found Ollama to be a good project for running models locally, and you can use the Continue plugin in VS Code for a Copilot-like experience. My MacBook Pro M2 with 16GB is probably too constrained to go above 7B models.
    I have a pair of Nvidia Tesla P40s I can use for transfer learning, but I'm unsure of a good dataset to use or which tools to go with.
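    For anyone curious, here is a minimal sketch of querying a local Ollama server from Python once a model has been pulled (e.g. with `ollama pull mistral`); the prompt and model name are just placeholders.

```python
# Minimal sketch: query a locally running Ollama server over its REST API.
import requests

def ask_local_model(prompt: str, model: str = "mistral") -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_local_model("Summarize the Fleischmann-Pons experiment in two sentences."))
```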

    I was hoping I could add the corpus to a tool like PrivateGPT or AnythingLLM and avoid getting lost in LangChain and all its sub-tools and terminology, since I'm not a data scientist, just an engineer (Dev/SRE).
    I would like to figure out an architecture and process to collate the corpus, produce the new model, and push it to Hugging Face for consumers as open source.

    Yes, you're going to need a GPU with more than 16GB to run the larger models.
    I'm excited by some of the new hardware coming online from Nvidia and in particular Groq's chip architecture.

    One solution to your problem is to spin up rented hardware through one of the various "cloud" providers to fine-tune a pre-existing model.

    Either way you slice it, there is going to be a price tag on the compute. I would prefer to do it locally myself, but that is a challenge in training any model, fine-tuning or otherwise.
    There is a lot of power in embeddings and retrieval processes as well.


    You can pick a model on Hugging Face and train it under the Train section.


    There is even a "no-code" option called AutoTrain.

    Just make sure you have well-structured datasets for the particular model you are planning on using; the expected format is usually found in the data card.
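    As a rough sketch of checking that structure, you can load a dataset from the Hub with the datasets library and eyeball its columns against what the data card expects; the repo name below is hypothetical.

```python
# Minimal sketch: inspect a Hugging Face dataset's structure before training.
# "username/lenr-papers" is a hypothetical repo name; substitute a real one.
from datasets import load_dataset

ds = load_dataset("username/lenr-papers", split="train")
print(ds.column_names)   # e.g. ["instruction", "input", "output"]
print(ds[0])             # look at one record to confirm it matches the data card
```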

    Anasse Bari, David Nagel, et al. have structured data which may be useful.

    I will reach out and see if their datasets are open source, or if they will allow open access to their Solr server so we can make queries against it.
    This is my first time learning about Solr and Tika, so I'm not sure about the backend integration necessary yet.
    If I understand correctly, I believe they are working on making a chatbot that uses RAG and is perhaps fine-tuned on the data already?
    Not entirely sure tbh.

    Most of it is collected via Jed's Online Library using BeautifulSoup, SerpAPI, and the arXiv API, with pandas to put the JSON data into tables, exactly as Dave Nagel's team did.
    I also integrated it with LangChain using Python, which made the job a lot easier because I could use AutoGPT to carry out the tasks of collecting and reviewing data in parallel, and I would review the output for RLHF to ensure the data was free of errors.
    This did end up costing me about $100 in total on all the API calls, and countless hours. 😅
    It was a lot of fun though, and I would have liked to use some fine-tuned models before ICCF25, but as I said above, it gets expensive to train and deploy them.
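    For illustration, a stripped-down sketch of the arXiv-plus-pandas part of that kind of pipeline might look like the following; the query and output filename are just examples, not the exact ones I used.

```python
# Sketch of the collection step: pull paper metadata from the arXiv API and
# flatten it into a pandas table (BeautifulSoup and SerpAPI would cover
# sources arXiv doesn't).
import arxiv
import pandas as pd

search = arxiv.Search(query="low energy nuclear reactions", max_results=25)
rows = [
    {
        "title": r.title,
        "published": r.published,
        "summary": r.summary,
        "url": r.entry_id,
    }
    for r in arxiv.Client().results(search)
]
df = pd.DataFrame(rows)
df.to_json("lenr_arxiv.jsonl", orient="records", lines=True)
```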

    If you are trying to avoid as much code as possible while still having access to a local LLM, you could look at using LM Studio and the AutoTrain option from Hugging Face.
    Hopefully that was helpful to you. I have some datasets I have played around with, both structured and unstructured, on my Hugging Face if you're interested.

    🍻

    The equation is as found on the internet: Electric potential - Wikipedia.

    The larger the cluster of electrons, the greater the potential energy of an electron forced to escape the charge cluster. A sufficiently high-voltage electron can cause photoneutrons (Photoneutrons - an overview | ScienceDirect Topics).
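    For reference, the point-charge relations from that page are, roughly:

$$
V(r) = \frac{1}{4\pi\varepsilon_0}\,\frac{Q}{r}, \qquad U = qV
$$

    so, as I read it, a larger cluster charge Q raises the potential V and therefore the potential energy U an escaping electron has to overcome.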

    The photoproduction of neutrons is a key step in the sequence of elementary reaction steps that explain a data-derived, cluster-catalyzed, balanced nuclear reaction equation.

    Electrogravity (electron-gravity) as a cause of nuclear reactions. - Physics - LENR Forum (lenr-forum.com)

    This postulate does seem to have some very profound implications in our current scientific paradigm doesn't it?

    There may be some rough waters ahead as one boards that ship of hypotheticals.
    Luckily for me, I don't have any hope for tenure or a formal academic reputation to restrict me from exploring these topics.

    However, it is likely people will take me less seriously as well.

    Such is life! 😉

    There are several Dubinko papers in Jed's library. There is no reason not to link to them, or make quotes.

    Yes, there is a lot of fantastic work there by Dubinko.
    However, they are quasi-crystal-specific and not entirely practical at this stage of the ULT paper.
    I think his work is highly useful for experimental applications and computational modeling of materials.
    Again, this leads to some legal issues with IP in using them in LENR ARA, and I would need to get permission before I could proceed with incorporating their raw or processed datasets, which to my knowledge are still not open to the public?


    […]

    I don't have access to the latest paper they published titled,

    "GENERATION OF ACCELERATED PARTICLES IN SOLID MATRICES SATURATED WITH ISOTOPES OF LIGHT NUCLEI"

    So if you or anyone you know has access to the full paper, it would be greatly appreciated.


    Please allow me some time and have patience; my mathematics is not world-class, but rather mediocre at best.

    I'm only in the theoretical modeling game out of what seems like a necessity in an attempt to get some of these brilliant minds to work together, if at all possible. 😅

    I do enjoy his postulates, especially the concept of anharmonic oscillations that mediate nuclear reactions he calls "discrete breathers" and his suggestion to enhance the reactions using quasi-crystalline lattice structures.

    Unfortunately, Klee Irwin is quite savvy about copyright restrictions and securing intellectual property.
    So it appears I am unable to include Dubinko's work without special permission.

    Now I see what you were saying, Pete.
    You were correct in your understanding when we talked: ferromagnetic "flux pinning," as they call it.


    We can observe similar behavior with Pyrolytic Graphite flakes under a permanent magnetic field.
    It's nice to see there is still a large interest and scientific debates around this subject.

    Have you actually used nickel 63Ni28 and deuterium?

    This is a very good question.
    No, most likely 58Ni and H2 in an electrochemical environment.
    The idea is to breed D2 in the reactor over time as well, with a distillation column.
    The cathode is a very common Ni200 wire and blank control sample material is included.
    The anode is common jewelers Platinum mesh.

    Temperatures appear to have exceeded 1600 °C, but the measurements have been difficult when plasma initiates.
    I am testing a new solution for that as we speak.

    I got a recording of both samples sent.
    Here is the second of the two live recordings showing the reaction process and the power input, if you're curious.
    Unfortunately, as I have stated many times, there are no output power measurements due to the large EM field from the hydrogen-dense plasma, which interferes with my data connections.
    Efforts to isolate it have been futile so far due to price constraints.

    It has been an engineering challenge for me to isolate the temperature data signal from the large EMF.
    I haven't had time to attempt to calculate the field strength (V/m) from the incident site / point source, but in this video the sensor is about 1 meter away.
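    If I ever do that estimate, a back-of-the-envelope sketch would be something like the following, treating the plasma as an isotropic point source; the radiated power number is purely a placeholder assumption, not a measurement.

```python
# Back-of-envelope sketch: far-field strength of an isotropic point source,
# E ~ sqrt(30 * P) / d in V/m. P_radiated is a placeholder assumption.
import math

P_radiated = 10.0   # W, assumed radiated EM power (placeholder, not measured)
d = 1.0             # m, sensor distance from the incident site

E = math.sqrt(30.0 * P_radiated) / d
S = P_radiated / (4.0 * math.pi * d**2)   # power density, W/m^2

print(f"E-field ~ {E:.1f} V/m, power density ~ {S:.2f} W/m^2 at {d} m")
```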

    Again, this is on a very tight personal budget so please forgive the crude approaches.

    Appreciate your interest and if you have any suggestions I am open to them.

    Cold nuclear fusion can only be known by postulating that

    1) Cold nuclear fusion is a fundamental way of movement and existence of matter,

    2) The materiality of our World lies in its movement.

    My own hypothetical postulates have gone more in this direction over the years.
    However, we must prove it empirically and with open replication before I would accept these statements as true.
    This is science right?
    We should be cautious in our beliefs so as not to build a religion around them, even though sometimes their pursuit feels like a spiritual endeavor.

    Please:

    ⁶³₂₈Ni + ¹₁H → ⁶³₂₉Cu + e⁻ + ¹₁H + Q (energy)

    I have done this exact experiment in an easily reproducible way with H2 + KOH as the exchange medium.
    There has yet to be any isotopic analysis done, as I don't personally have the funding or equipment for such things.
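    For what it's worth, a rough sketch of estimating Q for the ⁶³Ni → ⁶³Cu + e⁻ step (the ¹H cancels on both sides of the equation above) from atomic mass differences would look like this; the masses are approximate table values and should be checked against a current mass evaluation before quoting the result.

```python
# Hedged sketch: Q from atomic mass difference for 63Ni -> 63Cu + e-.
# Masses are approximate literature values in atomic mass units (verify!).
U_TO_MEV = 931.494          # MeV per atomic mass unit
m_ni63 = 62.9296690         # u, approximate
m_cu63 = 62.9295977         # u, approximate

q_mev = (m_ni63 - m_cu63) * U_TO_MEV
print(f"Q ~ {q_mev * 1000:.0f} keV")   # on the order of tens of keV
```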


    Two samples are out for review, but these are time-consuming processes, so I understand why it would be low on people's priority lists, since it is work done for no financial compensation. 😊

    Thunderstorm Generator and the mysteries of Plasmoids and Vortex Math.


    [Embedded video: youtu.be]

    [Embedded video: youtu.be]

    I appreciate your passion and interest in abundant energy.
    However, I do think Malcolm Bendall is a kind of false messiah of science when he disregards so much of its collective body of work.

    Charisma and money can take a man so far, but how could a man exist without others?

    There have been several iterations of "free energy" messiahs that I have experienced in my short 12+ years of alternative energy exploration. Mr. Bendall is my fourth messiah experience in real time, but there are so many underrated scientific minds who have shared so much when one looks into all the aspects of the human collective.

    From my interpretation, he seems to have adopted the artist Walter Russell's interpretation of atomic physics, and those of other esoteric figures I could list, but he disregards empirical experimentation and scapegoats organizations of scientific minds. It is a travesty in science when people can't look each other in the eye as mutual beings first.

    I hope that we can continue to explore new frontiers of science to the best of our abilities with open-mindedness, yet still fundamentally grounded in our shared physical reality.
    Let the meta-theoretical be as open as the granular actualization of physical designs.
    Else, we may be trapped in our own limitations of understanding and self preservation.


    These are two of three dilemmas of AI as I see them.
    The third being the centralization of data and computational power for a select few humans, who would wield unequal power over global socioeconomics.
    I don't want to spout out my thesis on these dilemmas unsolicited, as it feels pretentious not being a certified "expert".
    Call it a continual case of imposter syndrome, I guess. 😅

    I have been going a little hard on the theoretical and imaginative punch lately.
    If ever there was a time in history when it was important to express thoughts into form, it is now.

    I appreciate it, as this will be very helpful for chat completion with some tweaking and .jsonl reformatting.
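    For anyone doing the same reformatting, a minimal sketch of wrapping question/answer pairs into chat-completion .jsonl looks roughly like this (the example pair, system prompt, and filename are placeholders):

```python
# Minimal sketch: write chat-completion fine-tuning records as one JSON
# object per line (.jsonl). qa_pairs is a placeholder for the real corpus.
import json

qa_pairs = [
    ("What is LENR?", "LENR stands for low energy nuclear reactions ..."),
]

with open("lenr_chat.jsonl", "w", encoding="utf-8") as f:
    for question, answer in qa_pairs:
        record = {
            "messages": [
                {"role": "system", "content": "You are a LENR research assistant."},
                {"role": "user", "content": question},
                {"role": "assistant", "content": answer},
            ]
        }
        f.write(json.dumps(record, ensure_ascii=False) + "\n")
```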

    Before I would ever release such a thing into the wild, I would appreciate your and others' review in the community to ensure its accuracy.
    A benchmarking of sorts, so it can be improved upon and become something that is useful.
    🍻

    An open-source base model for LENR that others could use for their own models would be good, but it would need a lot more GPUs, I think.

    It's a goal of mine 😊


    We can get pretty good results from using a mix of several 7B models each trained in specific domains.


    If you have tried the Mistral 7B model, it's pretty damn good for a local LLM with only 7 billion parameters.

    I have had some success with many different models besides OpenAI's ChatGPT models, but fine-tuning these models and delivering an end-point to other users has been a challenge.
    I can do it if I keep the code base simple, but once I start implementing more complex functions with APIs and Python library tools, it has been challenging to get everything to work properly.
    All part of the process of engineering a thing, I guess. 😅

    I regret to say the LENR-CANR ChatGPT is now OFF LINE. I deleted it. It was costing a lot and few people were using it.

    I'm sorry to hear that Jed.
    It should be relatively easy to create your own LLM using an open-source model run locally.
    You will just need a dedicated GPU with at least 6 GB of VRAM and maybe 20 GB of storage. The challenge is in being able to serve an endpoint for a WebUI.
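    One possible sketch of that endpoint piece, assuming an Ollama server is already running on the same machine, is a thin FastAPI wrapper like the following; the route name, model default, and port are arbitrary choices.

```python
# Sketch: expose a locally running model as a simple HTTP endpoint for a WebUI.
# Run with: uvicorn serve:app --port 8000   (if this file is saved as serve.py)
import requests
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    prompt: str
    model: str = "mistral"

@app.post("/chat")
def chat(req: ChatRequest):
    r = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": req.model, "prompt": req.prompt, "stream": False},
        timeout=300,
    )
    r.raise_for_status()
    return {"response": r.json()["response"]}
```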

    Something I am actively working on to sort out.


    I just made this "Low Energy Nuclear Reactors Autonomous Research Agent" (LENR ARA) to play around with if anyone is interested in using and testing it.
    There are many models and experiments I have been doing, so it was pretty easy to throw this OpenAI one together after reading your news.

    It should be able to do several tasks like data visualization, experimental suggestions, and mathematical formulations of the various theories of LENR, but you might need a ChatGPT Plus subscription to use it.

    It should be able to make visualizations of the knowledge datasets as well. If anyone has suggestions, feature or integration ideas, or missing information, let me know.

    I'm going to attempt to integrate it with lenrdashboard.com to see if I can implement a retrieval function without it using too many tokens.
    I am trying to move towards local models, but there are hardware restrictions as well as integration problems in delivering endpoints without taking up too many resources.
    Hopefully this cute little GPT I threw together is helpful to the community.
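    The retrieval idea, roughly sketched: embed the corpus once and only send the top few matching chunks to the model, so each query stays cheap in tokens. The documents below are placeholders, not the dashboard's actual content.

```python
# Hedged sketch of embedding-based retrieval to keep prompts small.
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Excess heat observations in Pd-D electrolytic cells ...",
    "Ni-H gas-loading experiments and calorimetry notes ...",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = model.encode(documents, normalize_embeddings=True)

def top_k(query: str, k: int = 2) -> list[str]:
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q                      # cosine similarity (vectors are normalized)
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

print(top_k("What did the nickel-hydrogen experiments show?"))
```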

    Here's an interesting story...😅
    I was having a conversation recently with someone I admire in the field, who asked me if I had publicly released all the data I gathered for the ICCF25 research paper I did. I said everything important, excluding raw datasets and failed LLMs.

    It made me curious to go back and review all my published data, which I admit was a lot. 😅
    I then realized that I had forgotten all of the theoretical datasets and some of the equations I used in the paper for simulation models.
    This must be why it is so important to have people review and question a person's work.

    Very thankful someone took interest in the work to make me review it again.
    If you wish to review the datasets and use them in your own projects, they live here:
    https://github.com/ConsciousEn…NR/tree/main/LENR_ARA_GPT
    and

    https://huggingface.co/ConsciousEnergies/LENR_ARA/tree/main

    Keep in mind these are preliminary findings and text-based theoretical models.
    Compiling rational mathematical models will take a lot more effort and time doing it by myself.
    If you're interested in collaborating, please don't hesitate to reach out.


    Hope everyone who finds this message is well.

    Warm regards,
    Diadon