It should be relatively easy to create your own chatbot using an open-source model run locally.
You will just need a dedicated GPU with at least 6 GB of VRAM and maybe 20 GB of storage. The challenge is in serving an endpoint for a WebUI, something I am actively working to sort out.
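To make the endpoint problem concrete, here is a minimal sketch of the kind of OpenAI-style chat endpoint most WebUIs expect to talk to, using only the Python standard library. The generate() function is a placeholder, not a real model call; in practice you would swap in bindings for your locally run model. All names and the route shape here are illustrative assumptions, not a specific vendor's API.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def generate(messages):
    # Placeholder: a real server would run the local model here.
    return "stub reply to: " + messages[-1]["content"]

class ChatHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Most WebUIs speak the OpenAI-compatible chat route.
        if self.path != "/v1/chat/completions":
            self.send_error(404)
            return
        length = int(self.headers["Content-Length"])
        body = json.loads(self.rfile.read(length))
        reply = generate(body.get("messages", []))
        out = json.dumps({
            "choices": [{"message": {"role": "assistant", "content": reply}}]
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(out)))
        self.end_headers()
        self.wfile.write(out)

# To serve (blocks forever):
# HTTPServer(("127.0.0.1", 8000), ChatHandler).serve_forever()
```

Once something like this is running, you would point the WebUI's API base URL at it.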
Well, if you figure out how to do that, and you want a window to your bot at LENR-CANR.org, let me know.
I spent a few weeks converting the entire LENR-CANR.org library from Acrobat to a format the ChatBot can use. The bot is supposed to accept Acrobat files, but it does not. There are various rules; for example, paragraphs should not be too long, or the bot loses track. I converted the files to ASCII and wrote a program to reformat them to fit the parameters the vendor suggested. The files are here if you want to download them:
https://lenr-canr.org/Collections/ChatBotFiles.zip
This is out of date. I added ~50 papers after converting these. If you want to use them, I can convert the recent ones and add them to this batch.
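The reformatting step described above can be sketched roughly like this: take plain-text paragraphs and break any long one into pieces under a length limit, splitting at sentence boundaries where possible. The 1,000-character limit is an assumption for illustration; the vendor's actual parameters are not stated here, and this is not the program used on the library.

```python
def split_paragraph(text, limit=1000):
    """Break one paragraph into pieces no longer than `limit` characters,
    splitting at sentence boundaries. (A single sentence longer than the
    limit is left as one piece; a real converter would handle that too.)"""
    pieces, current = [], ""
    for sentence in text.replace("\n", " ").split(". "):
        sentence = sentence.strip()
        if not sentence:
            continue
        if not sentence.endswith("."):
            sentence += "."  # restore the period consumed by split()
        if current and len(current) + len(sentence) + 1 > limit:
            pieces.append(current)
            current = sentence
        else:
            current = (current + " " + sentence).strip()
    if current:
        pieces.append(current)
    return pieces

def reformat(raw, limit=1000):
    """Re-chunk a whole document whose paragraphs are blank-line separated."""
    out = []
    for para in raw.split("\n\n"):
        out.extend(split_paragraph(para, limit))
    return "\n\n".join(out)
```

Running the converted ASCII files through something like reformat() would yield paragraphs short enough for the bot to track.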