Paul Sernine
@lupin
#asknostr
Does anybody have experience with #Ollama and its myriad of models? I'm looking to run an #AI #chatbot locally and train it with additional data. What model(s) would you recommend? Do you have any general suggestions regarding my aim? Thanks in advance for your replies & help 🙏🏼
41โค๏ธ1๐Ÿค™1
Jake Woodhouse · 5w
Pls share what you find out! Would love to build something similar.
cyber striker · 5w
"I help people recover lost Bitcoin using ethical hacking techniques and then professionally trade those assets to help rebuild their financial position."
0x80085 🆕 · 5w
Depends on your PC setup and what it can handle. AFAIK you can run any model with Ollama. I'd suggest trying a lightweight one like Phi and going from there (see the sketch below). I can't help with the data part though; that depends on what you want to achieve.
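For reference, a minimal sketch of asking a locally running Ollama server for a completion over its HTTP API, in Python. It assumes the default port 11434 and that a small model such as "phi3" has already been pulled ("ollama pull phi3"); the model name and prompt are just placeholders, swap in whatever your hardware handles.

# Minimal sketch: ask a locally running Ollama server for a completion.
# Assumes Ollama is serving on its default port 11434 and that the
# "phi3" model has already been pulled (e.g. with "ollama pull phi3").
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "phi3",        # placeholder: use whatever model you pulled
        "prompt": "Explain what a local LLM is in one sentence.",
        "stream": False,        # return one JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])  # the generated text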
Ape Mithrandir · 5w
Depends a lot on the machine specs: VRAM, RAM, CPU, GPU, etc. You can check model VRAM usage here: https://apxml.com/tools/vram-calculator Also, people mostly run already-trained models locally, since training a model from scratch requires far too many resources. You can however use tools like OpenWeb...
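As a rough back-of-envelope check before reaching for the calculator: weight memory is roughly parameter count times bytes per weight (which depends on quantization), plus some overhead for the KV cache and runtime. A sketch of that arithmetic, with the 20% overhead factor being a loose assumption:

# Rough, back-of-envelope VRAM estimate for running (not training) a model.
# Real usage varies with context length, KV cache and runtime; the linked
# calculator is the better source. The 1.2 overhead factor is an assumption.
def estimate_vram_gb(n_params_billion, bits_per_weight=4, overhead_factor=1.2):
    bytes_for_weights = n_params_billion * 1e9 * bits_per_weight / 8
    return bytes_for_weights * overhead_factor / 1e9

# Example: a 7B model at 4-bit quantization lands around ~4 GB,
# while the same model at 16-bit needs roughly ~17 GB.
print(round(estimate_vram_gb(7, bits_per_weight=4), 1))   # ~4.2
print(round(estimate_vram_gb(7, bits_per_weight=16), 1))  # ~16.8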
Ernst Haft · 5w
I don't know how experienced you are, but training a model with additional data is gonna cost you time and money. Maybe a RAG system is what you are looking for. If you compare an LLM to a library, a RAG system is a bookshelf with your own documents placed in front of the library. After a prompt, the syst...
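To make the bookshelf idea concrete, here is a minimal RAG sketch against a local Ollama server: embed your own documents, retrieve the chunk most similar to the question, and paste it into the prompt before asking the model. The model names ("nomic-embed-text", "phi3") and the example documents are assumptions, swap in whatever you actually run.

# Minimal RAG sketch: retrieve the most relevant of your own documents
# and feed it to a local model as context. Assumes Ollama on port 11434
# with "nomic-embed-text" and "phi3" pulled (both names are placeholders).
import math
import requests

OLLAMA = "http://localhost:11434"

def embed(text):
    # Get an embedding vector for a piece of text from the local server.
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": "nomic-embed-text", "prompt": text})
    r.raise_for_status()
    return r.json()["embedding"]

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# The "bookshelf": your own documents (toy examples here).
docs = [
    "Our support hotline is open Monday to Friday, 9am to 5pm.",
    "Refunds are processed within 14 days of receiving the returned item.",
]
doc_vecs = [embed(d) for d in docs]

question = "How long do refunds take?"
q_vec = embed(question)
best = max(range(len(docs)), key=lambda i: cosine(q_vec, doc_vecs[i]))

# Stuff the retrieved document into the prompt and ask the model.
prompt = f"Answer using only this context:\n{docs[best]}\n\nQuestion: {question}"
r = requests.post(f"{OLLAMA}/api/generate",
                  json={"model": "phi3", "prompt": prompt, "stream": False})
print(r.json()["response"])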