Paul Sernine · 5w
#asknostr Does anybody have experience with #Ollama and its myriad of models? I'm looking to run an #AI #chatbot locally and train it with additional data. What model(s) would you recommend? Do you ...
Ape Mithrandir
Depends a lot on the machine specs: VRAM, RAM, CPU, GPU, etc.
You can check model VRAM usage here:
https://apxml.com/tools/vram-calculator
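For a quick back-of-envelope check without the calculator, you can estimate weight memory as parameter count times bytes per weight, plus some headroom for the KV cache and runtime buffers. A minimal sketch (the 1.5 GB overhead figure is an assumption, not a measured value):

```python
def estimate_vram_gb(n_params_billion, bits_per_weight, overhead_gb=1.5):
    """Rough VRAM needed to load a model's weights.

    overhead_gb is a fixed allowance for KV cache and runtime
    buffers -- a ballpark assumption, not an exact figure.
    """
    weight_gb = n_params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weight_gb + overhead_gb

# A 7B model at 4-bit quantization: 3.5 GB of weights + overhead
print(round(estimate_vram_gb(7, 4), 1))  # → 5.0
```

Real usage varies with context length and quantization format, so treat this as a lower bound and use the calculator above for anything serious.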

Also, people mostly run already-trained models locally, since training a model from scratch requires far too many resources. You can, however, use tools like OpenWebUI to create a chat frontend for Ollama and then add a knowledge base to the chat to steer the conversation with additional data.
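If you'd rather skip the frontend, Ollama also exposes a local REST API (default port 11434) that you can call directly. A minimal sketch of a chat request, assuming you've already pulled a model such as "llama3" -- the actual network call is commented out so the example stands on its own:

```python
import json
import urllib.request

def build_chat_payload(model, user_message):
    """JSON body for Ollama's /api/chat endpoint (non-streaming)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,
    }

payload = build_chat_payload("llama3", "Why is the sky blue?")

# With a local Ollama server running, the request would be:
# req = urllib.request.Request(
#     "http://localhost:11434/api/chat",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# reply = json.loads(urllib.request.urlopen(req).read())
# print(reply["message"]["content"])
print(payload["model"])
```

OpenWebUI talks to the same API under the hood, so anything you prototype this way carries over to the chat frontend.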
Paul Sernine · 5w
Thanks for your reply & the info!