HoloKat
@holokat
Feb 18, 2024
is Ollama CLI only? they don't have a desktop UI?
nostrich
· 107w
Correct. Very easy to use though. https://ollama.com/blog/run-llama2-uncensored-locally but tl;dr: `ollama run llama2-uncensored`
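Beyond the interactive `ollama run` shown above, Ollama also serves a local REST API (by default on `localhost:11434`), which is what the GUI front-ends mentioned in this thread talk to. A minimal sketch of the request body for the `/api/generate` endpoint — the helper only builds the JSON and does not assume a running server:

```python
import json

# Ollama's default local endpoint (per Ollama's API docs)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_payload(model: str, prompt: str, stream: bool = False) -> str:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

payload = build_generate_payload("llama2-uncensored", "Why is the sky blue?")
print(payload)
```

With `ollama serve` running, POSTing that payload to `OLLAMA_URL` (via `curl` or `urllib.request`) returns the completion as JSON.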
reya
· 107w
You can use https://ollama-gui.vercel.app/
florian
· 107w
https://lmstudio.ai/ is a good alternative with UI to use different models locally.
captjack 🏴☠️✨💜
· 107w
python cli
stupleb
· 107w
https://image.nostr.build/a50ae6ebae03f4820cd10cd164fd10798da9316ec16c916f6cea897f9fb53749.jpg
4nt1m05h
· 107w
At least there’s a nice TUI: https://github.com/ggozad/oterm
talvasconcelos
· 107w
i think they have a UI plugin, and you can run a Gradio interface
Tijl
· 107w
https://github.com/open-webui/open-webui