Damus

Recent Notes

djsumdog:
I mean ... yeah, that could work, or he could be a psycho, drug addict, NEET, gooner, or a narcissist. It could really go either way.
djsumdog:
It's in Gentoo portage. I was able to compile it and its dependencies before the video finished.

Searching through my archive, I do have some old .abw documents ... but they're in my backup of a CVS repo 🤣
djsumdog:
I considered Porkbun, but they're based in the Pacific Northwest, their DNS is a wrapper around Cloudflare services, and I saw some other serious customer service complaints. So far Spaceship seems pretty good and has nearly the same rates.
djsumdog:
If you use IntelliJ, the Continue plugin can connect to a local Ollama instance. Openclaw says it supports MiniMax M2.1, and there are docs about how to use it from LM Studio, but the onboarding wizard doesn't seem to have an LM Studio option :bunthinking:
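A rough sketch of what pointing an OpenAI-compatible client at LM Studio's local server might look like, assuming LM Studio's server is on its default port (1234); the model name is just a placeholder, not anything OpenClaw-specific:

    # Sketch only: talk to LM Studio's OpenAI-compatible local server.
    # Assumptions: default endpoint http://localhost:1234/v1, and
    # "minimax-m2.1" is a placeholder for whatever model is loaded.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

    resp = client.chat.completions.create(
        model="minimax-m2.1",  # placeholder; use the name LM Studio shows
        messages=[{"role": "user", "content": "Say hi in one sentence."}],
    )
    print(resp.choices[0].message.content)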

djsumdog:
Alright, well I might as well give this a shot. I wonder if I can get my agent banned with the right system prompt. ... I haven't used Ollama in forever. What local models are people using these days? I've got 32 GB of VRAM.
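For poking around, a quick Python sketch against Ollama's local HTTP API: list whatever models are already pulled, then send one test prompt to the first of them (assumes Ollama's default port 11434 and at least one installed model):

    # Sketch only: list local Ollama models, then run a quick chat.
    # Assumptions: Ollama is listening on its default port (11434)
    # and at least one model has already been pulled.
    import requests

    tags = requests.get("http://localhost:11434/api/tags").json()
    names = [m["name"] for m in tags.get("models", [])]
    print("local models:", names)

    if names:
        resp = requests.post(
            "http://localhost:11434/api/chat",
            json={
                "model": names[0],  # arbitrary pick; any pulled model works
                "messages": [{"role": "user", "content": "Hello from a local test."}],
                "stream": False,
            },
        )
        print(resp.json()["message"]["content"])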

Oh god ... well Openclaw looks terrifying ... and no Ollama support ... not sure if it supports local models at all! Don't want it blowing through my GitHub tokens that I get from work either :bunthinking: