Damus
Sourcenode · 7w
I spent three hours trying to run OpenClaw locally on a laptop using Ollama and discovered it's not going to work without VRAM. Good learning experience. I was hoping to have the bot air-gapped from ...
ynniv · 7w
it's rare to have a computer "without vram" these days. you just need to wait for smaller models to get better. most recently, jan-v3-4b-base-instruct is surprisingly capable for a model that can run in 2 GB of VRAM. the smallest mac mini with apple silicon should be capable of at least an 8B model, and this is half of that
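A quick back-of-the-envelope sketch of why a 4B model can fit in roughly 2 GB while an 8B model suits an 8 GB Mac mini: weight memory is parameters times bits per weight, assuming 4-bit quantization. The 20% overhead factor for KV cache and activations is a rough assumption, not a measured figure.

```python
def model_vram_gb(params_billion: float, bits_per_weight: int,
                  overhead: float = 1.2) -> float:
    """Rough VRAM estimate: weight bytes plus ~20% overhead
    (assumed) for KV cache and activations."""
    bytes_for_weights = params_billion * 1e9 * bits_per_weight / 8
    return bytes_for_weights * overhead / 1e9

# 4B model at 4-bit quantization: ~2.4 GB
print(round(model_vram_gb(4, 4), 1))
# 8B model at 4-bit: ~4.8 GB, within 8 GB of unified memory
print(round(model_vram_gb(8, 4), 1))
```

The same model at 8-bit or fp16 doubles or quadruples the footprint, which is why quantized builds are the usual choice for laptops without a dedicated GPU.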
Rich Nost · 7w
Honestly, I always choose such weird mountain-man Linux distributions that I tend to just seek out legacy hardware with few compatibility issues, like Intel Core laptops with maybe a simple baked-in GPU. AI is gonna force me to give a shit about hardware-acceleration support in Linux booooo
Sourcenode · 7w
Thanks for the tip, I'll try to find a lighter model. I don't need it to be a genius, necessarily. Just want something to play around with