Sourcenode
· 10w
I spent three hours trying to run OpenClaw locally on a laptop using Ollama and discovered it's not going to work without a GPU with enough VRAM.
Good learning experience. I was hoping to have the bot air-gapped from ...
Running it in a VM is going to require some sort of GPU passthrough. Every generation of Nvidia GPUs has a different way of doing it, but it can be done; just beware of outdated documentation...
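For anyone curious what GPU passthrough looks like on a Linux/KVM host, here's a rough sketch of the common VFIO approach. This assumes an Intel CPU with VT-d and a libvirt/KVM setup; the PCI device IDs below are placeholders you'd replace with your own card's IDs, and the exact steps do vary by GPU generation and distro:

```shell
# 1. Enable the IOMMU at boot (AMD hosts use amd_iommu=on instead).
#    Add to GRUB_CMDLINE_LINUX in /etc/default/grub, then update-grub and reboot:
#      intel_iommu=on iommu=pt

# 2. Find the GPU and its audio function, noting the [vendor:device] IDs:
lspci -nn | grep -i nvidia
#    e.g. 01:00.0 VGA compatible controller [0300]: NVIDIA ... [10de:XXXX]
#         01:00.1 Audio device [0403]: NVIDIA ... [10de:YYYY]

# 3. Bind both functions to vfio-pci early, before the nvidia/nouveau
#    drivers grab them (IDs below are placeholders):
cat <<'EOF' | sudo tee /etc/modprobe.d/vfio.conf
options vfio-pci ids=10de:XXXX,10de:YYYY
softdep nvidia pre: vfio-pci
softdep nouveau pre: vfio-pci
EOF
sudo update-initramfs -u   # regenerate initramfs so the binding applies at boot

# 4. After a reboot, confirm the GPU now uses vfio-pci, then attach it
#    to the VM as a PCI host device (e.g. via virt-manager or virsh).
lspci -nnk -d 10de: | grep "Kernel driver in use"
```

Both PCI functions of the card have to be passed through together since they share an IOMMU group on most boards; that's one of the details the outdated guides tend to get wrong.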