Damus
OKIN · 2w
I’ve run local LLMs using Ollama on a 2015 Dell laptop with only 8GB of RAM too… it was slow but got the job done & my information never left my local environment, so I was happy as a pig in shit 😅 Always try to break things with the hardware you have before buying hardware you don’t have. Tha...
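For anyone curious, a minimal sketch of that kind of setup — assuming Ollama is already installed, and using a small model tag (`llama3.2:1b` here is just an example of something that fits in 8GB of RAM):

```shell
# Pull a small model's weights to local disk (one-time download).
ollama pull llama3.2:1b

# Run a prompt entirely on the local machine; no data leaves it.
ollama run llama3.2:1b "Summarize why local inference matters for privacy."
```

On older hardware, smaller quantized models are the usual trade-off: slower and less capable, but everything stays local.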