OKIN (@OKIN)
If you’re using Ollama to run local models, especially with your agents, do this to save yourself endless headaches 🧠:

Step 1. Open the Ollama app on your desktop

Step 2. Go to Settings

Step 3. Increase the Context Length so your local LLMs can actually remember your conversations. Ollama's default context window is small, and anything beyond it gets silently truncated, which is why agents seem to "forget" earlier turns.
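If you'd rather script this than click through the desktop Settings, the context window can also be set per request through Ollama's HTTP API via the `num_ctx` option. A minimal sketch, assuming an Ollama server on the default port 11434; the model name and window size here are just examples:

```python
import json
import urllib.request


def build_generate_request(model: str, prompt: str, num_ctx: int) -> dict:
    # Ollama's /api/generate endpoint accepts an "options" object;
    # "num_ctx" sets the context window (in tokens) for this request.
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {"num_ctx": num_ctx},
    }


def generate(payload: dict, host: str = "http://localhost:11434") -> dict:
    # Send the request to a locally running Ollama server and
    # return the parsed JSON response.
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Example payload: an 8192-token context window for a single request.
payload = build_generate_request("llama3.2", "Summarize our chat so far.", 8192)
```

Setting `num_ctx` per request only affects that call; the desktop Settings (or a `PARAMETER num_ctx` line in a Modelfile) changes the default instead.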
Diego Valley · 3w
What size local model can you run?