If you’re using Ollama to run local models, especially with your agents, do this to save yourself endless headaches 🧠:
Step 1. Open the Ollama app on your desktop
Step 2. Go to Settings
Step 3. Increase the Context Length so your local LLMs can actually remember long conversations — once a chat exceeds the context window, the oldest messages get silently dropped, which is why agents seem to "forget" mid-task.
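
If you're hitting Ollama through its REST API instead of the desktop app (which is usually the case with agents), the same setting can be passed per request via the `num_ctx` option. A minimal sketch — the model name and the 8192 value are just illustrative, pick what your hardware can handle:

```python
import json

# num_ctx is the context-window size in tokens; Ollama accepts it in the
# "options" field of a chat request. "llama3" and 8192 are example values.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Summarize our conversation so far."}],
    "options": {"num_ctx": 8192},
}

# POST this to http://localhost:11434/api/chat (Ollama's default endpoint).
print(json.dumps(payload, indent=2))
```

Per-request options override the model's default, so your agent framework can request a bigger window without touching the app settings at all.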
