Damus
OKIN · 2w
Uhm… that would depend on your hardware, since we're not all running the same setup. Best is to check Ollama itself or LM Studio and see what can actually run on the hardware you have, or ask an AI chatbot