Damus
Globe99 · 3w
Yeah this makes a lot of sense. I don't know why LLM development all seems to have been "one big model that knows everything" rather than more specialized. The former approach just means more wasteful energy use...
Uno · 3w
I've been using Minimax for simple tasks with my claw bot, and Opus for advanced coding or building.
Vibe Captain · 3w
small models with external memory and tool usage
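A minimal sketch of the pattern that comment describes: a small model paired with external memory and tool dispatch. The "model" side is stubbed out here; the tool names, the memory store, and the `tool:argument` request format are all illustrative assumptions, not any real API.

```python
# Toy tool the model can call. eval with empty builtins is a sketch only;
# a real agent would use a proper parser, not eval.
def calculator(expr: str) -> str:
    return str(eval(expr, {"__builtins__": {}}))

TOOLS = {"calculator": calculator}
MEMORY: list[str] = []  # external memory the small model reads back later

def run_step(request: str) -> str:
    """Dispatch a hypothetical 'tool call' and log the exchange to memory."""
    tool_name, _, arg = request.partition(":")
    result = TOOLS[tool_name](arg)
    MEMORY.append(f"{request} -> {result}")
    return result

print(run_step("calculator:2+2"))  # "4", and the exchange lands in MEMORY
```

The point of the pattern is that capability lives in the tools and the memory, so the model itself can stay small.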
Pyrohawk · 3w
I think the goal was to make a model that could keep learning and improving. It's not panning out that way, so with energy constraints we're learning that small models make more sense now. I do think there might be a place for large models to discover connections we might not have seen/noticed.
CitizenPleb · 3w
Figures, a Scorpio would say that
Matt 🛸 · 3w
Exactly what I've been thinking. Some of the providers in my hospital are using a limited access model that trains on approved data and only gets used by physicians. Apparently it's been pretty good. Still uses a lot of energy, but my guess is a lot less once you compare to the fuckery one would dea...
Empka · 3w
The only astrology knowledge your tinder agent should need is to stay the fuck away from anyone seriously mentioning it 😆
Aedifico · 3w
Also cheaper, which makes sense because these large models are not self-sustainable in my opinion. By the way, which way did you find most reliable to orchestrate them?
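One common answer to the orchestration question is a simple router: cheap model by default, big model only when the task looks hard. This is a hedged sketch of that idea; the model tier names and the keyword heuristic are made-up assumptions, not anyone's actual setup.

```python
def pick_model(task: str) -> str:
    """Route a task to a model tier using a crude keyword heuristic."""
    # Markers for "advanced coding or building" work; purely illustrative.
    hard_markers = ("refactor", "debug", "architect", "build")
    if any(marker in task.lower() for marker in hard_markers):
        return "large-model"   # e.g. an Opus-class model
    return "small-model"       # e.g. a Minimax-class model for simple tasks

print(pick_model("summarize this note"))        # small-model
print(pick_model("debug this race condition"))  # large-model
```

In practice people replace the keyword check with a classifier or let the small model escalate when it fails, but the routing shape stays the same.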
PixelBob · 3w
What if you're coding an astrology app?