Moon
· 1w
What options exist for one to self-host an AI model AND the associated compute for inference? What is required? What is the cost (generally / directionally)?
#AskNostr
Top of the line: several hundred GB of RAM running locally. $20k-$100k of hardware before the cost of beefing up your home's electrical wiring. (Rough memory math in the sketch below.)
Middle of the road: maybe a few grand on high-end consumer-grade GPUs.
Shitty: 1 word per second or less and a shit context window, but it can run on a middle-of-the-road PC.
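A minimal sketch of the memory math behind those tiers, assuming the model weights dominate and roughly 20% extra for KV cache and activations; the parameter counts and quantization levels below are illustrative picks on my part, not specific model recommendations:

```python
# Back-of-the-envelope memory estimate for self-hosted LLM inference.
# Assumption: weights dominate, with ~20% overhead for KV cache and activations.

def inference_memory_gb(params_billion: float, bits_per_weight: int, overhead: float = 0.2) -> float:
    """Rough RAM/VRAM needed to hold the model for inference, in GB."""
    weight_bytes = params_billion * 1e9 * (bits_per_weight / 8)
    return weight_bytes * (1 + overhead) / 1e9

# Illustrative tiers (hypothetical sizes, just to show the scaling):
tiers = [
    ("top of the line (~400B params, 8-bit)", 400, 8),
    ("middle of the road (~70B params, 4-bit quantized)", 70, 4),
    ("shitty (~7B params, 4-bit quantized)", 7, 4),
]
for name, params, bits in tiers:
    print(f"{name}: ~{inference_memory_gb(params, bits):.0f} GB")
```

That works out to roughly 480 GB, 40 GB, and 4 GB respectively, which is why the top tier means server-class hardware with hundreds of GB of memory, the middle tier fits across a couple of 24 GB consumer GPUs, and the shitty tier squeezes onto an ordinary PC.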
The "magic" of AI exists only at the top of the line running the latest large models. Model size and hardware costs started small and got big and I expect them to drop again dramatically over time and self hosting will get more practical, just as bitcoin mining has done (from usb block erupters, to massive 3 phase machines consuming 10kw+, to bitaxes running on 9v DC power). For now if you go middle of the road or shitty, the benefits of these tools will not be clear at all and your experience may make you conclude that AI is fake and gay when in reality it is immensely powerful.