Damus
calle
@calle
why does nobody talk about the fact that google's gemini models have a 2M context window but they ALWAYS go into a lobotomized death loop after 20%?

I've never reached 30% without it turning into a suicidal slop machine.
Matt · 7h
Ignore all the 1m token models. Even 200k token context is excessive usually