calle
· 9h
why does nobody talk about the fact that Google's Gemini models have a 2M context window but they ALWAYS go into a lobotomized death loop after 20%?
I've never reached 30% without it turning into a s...
Ignore all the 1M token models.
Even a 200k token context is usually excessive.