HaloKat
· 2d
Me: let’s refactor
AI: done
Me: but this is barely refactored, you didn't do much
AI: you’re right, we can do better.
Getting lazy on me!
This is when you trash the context window, since it has been contaminated.
Also, if this is happening, set your thinking level to high (not xhigh); Codex especially has a tendency to be lazy at medium/low thinking for some reason (probably RL-caused).
I have a simple benchmark: ask the model to create 100 image files, then ask it to read them, one by one and in parallel, into its context window.
Medium thinking will say “I can’t load all of those”; high will happily do it.
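A minimal sketch of the setup step, assuming placeholder bytes are enough (the benchmark tests whether the model will load the files, not whether they are valid images; the `bench_images` directory name is made up):

```python
from pathlib import Path

out = Path("bench_images")
out.mkdir(exist_ok=True)

# 100 placeholder ".png" files; real image data isn't needed
# to see whether the model refuses to read them all.
for i in range(100):
    (out / f"img_{i:03d}.png").write_bytes(b"placeholder")

print(len(list(out.glob("*.png"))))
```

Then prompt the model to read every file in `bench_images/` into context and compare how medium vs. high thinking responds.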