Any logical system is a self-contained box of statements that stack on each other to build structures. The axioms of arithmetic build the structures of numbers; the axioms behind our "cognitive bricks" build more complex idea structures.
AI takes these logical boxes and builds on top of them at blazing speed, which exposes their limits and points of rigidity. Any tower has a height limit set by its materials. If you want to build taller and more complex structures, you need to incorporate a variety of building materials (a variety of cognitive frameworks).
AI does the same: it is building systems at a rapid rate. At the most basic, individual level, offloading our cognitive capacity increases rigidity by atrophying our cognition. At the collective level, massively offloading cognition will build a rigid and fragile company or organization.
This is not to negate the use of AI as a tool. But it is a mistake to assume the effect stays small: many individuals all leaning on a single tool will increasingly place pressure on these points of rigidity, opening up a cascade of consequences.
Where does this come from? Look to Gödel's incompleteness theorems: he demonstrated that any sufficiently powerful, consistent logical system is incomplete, and he did so by building statements inside the system that expose exactly where it becomes rigid, true statements the system can express but cannot prove.
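For the curious, the mechanical heart of Gödel's construction is Gödel numbering: every statement of the system is encoded as a single number, so the system can "talk about" its own statements. Here is a minimal toy sketch in Python (the symbol codes are arbitrary illustrations, not Gödel's exact scheme):

```python
# Toy Gödel numbering: map each symbol of a tiny formal language to a
# small integer code, then encode a whole formula as a product of
# consecutive primes raised to those codes. Because prime factorization
# is unique, the encoding is invertible: a number fully recovers the
# formula, which is what lets statements refer to statements.

SYMBOLS = {"0": 1, "S": 2, "+": 3, "=": 4, "(": 5, ")": 6}

def primes():
    """Yield 2, 3, 5, 7, ... by simple trial division."""
    found = []
    n = 2
    while True:
        if all(n % p for p in found):
            found.append(n)
            yield n
        n += 1

def godel_number(formula):
    """Encode a formula string as p1**c1 * p2**c2 * ... for symbol codes ci."""
    g = 1
    for p, sym in zip(primes(), formula):
        g *= p ** SYMBOLS[sym]
    return g

# The formula "0=0" encodes as 2**1 * 3**4 * 5**1 = 810.
print(godel_number("0=0"))
```

The point of the sketch: once formulas are numbers, "this statement is unprovable" can itself be written as a statement about numbers, and that self-reference is where the system's rigidity shows.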
Fundamentally, this is telling me that AI will use our logic (of all types, whether mathematical or dialectical) against us, placing pressure on society and increasingly demonstrating its fragility points. That's something to sit with while digesting the implications.
I hope this at least makes a seed of sense. The whole idea can't be unpacked in a single note, so I'm pointing at it and inviting further study.
https://www.youtube.com/watch?v=HeQX2HjkcNo