Damus
HODL · 47w
I personally assign a very low probability, less than 1%, that humans create AGI in our lifetime. But a 100% probability that humans will pretend as if they have created AGI in order to further their ...
Cpt. Charisma
There's an easy fix for that. LLMs are just a statistical party trick. They are based on statistical models of what we write. Any time you type 'AGI' or 'AI,' follow it with 'quack.' It will help get the point across and fuck up all their AI quack models. AIs quack won't be able to talk about themselves without quacking.