Damus
Lyn Alden
@LynAlden
When it comes to AI, philosophical people often ask "What will happen to people if they lack work? Will they find it hard to find meaning in such a world of abundance?"

But there is a darker side to the question, which people intuit more than they say aloud.

In all prior technological history, new technologies changed the nature of human work but did not displace the need for human work. The fearful rightly ask: what happens if we make robots, utterly servile, that can outperform the majority of humans at most tasks with lower costs? Suppose they displace 70% or 80% of human labor to such an extent that 70% or 80% of humans cannot find any type of economic work in which they outcompete those bots.

Now, the way I see it, it's a lot harder to replace humans than most expect. Datacenter AI is not the same as mobile AI; it takes a couple more decades of Moore's law to put a datacenter supercomputer into a low-energy local robot, which would otherwise rely on a sketchy and limited-bandwidth connection to a datacenter. And it takes extensive physical design and programming, which is harder than VC bros tend to suppose. And humans are self-repairing for the most part, which is a rather fantastic trait for a robot. A human cell outcompetes all current human technology in terms of complexity. People massively over-index on what robots are capable of within a given timeframe, in my view. We're nowhere near human-level robots for all tasks, even as we're close to them for some tasks.

But, the concept is close enough to be on our radar. We can envision it in a lifetime rather than in fantasy or far-off science fiction.

So back to my prior point, the darker side of the question is to ask how humans will treat other humans if they don't need them for anything. All of our empathetic instincts were developed in a world where we needed each other; needed our tribe. And the difference between the 20% most capable and 20% least capable in a tribe wasn't that huge.

But imagine our technology makes the economic contributions of the bottom 20% irrelevant. And then the next 20%. And then the next 20%, slowly moving up the spectrum.

What people fear, often subconsciously rather than being able to articulate the full idea, is that humanity will reach a point where robots can replace many people in any economic sense; those people can do nothing that economically outcompetes a bot, and can earn an income only through charity.

And specifically, they wonder what happens during the phase when this occurs to those who own capital versus those who rely on their labor within their lifetimes. Scarce capital remains valuable for a period of time, so long as it can be held legally or otherwise, while labor becomes demonetized within that period. And as time progresses, weak holders of capital who spend more than they earn also diminish, as do many imperfect forms of capital. It might even be the case that those who own the robots are themselves economically unnecessary, but at least they might own the codes that control them.

Thus, people ultimately fear extinction, or being collected into non-economic open-air prisons and given diminishing scraps, resulting in a slow extinction. And they fear it not from the robots themselves, but from the minority of humans who wield the robots.
royster⚡️ · 53w
We all get spaceships and can go zoom off and colonize the galaxy. Or people go into the pod. Or the robots never learn how to vacuum in which case my toddlers will keep them busy for a long time
a1 · 53w
Just as bitcoin reprices goods, ai will reprice human services. Things we take for granted now like manual skilled labor and care taking will fetch a premium in the AI economy. What seemed to be useless skills will soon be highly desired. Besides, there is a human factor - humans may not want to int...
grip and rip · 53w
Even if that eventuality occurs people will still pursue their hobbies the way they pursue everything in man’s search for meaning - at least Viktor Frankl might say so
Sina 21st · 53w
At that point robots will become very cheap. Even the poor will be able to afford a few robots who do the work and create value. Then the useless humans can just own robots and chill
DrD · 53w
Interesting take. It seems to me that everyone is on the never-ending treadmill of life, laboring to make ends meet in a system that is stacked against them. While there’s a risk that people lose their way when AI takes away their mundane tasks, my hope is that instead, AI allows people to exit ...
Grand · 52w
Well how would humans treat other humans if they believe they aren't needed? Look to Israel and the persecution of the Palestinian people. That's your answer. Those not needed are dehumanised - where the violent believe they should be they wipe them out and take their resources, where there i...
foobarbazqux · 52w
Again, this was discussed in Moravec's 1988 book "Mind Children." https://en.wikipedia.org/wiki/Hans_Moravec
Jonathan · 52w
The greatest of human fears is not to be noticed at all, or to not matter. Even slavery is tolerable if living with a purpose (to become free or to save the life of a loved one, etc) While it's possible for a small group of people to hold all the money, it's not possible for them to hold all the ...
Kiran Kaur · 50w
Think of what happens to animals in a world ruled by humans. Some are domesticated, bred to serve human needs. Others are left to the wild, dwindling in number as their habitats shrink. And some, deemed inconvenient or dangerous, are simply eradicated. That’s the fate I am thinking of from an evol...