When it comes to AI, philosophical people often ask "What will happen to people if they lack work? Will they find it hard to find meaning in such a world of abundance?"
But there is a darker side to the question, which people intuit more than they say aloud.
In all prior technological history, new technologies changed the nature of human work but did not displace the need for human work. The fearful rightly ask: what happens if we make robots, utterly servile, that can outperform the majority of humans at most tasks at lower cost? Suppose they displace 70% or 80% of human labor to such an extent that 70% or 80% of humans cannot find any other economic work that competes with those bots.
Now, the way I see it, it's a lot harder to replace humans than most expect. Datacenter AI is not the same as mobile AI; it takes a couple more decades of Moore's law to put a datacenter supercomputer into a low-energy local robot, or it would otherwise rely on a sketchy and limited-bandwidth connection to a datacenter. And it takes extensive physical design and programming, which is harder than VC bros tend to suppose. And humans are self-repairing for the most part, which would be a rather fantastic trait for a robot to match. A human cell outcompetes all current human technology in terms of complexity. People massively overestimate what robots will be capable of within a given timeframe, in my view. We're nowhere near human-level robots for all tasks, even as we're close to them for some tasks.
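The "couple more decades" estimate can be sanity-checked with a rough back-of-envelope sketch. All of the numbers below are illustrative assumptions (a ~10 MW datacenter power budget, a ~100 W onboard robot budget, one efficiency doubling every ~2 years), not measured figures:

```python
import math

def years_to_close_gap(datacenter_watts, robot_watts, years_per_doubling):
    """Years of Moore's-law-style efficiency doublings needed before a
    robot's power budget can host today's datacenter-scale compute."""
    doublings = math.log2(datacenter_watts / robot_watts)
    return doublings * years_per_doubling

# Illustrative assumptions only: 10 MW datacenter vs. 100 W onboard budget,
# efficiency doubling every 2 years.
print(round(years_to_close_gap(10e6, 100, 2), 1))  # ~33 years with these assumptions
```

Change any of the three inputs and the answer shifts, but with plausible values it lands in the "few decades" range the post describes.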
But, the concept is close enough to be on our radar. We can envision it in a lifetime rather than in fantasy or far-off science fiction.
So back to my prior point, the darker side of the question is to ask how humans will treat other humans if they don't need them for anything. All of our empathetic instincts were developed in a world where we needed each other; needed our tribe. And the difference between the 20% most capable and 20% least capable in a tribe wasn't that huge.
But imagine our technology makes the bottom 20% of economic contributors irrelevant. And then the next 20%. And then the next 20%, slowly moving up the spectrum.
What people fear, often subconsciously rather than being able to articulate the full idea, is that humanity will reach a point where robots can replace many people in every economic sense; those people can do nothing that economically outcompetes a bot, and can earn no income other than through charity.
And specifically, they wonder what happens during the phase when this occurs to those who own capital vs. those who rely on their labor within their lifetimes. Scarce capital remains valuable for a period of time, so long as it can be held legally or otherwise, while labor becomes demonetized within that period. And as time progresses, weak holders of capital, who spend more than their capital returns, also diminish now that they can no longer supplement it with labor, and many imperfect forms of capital decay. It might even be the case that those who own the robots are themselves economically outmatched by them, but at least they might own the codes that control them.
Thus, people ultimately fear extinction, or being collected into non-economic open-air prisons and given diminishing scraps, resulting in a slow extinction. And they fear it not from the robots themselves, but from the minority of humans who wield the robots.
Or 1% of humans get spaceships and zoom off, and the other 99% don't.
The future for those who hold capital, or for the future of 8+ billion people in general?
The robots can just keep making spaceships. I’m banking on at least one benevolent 1%-er who will hook the rest of humanity up. Also, most humans will be Amish by then so they’ll probably be fine just gardening.
Very well put and something I definitely worry about.
One thing I’ve pondered is that if the bots end up in the hands of the few, maybe those without the bots will go and create their own settlements of some kind, almost going backwards in time to work in small, barter-like communities.
The question would come down to how they’d get the resources to do so. If desperate enough, perhaps they’d resort to stealing 🤷
People think AI will have a huge impact… but Bitcoin will have an even bigger one! Much bigger!
The argument raises important concerns about technological unemployment and social consequences but has several issues:
1. Technological Determinism Without Sociopolitical Context
The argument assumes that technological capabilities directly determine social and economic outcomes, neglecting how policy, laws, and social norms shape technology’s impact. Historically, labor displacement has led to new forms of employment, often because of societal adjustments (e.g., welfare systems, universal basic income proposals, or labor rights movements). It overlooks how governments and societies could adapt through redistribution or new economic models.
2. False Equivalence Between Economic Worth and Human Worth
The argument conflates economic productivity with human value. While it acknowledges a darker side of human behavior (loss of empathy when utility disappears), it implies that a world where many cannot “outperform” robots economically naturally leads to exclusion or extinction. Societies have often supported non-economically productive members (children, elderly, disabled individuals), which contradicts the assumption that lack of economic utility leads to dehumanization or elimination.
3. Linear and Singular View of Progress
The narrative suggests a linear progression: first the bottom 20%, then the next 20%, and so forth, as if economic value is a fixed spectrum and as if job displacement will be uniform and inevitable. In reality, new industries often emerge unpredictably, and technology can augment rather than replace human capabilities. Additionally, human labor markets are not purely meritocratic; social, political, and cultural factors influence value and employment.
4. Overestimation of Technology’s Autonomy, Underestimation of Human Creativity
While it criticizes technological hype (e.g., mobile AI limitations), it paradoxically falls into the same trap by assuming that once robots are capable, the consequences are inevitable. Historically, technological advances have often complemented human labor rather than replacing it wholesale (e.g., industrial revolution, information age). Human creativity, empathy, and adaptability often create new niches of value.
5. Oversimplification of Capital and Ownership Dynamics
The discussion of capital holders dominating through robot ownership oversimplifies economic power structures. Technologies often become decentralized over time, and monopolistic control can be disrupted by innovation, regulation, or collective action. Additionally, open-source movements, digital commons, and cooperative ownership models challenge the narrative of capital consolidation.
6. Fear-Based Speculation Without Exploring Positive Outcomes
The argument frames the future in a dystopian manner, emphasizing conflict and extinction but not considering counter-scenarios where technology leads to more leisure, better quality of life, or communal wealth-sharing mechanisms (e.g., universal basic income, worker co-ops using advanced tools, or post-scarcity economies). Fear is presented as the dominant driver of human action, but hope, empathy, and cooperation have historically shaped major social advancements.
Conclusion:
The core flaw is that the argument assumes a deterministic, zero-sum relationship between technology and human value, underemphasizing human adaptability, social policy, and the capacity for collective solutions. It taps into genuine anxieties but falls short of a complete, nuanced exploration of technological, social, and economic evolution.
Ok so these are good points but this reads like an answer from ChatGPT lol
You’re brilliant Lyn. Always enjoy your content ⚡️🙌
Thanks, chatgpt
They'll become space colonists
Isn’t it kinda wild how after a while you get a feel for it and it’s easy to pick up on the AI responses? They’re often formulaic.
If this is the stuff you think about daily,
I can’t wait to read your sci-fi!
I share this fear and I agree it’s often unarticulated. One somewhat optimistic counterpoint: humans need each other simply for good health and happiness. Thousands of years ago we needed tribes to survive, now we need tribes to thrive. If we thread the needle and find ourselves in a post-scarcity world where AI/robots produce our food and basic needs, we’ll still need and want each other. I’ll still grow my own food and share it with friends because it makes me happy. But it’s a hell of a delicate path to tread and unfortunately, a dystopian scenario such as yours seems one of the more likely outcomes.
At that point robots will become very cheap
Even the poor will be able to afford a few robots that do the work and create value
Then the useless humans can just own robots and chill
For sure. This commenter didn’t even bother trying to massage the output into a coherent human response. “The argument raises important concerns…” 🙃
Humans that want to work (and most do) will find a productive means to serve each other. We have moved away from specialized production and into mass production. Humans can go back to locally sourced goods and specialized forms of trade and IT work.
There is no end to human wants, and so there is no end to human labour.
Average people today enjoy luxuries reserved for elites in times gone by.
In the future when average people enjoy the luxuries of today's elites, what will happen? Will they all be satisfied and living leisurely lives?
No. They will come up with something new to want.
Why are you not on @Stacker News?
Your posts are too long to consume on nostr... SN seems a better platform...
IMHO
Tl;dr
That was the joke…
Also, the fact that you dismiss something purely for not being human makes my point further. Humans will always value being heard, seen, and worked on by other humans, at least for things like connection. In fact, look at pets: people choose pretty useless pets from a utilitarian point of view. If it’s a few decades until robots are as competent as humans, it’ll be many more until we care about them like we do about living creatures. A big part of why humans are irreplaceable is that they have suffered, not because they no longer have to (at least through physical labour).
#AI is never going to be cheaper than humans .. we still have 2 billion people ready to work for less than a dollar a day .. and we run on 100 watts ..
#AI doesn't even replace the better paid workers .. because it is 1) dumb 2) unpredictable .. you can't fix hallucinations because no one knows how the model actually works .. fix one thing, another breaks ..
The fear hype about AI is just to make it a buzzword ..which big tech seems to have succeeded ..
That said - AI is pretty cool tool for otherwise non productive and not critical apps .. such as social media :-) .. #nostr should adopt #AI and beat #Meta ..
TLDR: If Elon Musk’s grandparents kept the laborers on apartheid reservations and treated them as second class citizens when they had all the economic and political power…
what’s to stop Elon Musk’s grandchildren from doing far worse to our grandchildren when AI gives them all the political and economic power AND all the labor?
Future generations may be dependent on the altruism and egalitarianism of the billionaire class, which currently sounds like a grim prospect
Burden of proof has to be on the luddites.
And sex
That was the joke (although yeah there are good points too). So cool (and ironic) how she’s presenting these people so afraid of being replaced and then when I use AI to create an answer, even when it’s good, people don’t like it. Hope someone appreciates that!
Hi Lyn,
I have a difficult time resonating with this line of thinking. Firstly, I don’t think social cooperation purely stems from economic gain. I do not make peace with my family because it benefits me economically. There is a deeper sense of unified identity, a.k.a love, that exits in human interaction amongst ourselves and broader environment.
More importantly, I find the worry of AI taking jobs hard to grasp. I feel it fundamentally ignores some basic economic principles. As a young man, I enjoyed the analogy of Robinson Crusoe while reading Austrian Economic literature. When he first lands on the island he has nothing, and must fish with his bare hands. Eventually, the accumulation of capital and savings allows him to eventually develop a spear, which saves him more time. This extra time allows for the development of more savings and capital, leading to a net, a fishing boat, etc. Eventually, he is able to invent the ultimate time saver, the robot that hunts the fish for him. Economic progression is the story of human labor developing tools that save us time. It stands to reason that all laborious time will eventually be conducted by our tools. Furthermore, the history of economics shows that this benefits everyone, as the entire planet becomes more abundant with goods and services. Of course, if there is coercion involved, then the allocation of resources is improperly distributed. Most severely, we currently have fiat money which greatly reduces the abundance that everyone could be experiencing at this time if sound money were to be used more broadly. The computer, which includes both AI and Bitcoin, solves the issues of both sides of the coin: AI completes Crusoe’s progression, while Bitcoin ensures that the benefits of this abundance is widespread.
I would end this with a thought experiment; Imagine a world where there were 100% efficient / infinite energy machines. Every person had a kind of 3D Printer, with infinite energy, where they could create any object and supply all the energy needed for life. Each home would have all the food they needed, all the luxuries they needed, all the medical attention they needed, as well as all of the turrets and sentries needed to defend their home. Is this world incredibly peaceful? Or incredibly oppressive? I imagine it as incredibly peaceful. Even the most dark people would be better off making some kind of cloned doll in their 3D printer to torture, rather than risking life and limb to invade a neighbor’s home with all of their defense systems. Everyone would have what they would need, there would be no good reason to commit aggression against another. It would be a world of peace.
Though this world is not possible, a world where we are 99.999999% efficient with energy is. I don’t see that world being very different than the thought experiment above. To me, this is what the world slowly becomes with all of this advancement brought about by the products of the computer.
The only real application of #AI is to help write code faster .. but coding is only 10% of an IT project .. 90% is deciding what to code for!
Besides infinitely more code needs to be written ..
This is why we need to build tech that gives the power of AI, and therefore the means of production, to individuals!
There is another (orthogonal) way: enable humans to reap the benefits of AI directly, and individually:
If everyone has access to robotics and AI, those robots can labor for the profit of each individual. This gives individuals the means of production.
For example, I'm working on a gardening robot right now. Imagine 20 technologies like that, where each is accessible to someone with a normal income. Suddenly everyone is producing goods and services without a job, without laboring themselves except to play the 'manager' role for their robots, i.e. deciding which robot(s) to get and which good(s) to produce.
Obviously this doesn't cover all goods, eg a chip fab like TSMC, etc. But it feels like a future where most individuals remain economic agents instead of passive UBI recipients.
And of course all goods and services enjoy deflationary pricing because robots are doing everything for pennies
In summary, buy bitcoin 😄
That’s wild. Write it!!
You didn’t answer your own question, perhaps read my post from yesterday
Yup, ain't nobody payin no robot to lick their asshole
Regeneration! Regeneration is the one topic in novels that's under explored. There could be all sort of rules, side stories and quests, it could take long and only in special conditions you get revived, etc.

I imagine that eventually there will be fewer humans as we commoditize ourselves as well, and the human market adjusts to the decreased demand for people. We are individually very good at seeking to support and uplift the people around us when we do not lack access to the trinity of food, shelter, and security. The corporate mechanisms we’ve created to drive profit over value, not so much.
Either we’ll get our heads out of our … out of the sand, look around, and stop trying to extract as much production out of people for the least expense, or we’ll have a number of redundant neighbors unable to shift into a new workforce that can’t afford to employ them compared to the AI worker, and we are left with a Demolition Man situation.
Or maybe we’ll all be really cool to each other in the name of brotherhood and harmony and walk together hand in hand to a brighter future.
Personally I was looking forward to a personal spacecraft and a moon base but mate, you do you.
Interesting thought. Humans are adaptable. We've always found new ways to be useless. Besides, who needs 80% of the population anyway? Think of the resource savings. Survival of the fittest, baby. Bring on the robots.
Historically, innovations created more jobs, in different areas.
This time, indeed, may be different because almost any new job type I can think, any way of creating value, AI may be better at it.
Except, I guess, human touch. People may value some things more, just because these things are human-created.
Call me stupid, but tech innovations create more jobs for people. They always have and they always will. No innovation has ever decreased the number of jobs available. AI is just an overhyped bit of translation software, and it has created millions of new jobs for AI engineers.
Wow! This input has produced some high level thread!
Amazing the kind of value you provide for Nostr!
Thank you 🤩
Do we get to choose the spaceship people? I have a list.
I will choose theft or piracy as my means to survive when I am one of the uneconomics. Not going to wait for charity.
no more spoilers please. I haven’t read your book yet
I didn’t realize it was a joke. And to clarify, I like llms a lot and use them everyday. What I was reacting to was what seemed like a low effort comment. So yes, I agree, humans will always appreciate efforts from other humans, even if it’s technically inferior.
when robots can make copies of themselves we'll be really screwed...
"What seemed to be useless skills will soon be highly desired"
Yes, indeed. I mean if we can talk ourselves into valuing Beanie Babies then we can definitely talk ourselves into valuing pretty much any aspect of being human, regardless of how mundane that aspect may seem right now.
This! This stuff is what needs to be trending, very happy to see this in the number one spot.
yes this precisely.
the idea of ai taking all our jobs becomes absolutely ridiculous when you view things holistically. like oh no we're entering a time when all our needs are taken care of for us.
we don't need to plow the earth anymore to earn our food. as long as there is a free trade market it matters not what we're providing.
Interesting take. It seems to me that everyone is on the never-ending treadmill of life, laboring to make ends meet in a system that is stacked against them. While there’s a risk that people lose their way when AI takes away their mundane tasks, my hope is that instead, AI allows people to exit the hamster wheel and focus on more meaningful connections with those around them. Inevitably, those who aren’t willing to adapt and advance their skills will be left behind. However, those with genuine curiosity who are freed to explore their passions should thrive.
I’ll play devil’s advocate. The idea of human usefulness could adapt. If technology can free people from labor, it might allow society to redefine what it means to contribute and find meaning. Economically undervalued tasks such as caregiving, creativity, learning, and community-building could thrive.
Additionally, what if our capacity for empathy isn’t bound by economic interdependence? Humans care beyond utility already in how we treat the elderly, disabled and animals. It’s possible that in a world of abundance human empathy could be decoupled from our self interest and become more of a moral choice.
This maxim will be needed even more that it is now: “A new commandment I give to you, that you love one another: just as I have loved you, you also are to love one another”
Couldn't have said it better 🤝 well done!
How nice that messiah Satoshi's Ark welcomes all who yield 21million. It's as if minority of humans already figured out how to wield you via your economic greeds and needs. Onwards to next adventure, best hope you're chosen!
Good point the total amount of free time is relative to that yes - that’s why I brought up Frankl because his story is about how searching for meaning still happens in the worst situations.
Too close there lady. It might be prudent to examine what passes for entertainment, to guide our way forward.
Since I started homesteading, I realize that "work" is not employment. Work is bringing value to your own life. Example: the wind knocked a shingle loose. I fix it and benefit by not having to replace the roof next year.
Will people find work? YES. If they did not have mortgages denominated in Canadian Dollars, they would just live and work.
I read this and the reread it and my mind is blown. Some very serious points and most I believe would not want to address what you did head on. Most would rather ignore it until it's too late to take serious action to prevent themselves from essentially becoming unable to support themselves and their family in the future.
Hmmm, interesting. Few random ideas:
- I see this as western-centric (at least for now). Youtube offers me videos of eastern workers creating/repairing stuff using 100-year-old technologies. I'm amused that we - the west - are not able to do this. If there is some serious global problem (nuclear war), these guys will survive. We (the west) will not, as we have already forgotten these technologies. Pushing our technology edge even further makes us even more fragile.
- Regarding the AI. robots etc.. I believe in opensource. So in the end, everyone will have such capabilities.
- There are two types of people: makers (creators) and consumers. Consumers will die, creators will survive.
if people lack work they’ll just jam up my ER 😭
That’s the philosophical thing that we’re grappling with, isn’t it? Why do most people instinctively want ‘effort’ from people, even when the result is the same or worse? I wonder if it’s something related to social bonding or tribalism, like how monkeys groom each other. And will there come a time when this isn’t part of our makeup anymore? And how far could that go? Will we have ‘Synthetic Opinions Matter’ movements? Will ‘Artificial’ in AI come to be viewed the way slurs are viewed now?
Thats what i think will happen too. Everything will split into 2 very separate worlds for a while, with hopefully no real reason to clash very badly or very often.
The sex bots will be the nail in the coffin for humanity.
AI is not inherently evil. The issue is that those with money and power are primarily interested in using it to exploit others for financial gain. Tech oligarchs struggle to envision and create a future where everyone thrives because they value money above all else, and therefore, only view human beings as capital.
Your post makes some interesting assumptions about human interaction. Our need for human interaction goes beyond economics. One of the greatest risk factors for dementia is a limited social network. We are not meant to be lone wolves by any stretch. Most people have empathy and do not approach relationships from a purely transactional perspective. If people received UBI and universal healthcare, then no one would care if AI made their livelihoods obsolete. Not working is only hardship because people cannot feed or house themselves without income.
Read The Gene Keys…🧬
And really that's the question!!!
With robots taking over all the work, we'll all be bathing in endless happiness, watching video after video of cute kittens.
Who needs purpose when you have an endless catalog of cats falling off shelves and playing with balls of yarn, right? The meaning of life will be replaced by 'likes' and 'shares' of cat videos.
We'll be a society of eternal viewers, completely satisfied with our superficial and ephemeral pleasures. Utopia, here we come.
been reading it for 11 years now. hi ☺️ which Gene key resonates with you here most?
I was referring to the entire transmission as a helpful lens for Lyn to understand where we are headed as a species and civilisation. Without its radical perspective, it’s easy to fall victim to the mindset that since we have always been a “disconnected” species, we WILL always be a disconnected species.
Amazing how the emerging Bitcoin Ecosystem dovetails so perfectly (and necessarily!) with what RRudd sees emerging from this epic shift in the human genome 👁️🧬
The way many humans, not all but many, can be to each other already is utterly dire…so I don’t have much hope for a utopian world that evolves as a result of this unless human nature fundamentally evolves first.
Some are perhaps capable of coding but basically incapable of evolving the better emotional intellect that I believe is needed for optimal human survival.
I'm not worried, as this argument could have been made about the machines that replaced humans on farms, etc.
Humans are not replaceable by robots at a fundamental level, in ways not commonly thought about, even from a biomechanical/economic point of view, not counting the spiritual aspect of man.
Nice thought experiment but overthinking in my opinion
If one were very worried, he could of course just try to own robots, and everyone should be working away from being just a laborer for many reasons, with robots possibly replacing your work being one of many issues for laborers.
The whole UBI thing is just disguised socialism
I think about this a LOT. What are governments going to do when even 25% of the people aren’t “necessary”? Does the government just support them? Where does that money come from? Endless questions..
From what I see, it’s either WALL-E (the movie) and we are all slugs, or some dystopian hellhole of a future. Neither is very appealing.
I didn't think about it that way, although it's right in front of me .. the need, no need part... Dang, I have some more thinking to do
"It takes a couple more decades of Moore's law to put a datacenter supercomputer into a low-energy local robot. "
I think so too, but ultimately it doesn't matter whether it takes 20 or 30 years if it ends up happening that way, which is very likely.
The cost to use a given level of AI falls about 10× every 12 months, and lower prices lead to much more use. The world will not change all at once; it never does.
The price of luxury goods and a few inherently limited resources like land may rise even more dramatically, as Altman predicts.
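A back-of-the-envelope sketch of how the claimed trend compounds (the 10×-per-12-months figure is taken from the comment above; the function name and the $1,000 starting cost are assumptions purely for illustration):

```python
# Sketch: if the cost of a fixed level of AI capability falls ~10x every
# 12 months, the same capability becomes ~1000x cheaper in three years.
def ai_cost(initial_cost: float, years: int, annual_decline: float = 10.0) -> float:
    """Cost of the same capability after `years` years, assuming a steady
    `annual_decline`x price drop per year (an assumed trend, not a law)."""
    return initial_cost / (annual_decline ** years)

for year in range(4):
    print(f"Year {year}: ${ai_cost(1000.0, year):,.2f}")
# Year 0: $1,000.00 ... Year 3: $1.00
```

Whether the decline actually stays at 10× per year is of course the whole question; the sketch only shows how quickly such a rate compounds if it holds.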
"all humans will become artists!"
🤡
Well, how would humans treat other humans if they believe they aren’t needed? Look to Israel and the persecution of the Palestinian people. That’s your answer. Those not needed are dehumanized: where the violent believe they should, they wipe them out and take their resources; where there is a need for them, they corral them and keep them weak, culling the population if it grows above a certain resource threshold. In an age where empathy is lacking and the masses are becoming more and more desensitized, this behavior will become more and more apparent. In this society we judge a person by what he does; think about it: when you meet someone, it’s one of the first questions you’ll likely consider asking.
So in a world where robots are doing more of the banal tasks and more people are out of work, I’m sure some violent rich billionaire cartel who believe the planet is overpopulated will see those not being productive as surplus, and will alter policy in a way in which those deemed less fit will have trouble surviving in the West. I’ve typed this in two minutes, but you can get the main points.
We can make powerful AI/robots without making humans useless.
That’s how we keep moving forward: keep creating and keep satisfying people’s wants.
It depends on what’s being given out. Not everyone can be independent. We can’t make humans useless.
That’s where the usefulness of humans comes in: we employ people where we are capable, intellectually and otherwise, and that’s how we redistribute.
Robots can’t totally take over or make people useless. We can’t make something that makes us useless.
Is this the part where the majority truly become the “useless eaters”? With the resources and power being ultra-concentrated at the top.. the grave concern is how will “they” handle this shift? Are the technocrats/oligarchs going to suddenly decide that they have had their fill and are now willing to share? Are they willing to drop profits in favor of humanity and the wellbeing of the whole? Or, will the class divide become even more so? Insert dystopian story of choice…. If you look to our short history as a species I don’t get the warm and fuzzies.
Again, this was discussed in Moravec's 1988 book "Mind Children."
That was my first thought too. However, if you don’t have an income because you can’t provide economic value to society, how do you purchase a robot? Further, the robot technology, at least for a while, may be proprietary, closed source and expensive for some time. And even if they do become cheap, and your robot can take care of your house, kids, etc. how can you afford food, gas, electricity (physical goods) if you have no income and no land? Robots could automate lots of your tasks without being able to provide you with economic value. Just brainstorming out loud
How about a positive take.
The tribe won't disband due to a mere lack of economic interest. Humans need deep, meaningful relationships; it's one of our basic psychological needs. We need each other for social and spiritual bonds, and without them we suffer mentally.
But we don't want just not to suffer, we want to thrive. In order to thrive, we need to help others and grow spiritually. The more time we will have at our disposal, the more we will engage in helping each other selflessly. It might take centuries, but slowly the selfish will be replaced by or converted to the selfless.
Come to the UK … this is now reality!!
So, what do we do?
Will we as a creature lose all will to live?
People, as capable as we are, are equally complex, and often a contradiction.
In times of intense need our attention is focused on basic survival…
Yet as we gain time
(freedom and resources)
our attention tends to turn towards our quality of existence.
Which is highly subjective.
Self worth has a tendency to impact our actions and achievements.
So, what happens when our drive and function are primarily focused on creating and building what each individual wants,
instead of ‘working for the man’?
Can we look to tribal groups to get insight into how happy someone is when all their needs are met and they have enough resources to choose
how to spend their time?
The greatest of human fears is not to be noticed at all, or to not matter. Even slavery is tolerable if living with a purpose (to become free or to save the life of a loved one, etc)
While it's possible for a small group of people to hold all the money, it's not possible for them to hold all the capital. Capital is just something that is useful (tractors and factories, yes, but also ideas and creativity), and usefulness is subjective and dependent on the receiver.
When the Industrial Revolution destroyed many money-paying jobs, humans simply created new jobs or new ways of producing things that people wanted. Just as Say's Law states: the production of goods creates demand for those goods, and the economy grows.
In my experience more harm comes from people needing each other too much rather than not needing them enough. That's the core of the sovereign individual, to be independent, and from this place of strength be able to add to the world.
hey Lyn
Are you transgender too? No wonder Jack Dorsey is so fond of you.
People simply graduate to dedicating themselves to meaning and wisdom exploration and explication, and to building that which is most meaningful.
Did you have something meaningful to contribute, or did you just feel the need to remind us you're not worth listening to?
I never said I had anything to contribute, because I don’t talk too much bullshit.
I never remind anyone that I am worthy of being listened to.
People with brains make decisions by themselves.
You have a problem with it?
I just find it funny that you couldn't help but insert yourself for no purpose other than to needlessly insult people, but now that it's happened to you, you're all offended. You'll probably try to deny that to defend your fragile ego, but it's blatantly obvious. Snowflake ass wants to dish it out but gets ruffled at the first sign of heat.
Just find it funny? Well, at least I did something that made you think it’s funny, rather than fooling people around with their savings.
No purpose? Are you sure about that? Do you have full access to all my connections, at a minor or major scale?
My fragile ego? Well, I have no damn ego, otherwise I would not still be alive right now.
How’s your ego, btw?
Snowflake ass? Not quite sure what this sentence means; are you talking about the company named Snowflake, or is it some metaphor? Restructure your content before dumping shit on my comment, you arrogant prick.
Insult people? Well, would you like to be insulted by me for your complacency in finding it funny?
Great concept @Lyn Alden. Have you read the short story “Manna”? It focuses on exactly that point: what happens to the bottom 20%, and then the next. There are two provocative yet not unrealistic scenarios outlined.
If history is any guide, the first phase will be similar to the industrial revolution: technology will complement certain types of workers, making them more productive, while replacing others entirely. But if AI advances to the point where even high-skill labor is no longer necessary, we reach a stage unprecedented in human history: a world where ownership of automation is the sole determinant of economic power.
Imagine a scenario where AI-driven corporations, controlled by a small group of capital holders, optimize every aspect of production, logistics, and service industries. Governments, pressured by economic efficiency, privatize social services, making access to resources contingent on corporate governance rather than state policies. In this world, the traditional idea of employment vanishes for most. Instead of wages, former workers survive on universal basic income or corporate stipends, tied not to productivity but to compliance with the systems owned by the elite.
History suggests that once a class of people is economically unnecessary, they become politically vulnerable. The landed aristocracies of the past had use for peasants as laborers, but what happens when even the illusion of economic necessity disappears? In previous centuries, displaced workers could riot, revolt, or demand redistribution, but in a world governed by automated systems and AI-controlled security, resistance itself could become obsolete.
The darkest outcome isn’t violent suppression but a slow, passive neglect: the emergence of a “post-labor caste” that, lacking any economic leverage, is maintained at a subsistence level only as long as the ruling class finds it convenient. Perhaps they are given digital entertainment, AI companions, and just enough resources to avoid rebellion, but they remain permanently outside the sphere of influence, their fate determined entirely by those who own and control automation. Think of animals in a world dominated by humans…
It’s evolution and survival of the fittest again.
Think of what happens to animals in a world ruled by humans. Some are domesticated, bred to serve human needs. Others are left to the wild, dwindling in number as their habitats shrink. And some, deemed inconvenient or dangerous, are simply eradicated. That’s the fate I am thinking of from an evolutionary survival-of-the-fittest perspective. In a world where AI replaces human labor and economic interdependence dissolves, the question is no longer about fairness or purpose, but about survival itself.