Will AI Take Over Jobs? What History and Honesty Tell Us
In 1779, a stocking-frame apprentice named Ned Ludd supposedly smashed two frames in a fit of rage.
Whether this actually happened is unclear; Ludd may be entirely fictional. But by 1811, textile workers across England were destroying industrial machinery and signing their actions "Ned Ludd" or "King Ludd."
The Luddites weren't stupid. They weren't anti-technology. They were skilled craftsmen watching their livelihoods mechanized, their expertise made obsolete, their communities shattered by industrial efficiency.
And history remembers them as fools who tried to stop progress.
Two hundred years later, we're having the same conversation. Except this time it's not textile frames, it's knowledge work. And it's not just physical automation, it's cognitive automation.
Will AI take over jobs?
The honest answer is: yes, obviously. The question isn't whether, but which jobs, how fast, and what happens to the people whose work disappears. (Bear in mind that this comes from a lawyer who once believed his industry would be among the last to be automated away, only to watch it become one of the very first.)
And whether calling this "progress" is honest or just comfortable language for people who aren't losing their livelihoods.
🜏
What's Already Happening (That We're Pretending Isn't)
Let me start with what's not speculation but observable reality:
Customer service roles are being automated rapidly. Chatbots handle tier-one support. AI systems resolve common issues without human intervention. The humans who remain handle only the complex edge cases, and there are fewer positions because AI handled the volume.
Translation work has fundamentally changed. Professional translators now spend their time editing machine translations rather than translating from scratch. The skill required shifted. The number of positions contracted.
Certain creative work is being displaced. Stock photography, generic graphic design, basic copywriting, these are increasingly AI-generated. Humans still handle the premium work, but the entry-level positions that used to train those premium workers are vanishing.
Routine analysis is being automated. For financial analysts, paralegals, and medical coders, work that involves pattern matching in structured data is being systematically automated. The remaining humans focus on judgment calls and client relationships.
Software development is changing. AI coding assistants generate boilerplate code, identify bugs, and suggest solutions. Junior developers who once spent years learning through repetitive coding now face a landscape where that training ground is automated.
The legal profession is feeling the same pressure. AI tools now review contracts, draft motions, conduct legal research, and handle discovery faster and cheaper than junior lawyers ever could. The work still exists, but the number of people needed to perform it is shrinking, and entire categories of entry-level legal tasks are disappearing.
None of this is hypothetical. It's happening now. Not everywhere, not all at once, but the direction is clear.
And we're still having theoretical debates about whether AI will impact employment while people are already losing positions to automation.
🜏
The Pattern That Keeps Repeating
Here's what happens every time technology automates work:
Stage 1: Denial
"This technology won't replace humans. It's just a tool that will make us more productive."
Stage 2: Partial displacement
Some jobs disappear. Proponents say this is actually good: those were tedious jobs nobody wanted anyway. New positions will emerge. Just retrain.
Stage 3: Uneven distribution of pain
People with capital, education, and mobility adapt. People without these advantages struggle. Communities built around displaced industries collapse.
Stage 4: Eventual adaptation
Decades later, the economy has restructured. New jobs exist that didn't before. Living standards might be higher overall. But the people who bore the cost of transition are still bearing it.
Stage 5: Repeat
We've seen this pattern with:
Agricultural mechanization (displaced millions of farm workers)
Factory automation (eliminated manufacturing jobs)
Computerization (automated clerical work)
Offshoring (relocated jobs to cheaper labor markets)
Each time, the economy eventually adapted. Each time, the aggregate numbers eventually looked positive. And each time, the individuals and communities disrupted paid enormous costs and never fully recovered.
And each time, we told ourselves: "This time it's different. This time we'll manage the transition better."
We never do.
Because managing the transition well requires prioritizing people over efficiency. And efficiency wins, every time.
🜏
Which Jobs Are Most Vulnerable (The Uncomfortable Truth)
Research attempting to predict AI impact focuses on tasks, not just job titles. Any work involving:
Routine cognitive tasks – Following established procedures, processing structured information, applying rules to reach conclusions. This includes much of administrative work, basic accounting, medical coding, paralegal document review, routine financial analysis.
Pattern recognition in narrow domains – Radiologists reading scans, quality control inspectors, fraud detection, credit evaluation, insurance underwriting. Anything where the task is identifying patterns in constrained contexts.
Content generation at scale – Journalism covering routine events (sports scores, earnings reports, weather), basic copywriting, stock imagery, generic graphic design, simple video editing.
Customer interaction with scriptable responses – Support calls following decision trees, sales qualifying leads, appointment scheduling, basic tutoring in structured subjects.
Code generation for standard patterns – Junior developer work writing boilerplate code, debugging common errors, implementing standard features from specifications.
What these jobs share: they can be broken down into patterns that AI systems can learn.
But here's what the research often misses:
It's not that these entire jobs disappear immediately. It's that the number of humans needed to do them drops dramatically. If AI handles 80% of a workflow, you don't need 80% fewer workers, you might need 50% fewer, or 30% fewer, depending on bottlenecks.
The result isn't binary replacement. It's employment contraction.
Five radiologists become three. Ten junior developers become four. Twenty customer service reps become eight.
Not everyone loses their job. But enough people do that the pain is real and widespread.
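The contraction arithmetic above can be sketched as a back-of-envelope model. The parameters below (`automated_share`, `oversight_share`) are illustrative assumptions, not measured values; the point is only that automating 80% of a workflow does not cut headcount by 80%, because reviewing and escalating AI output becomes the new bottleneck.

```python
import math

def remaining_headcount(workers, automated_share, oversight_share):
    """Bottleneck-adjusted staffing after partial automation.

    automated_share: fraction of the old workflow AI now handles.
    oversight_share: fraction of the old workload that reappears as
    human review and escalation of AI output (the bottleneck).
    Both parameters are hypothetical, for illustration only.
    """
    human_share = (1 - automated_share) + oversight_share
    return math.ceil(workers * human_share)

# Automating 80% of tasks with 20% review overhead cuts
# headcount by roughly 60%, not 80%.
print(remaining_headcount(10, 0.8, 0.2))  # -> 4
print(remaining_headcount(20, 0.8, 0.2))  # -> 8
```

Under these made-up parameters, ten junior developers become four and twenty customer service reps become eight: contraction, not binary replacement.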
🜏
What Jobs Will AI Take Over? The Honest Assessment
Let me be direct about specific categories:
Highly vulnerable (10-15 years):
Basic content writing (product descriptions, routine articles, simple scripts)
Data entry and processing
Telemarketing and routine sales calls
Travel agents for standard bookings
Simple bookkeeping and tax preparation
Basic graphic design and photo editing
Routine coding and software testing
Customer service for common issues
Paralegal document review
Basic tutoring in standardized subjects
Moderately vulnerable (15-25 years):
Radiology and medical imaging analysis
Financial advising for standard portfolios
Insurance underwriting
Market research and analysis
Translation and interpretation
Truck and taxi driving (pending regulatory and infrastructure changes)
Warehouse and retail inventory management
Basic legal research and contract drafting
Routine journalism and reporting
Probably safe (25+ years):
Anything requiring physical dexterity in unstructured environments
Work requiring deep empathy and emotional intelligence
Strategic decision-making with high uncertainty
Creative work requiring genuine novelty
Jobs requiring building trust and relationships over time
Physical trades (plumbing, electrical, carpentry) in varied environments
Healthcare requiring human touch and judgment
Education involving mentorship and motivation
Scientific research requiring intuition and paradigm shifts
But every prediction here is uncertain.
I could be dramatically overestimating timelines. Capabilities are improving faster than almost anyone expected five years ago.
Or I could be underestimating the stickiness of human preference for human interaction, regulatory barriers, and technological limitations.
What I'm confident about: job displacement is coming. The specifics are uncertain.
🜏
The Economic Question Nobody Wants to Answer
Here's the uncomfortable truth economists avoid:
Previous technological revolutions created new jobs because they expanded what was economically possible. AI might not.
When agriculture mechanized, displaced farmers moved to factories. When factories automated, workers moved to service sectors. When services computerized, knowledge work expanded.
Each transition worked because humans still had comparative advantage in something valuable.
But AI isn't just automating manual labor. It's automating cognitive labor. The thing humans have historically moved toward when other work was automated.
What do humans move to when cognitive work is automated?
The optimistic answer: creative work, emotional labor, strategic thinking, relationship building, the uniquely human capabilities.
But is there enough demand for these to employ everyone displaced from cognitive work? Can everyone transition to these roles? Will they pay enough to maintain current living standards?
I don't know. And neither does anyone else claiming certainty.
What we do know from economics: when the supply of labor in any sector increases dramatically, wages in that sector tend to drop.
If millions of people displaced from automated jobs move to "human-only" sectors, those wages will likely decrease from increased competition.
The optimists say: but AI will create economic growth that generates new opportunities.
Maybe. Or maybe AI will create economic growth that primarily accrues to capital owners while labor's share of income declines further.
This isn't pessimism. It's asking whether the historical pattern of "technology creates more jobs than it destroys" will hold when technology can do cognitive work.
And that's genuinely uncertain.
🜏
What Actually Happens to People (Beyond the Statistics)
Let me talk about what job loss actually means for humans, beyond employment numbers:
Identity disruption. For many people, their work is central to their identity. "I'm a lawyer." "I'm a programmer." "I'm a journalist." When that work disappears, the identity question becomes: "What am I now?"
Community fragmentation. Jobs create social connections, daily structure, shared purpose with colleagues. Remote AI work or gig economy jobs often lack these. The social infrastructure around work matters beyond the paycheck.
Skill obsolescence. Someone who spent 20 years mastering medical coding, or building expertise in a specific type of legal analysis, faces not just job loss but the reality that their expertise has become worthless. Retraining sounds simple until you're 50 years old with a mortgage.
Geographic concentration. If you're a manufacturing worker in a town whose factory closes, you can't just move; you're tied by family, community, and housing you can't sell. When entire industries in specific regions get automated, those regions collapse.
Psychological impact. Study after study shows unemployment, especially long-term unemployment, correlates with depression, anxiety, substance abuse, and even suicide. This isn't about the paycheck, it's about purpose, dignity, and social participation.
The statistics say "displaced workers eventually find new employment." But what the statistics don't capture:
At what wage compared to previous employment?
With what psychological toll?
After how many years of struggle?
With what long-term impact on families and communities?
When we say "AI will take over jobs," we're really saying: "Millions of people will have their livelihoods eliminated, their expertise devalued, their communities disrupted, and their identities challenged."
And then we expect them to "retrain" as if that's a simple matter of attending a few courses.
🜏
The Retraining Fantasy
Every discussion of AI and employment eventually lands on: "Don't worry, people will retrain for new jobs."
Let me be blunt: this is mostly fantasy.
Retraining works when:
The person has financial stability during the training period
The new field is hiring and has clear career paths
The skills required are within the person's capability range
The training period is short enough to be economically viable
Geographic mobility is possible
The new jobs pay comparably to the old ones
How many displaced workers have all these conditions? Very few.
A 45-year-old paralegal with two kids and a mortgage can't afford to go back to school for four years. A truck driver whose job gets automated can't easily transition to software development, it's a different cognitive profile entirely.
And retrain for what, exactly?
If AI is automating cognitive work broadly, what's the stable field to retrain into? The healthcare jobs that require human touch? How many of those are there, and do we really want millions of people forced into caregiving roles they're not suited for?
The skilled trades? That works for some people, but not everyone has the physical capability or interest. And trade schools take years and cost money.
The retraining narrative is comforting for people whose jobs aren't threatened. It's much less comforting for people actually facing displacement.
It's our way of avoiding collective responsibility for managing transitions. "Just retrain" puts the burden entirely on individuals rather than acknowledging that massive technological displacement requires systemic responses.
🜏
What We're Not Willing to Do About It
If we were serious about managing AI's impact on employment, here's what we'd need to do:
Massive investment in transition support. Not just training vouchers, but full income support during extended retraining periods. Something like 2-3 years of salary replacement while people genuinely prepare for new work.
Geographic relocation assistance. If jobs exist elsewhere, people need financial support to move, not just to the new location, but help selling houses in collapsed markets, support for family transitions, etc.
Universal basic services. Healthcare, childcare, education decoupled from employment so people can retrain without losing everything.
Job guarantees or public employment. If the private sector doesn't generate sufficient employment, public investment in caregiving, infrastructure, environmental restoration, community services.
Significant taxation of AI productivity gains to fund these supports. The economic benefits of AI automation can't just flow to capital owners; they need to fund support for displaced workers.
Aggressive antitrust and market regulation. If AI concentrates economic power in a few large tech companies, that power needs to be checked to prevent runaway inequality.
Are we doing any of this? Barely.
We're having committee meetings and pilot programs while the displacement accelerates.
Why? Because these interventions are expensive, require political will, and challenge existing power structures.
Much easier to say "the market will figure it out" and "people just need to retrain."
🜏
Will AI Take Over Jobs? The Answer Nobody Wants
Yes. Obviously yes.
Not every job. Not instantly. Not uniformly across all sectors and geographies.
But enough jobs, fast enough, that millions of people will experience real displacement, reduced wages, or forced career transitions they're not prepared for.
The question isn't whether this happens. The question is whether we manage the transition humanely or allow it to be brutal.
And current indications suggest: brutal.
Because managing it humanely requires:
Accepting reduced efficiency for social stability
Massive public investment in support systems
Regulation that constrains technological deployment
Willingness to redistribute AI's economic gains
Collective responsibility for individual displacement
None of these are politically popular in most countries right now.
What's popular is: maximize AI deployment, let markets allocate the gains, tell displaced workers to retrain, blame individuals for not adapting.
This is the path we're on. And it will create enormous suffering for people caught in the transition.
The Luddites lost. Industrial automation proceeded. The weavers who could adapt did. The ones who couldn't faced poverty, displacement, and communities destroyed.
History remembers them as fools because the people who won got to write the history.
But the Luddites weren't wrong that technological change can destroy livelihoods. They were just powerless to stop it.
🜏
What I Actually Think Will Happen
Here's my honest assessment, stripped of optimism and pessimism:
Near term (5-10 years): Significant displacement in customer service, content generation, routine analysis. Wages stagnate or decline in knowledge work as AI augmentation reduces labor requirements. Growing inequality as AI gains accrue primarily to capital owners.
Medium term (10-20 years): Broader automation of cognitive work. Transportation, medical diagnostics, legal analysis increasingly AI-performed. Labor force participation rates drop. Growing political tension around inequality and unemployment.
Long term (20+ years): Either: We develop new economic models (UBI, job guarantees, radically reduced work weeks) that enable human flourishing without full employment.
Or: We face massive social instability, political extremism, and conflict over how to distribute AI's economic benefits.
Which path we take isn't determined by the technology. It's determined by political choices we're making now about how to distribute AI's gains and whether we take collective responsibility for managing displacement.
Current trajectory suggests we're headed toward the second path. But that's not inevitable, just likely given current policy directions.
🜏
Where This Leaves Me
I can't give a comforting answer to "will AI take over jobs?"
Yes, it will. Not completely, not instantly, but enough to cause real hardship for millions of people.
No, we're not prepared for it. Our social support systems, retraining programs, and political will are inadequate for the scale of disruption coming.
Yes, it could be managed better. But that requires collective choices we're not currently making.
The honest conversation isn't whether jobs will be lost. It's whether we'll take responsibility for the people whose livelihoods are automated.
And right now, the answer seems to be: no, we'll tell them to retrain and blame them if they don't succeed.
That's not a technology problem. It's a political and moral choice.
The Luddites understood that technology doesn't just create abstract economic gains. It reshapes human lives, communities, and possibilities in ways that create winners and losers.
We've spent two centuries mocking them for opposing progress.
Maybe it's time to acknowledge they were asking the right question: Progress for whom? At whose cost? And is the price worth it?
I don't have answers that make anyone comfortable.
But I think those are the questions we should be asking while we still have time to shape the outcomes.
— N.H.
Further Reading:
Daron Acemoglu & Pascual Restrepo - Research on automation and labor markets
Carl Benedikt Frey & Michael Osborne - "The Future of Employment" (2013)
David Autor - Work on labor market polarization
Andrew Yang - The War on Normal People (automation and UBI)
Historical analyses of the Luddite movement and technological unemployment