When “Efficiency” Means Admitting You Were Wrong
What we are getting wrong about AI
At Decode Econ, we are feeling a bit disappointed. Because of the partial government shutdown, the Department of Labor won't issue its January report on U.S. hiring this Friday as planned; it will come out on Wednesday instead. The labor market is at the top of our minds these days. So here is a post from Antowan Batts on AI and layoffs, and what CEOs got wrong but won't admit.
-Dr. A
Amazon just laid off 16,000 white-collar workers worldwide. The company cited automation investments and “rightsizing.” UPS announced cuts of up to 30,000 positions this year, blaming decreased revenue after Amazon reduced its contract volumes. And across the tech sector, companies are pointing to AI’s “potential” as justification for letting workers go - even when that potential hasn’t materialized into actual performance gains.
Here’s the plot twist: 55% of those companies now admit they regret those layoffs. They’re quietly rehiring. Did the automation revolution justify cutting workers? Turns out automation was more expensive and less effective than simply keeping the people who already knew how to do the work.
If this sounds familiar, it should. We’ve seen this movie before - just with different technology buzzwords. The metaverse. Blockchain. Digital transformation. Each time, executives discover that the actual work still needs people who know how to do it. Each time, workers pay the price for executive miscalculation.
But there’s a framework that explains exactly what’s happening here, and it’s not about AI at all. It’s about something economists warned us about decades ago: malinvestment. And understanding it might be the most important thing workers can do right now.
The Real Story Behind the Layoffs
Let’s be clear about what actually happened. The pandemic injected massive spending into the economy. Companies scrambled to capture that upside, hiring aggressively to meet what appeared to be permanent shifts in demand. Amazon did it. UPS did it. Pretty much every major tech and logistics company did it.
Then reality hit. That demand wasn’t permanent. It was artificially inflated by stimulus spending and lockdown behaviors. As the economy normalized, companies found themselves overstaffed for actual sustainable operations. They needed to correct.
But here’s where it gets interesting: instead of admitting they miscalculated during the boom, executives started using AI as cover for cost-cutting. Investors love hearing about “efficiency” and “future technology investments.” It sounds so much better than “we hired too many people chasing temporary pandemic demand, and now we need to fix our balance sheet.”
The problem? CEOs are now realizing they acted too hastily. The costly new operations they implemented aren’t delivering the efficiency and profit gains they promised. The automation that was supposed to replace workers can’t actually do what workers do.
So they’re rehiring. Quietly. Without the press releases that accompanied the layoffs.
Understanding Malinvestment
This is where Austrian economics becomes useful. The concept of malinvestment, developed by economists such as Ludwig von Mises and Friedrich Hayek, helps explain what we’re seeing unfold in the tech sector right now.
Malinvestment occurs when capital gets allocated based on distorted market signals rather than genuine economic fundamentals. During the pandemic, we had a perfect storm: near-zero interest rates made borrowing cheap, stimulus spending flooded the economy, and companies scrambled to capture what appeared to be permanent demand shifts. The result? Massive hiring sprees, aggressive expansion, and investments that looked smart in 2021 but were unsustainable by 2024.
The Austrian economists understood something important: artificial booms create artificial investments. When interest rates were essentially zero, companies didn’t have to be as disciplined about returns. They could afford to experiment, overhire, and chase growth metrics. But as rates normalized and capital became more expensive, those same investments started looking very different.
This is why we’re seeing the pattern of layoffs followed by rehiring. Companies overshot during the boom, corrected too aggressively while using AI as cover, and now they’re discovering they actually need those workers to maintain operations. It’s not that AI failed - in many cases, it was never seriously implemented. It was just a convenient narrative for cost-cutting that investors wanted to hear.
What This Actually Tells Us
Here’s what matters: when 55% of companies admit they regret their AI layoffs and start rehiring, they’re admitting something fundamental. The work still needs to be done. The value still needs to be created. And workers are the ones who create it.
AI didn’t replace you. Cheap capital and poor executive decision-making created a cycle of malinvestment, and workers paid the price. Not because they weren’t valuable. Not because they weren’t needed. But because executives made bad bets during a boom and used technology narratives to cover their tracks during the correction.
Every time you read about a company “investing in AI” right before layoffs, remember this: they’re not replacing human capability. They’re trying to explain away their own miscalculations. And when those same companies quietly start hiring again months later? That’s them admitting what they knew all along - the actual work requires actual people.
Your labor has value. Your skills have value. Your expertise has value.
The Bottom Line
The companies currently cycling through layoffs and rehires aren’t discovering that AI can’t do your job. They’re discovering that they never actually had a plan beyond using buzzwords to satisfy investors. They’re learning what workers already know: the value in organizations comes from the people who understand operations, serve customers, solve problems, and keep things running.
So if you’ve been laid off in this cycle, understand what really happened. You weren’t replaced by technology. You were caught in a correction cycle caused by executive-level malinvestment. Your skills didn’t become obsolete. Someone just made a bad calculation about interest rates, demand sustainability, and what “efficiency” actually means.
And if you’re still employed but watching colleagues get cut while executives talk about AI transformation, remember this: when the dust settles and companies realize they still need the work done, they’ll be looking for the people who know how to do it. That’s you. That’s always been you.
The technology will keep changing. The hype cycles will keep turning. But the fundamental truth remains: value gets created by workers who show up and do the work.
Everything else is just executives trying to explain their decisions to shareholders.
From Marketplace
Our friends at Marketplace had a similar story this week. You might be interested in hearing their views on AI and layoffs. Read more

So familiar. Thanks, Antowan, for a great topic. Something about your writing astonishes me: how you mix scientific facts with the art of writing. Every time I read your work I feel the same. Humanities with science, a great mix that makes me enjoy the topic. Many thanks.
I asked Perplexity the following, and received the response that follows:
Q: There is a pattern: CEOs lay off many workers because they think AI will make their business "more efficient," then hire them back when they realize that people who know how to do things are necessary; generals buy many expensive complex weapons, then realize they have to improve the training of soldiers to win wars; leaders think they can run a country if businessmen support them, but they discover that workers are important too. What does this say about the psychology of men in positions of power?
Psychologically, these patterns point to a recurring bias in powerful people: they tend to treat others as interchangeable tools rather than as knowledgeable partners, then are forced by reality to rediscover their dependence on those very people.
Key traits of powerful decision‑makers
Instrumental view of people: Power makes it easier to see others in terms of their usefulness to a goal (cost, efficiency, votes) rather than as full humans with skills, limits, and needs. That instrumental mindset supports layoffs for “AI efficiency,” buying dazzling weapons, or courting business elites while neglecting workers.
Reduced perspective‑taking: Experiments show that priming people with power decreases their tendency and accuracy in understanding others’ thoughts and feelings. That makes it easier for a CEO or general to believe a clean spreadsheet or weapons brochure instead of listening to front‑line experience.
Overconfidence and distance from reality
Overconfidence in abstractions: Leaders with power often trust models, dashboards, and doctrines over messy human experience, until failures force them to adjust (rehiring workers, retraining soldiers, rebuilding social support). They lean on simplified stories: “AI replaces labor,” “technology wins wars,” “capital creates prosperity,” underestimating the complex systems those people actually run.
Psychological distance from consequences: Because they rarely bear the immediate personal cost of bad bets, powerful people can persist longer in mistaken strategies like over‑automation or over‑weaponization before reality becomes undeniable.
Dependence they don’t like to admit
Reluctant recognition of interdependence: The cycle you describe—discard people, then discover they’re indispensable—shows a clash between the fantasy of control and the fact of dependence on skilled, motivated humans. Power tempts leaders to imagine they stand above the system, but crises remind them they are embedded in it and need the same workers and soldiers they tried to replace or ignore.
Status bias toward elites: Psychologically, many leaders overweight the perspectives of those who resemble them in status (other executives, wealthy backers, technologists) and underweight those of lower‑status groups (workers, enlisted troops, marginalized citizens), even when the latter hold critical practical knowledge.
What this pattern suggests
At a deep level, these patterns suggest that many men in positions of power struggle with empathy, humility, and acknowledgment of mutual reliance, and default instead to control, abstraction, and prestige alliances.
The repeated need to “walk back” grand efficiency or technology schemes shows that human knowledge, judgment, and morale are not just nice extras but the core infrastructure that power keeps trying—and failing—to replace.