
Stop Firing People Because AI Exists

One person with AI can do the work of ten. But did anyone stop to ask what happens to that one person? Or what happens if you keep the ten?

Thinking Out Loud

A friend of mine runs content for a mid-size SaaS company. Last year, her team was eight people. Writers, editors, a localisation specialist, someone who handled the knowledge base. Good team, solid output. Then the CEO attended a conference, came back fired up about AI, and within three months the team was down to three. The reasoning? "With AI tools, three people can produce what eight used to."

And technically, that's true. The three remaining people are producing roughly the same volume. Blog posts, help docs, product updates, internal comms. The numbers look fine on a dashboard.

But my friend hasn't slept properly in months. She's context-switching between writing, editing, prompt engineering, QA-ing AI output, managing translations, and doing all the strategic work that used to be shared across the team. Her two remaining colleagues are in the same boat. One of them is already looking for another job.

The company saved five salaries. It's also slowly losing the three people who actually know how things work.

The math that looks right but isn't

Here's the pitch that's been making the rounds in boardrooms since ChatGPT went mainstream: one person with AI can now do the work of ten. And if that's the case, why keep ten?

It's a compelling argument. Simple. Clean. Fits on a slide.

It's also dangerously incomplete.

Yes, AI can compress tasks. According to Microsoft's 2024 Work Trend Index, 90% of AI users at work say the tools help them save time. The heaviest Microsoft Teams users summarised eight hours of meetings using Copilot in a single month. That's a full workday reclaimed just from meeting summaries. And 85% say AI helps them focus on their most important work.

Those are real numbers. The productivity gains are not imaginary.

But here's what the "fire nine people" crowd never talks about: the person who remains doesn't just absorb the output. They absorb the cognitive load, the context, the decision-making, the coordination, the quality assurance, and every bit of institutional knowledge that walked out the door with those nine former colleagues.

Mental load is not a spreadsheet

There's a concept in psychology called cognitive load theory. It describes the total amount of mental effort being used in working memory at any given time. And every time you ask one person to do the thinking that five people used to share, you're not saving effort. You're concentrating it.

I think about this a lot when people tell me AI makes workers "10x more productive." Productive at what? Producing more words? Shipping more tickets? Generating more slide decks? Sure. But the actual hard part of knowledge work has never been the producing. It's the thinking. Deciding what to produce. Understanding context. Making judgment calls. Knowing when something is wrong even when it looks right on the surface.

AI doesn't do that for you. AI gives you a first draft, and now you need to be smart enough to evaluate it, experienced enough to catch the subtle errors, and present enough to notice when the output is confidently wrong. (If you've ever watched someone ship an AI-generated internal doc without reading it, you know exactly what I mean.)

Gallup's 2025 State of the Global Workplace report found that global employee engagement fell to 21% in 2024, down from 23% the year before. That drop cost the world economy an estimated $438 billion in lost productivity. Manager engagement dropped even harder, from 30% to 27%. Female managers saw a seven-point decline. Managers under 35 dropped five points.

These are the people who are supposed to be leading AI adoption. And they're burning out.

The amplification argument

Let me offer a different way to think about this.

If one person with AI can do the work of ten, then ten people with AI can do the work of a hundred.

Read that again. Because this is the part that almost nobody is talking about, and it's the part that should keep every CEO awake at night. Not because it's scary, but because it's an enormous opportunity that most companies are throwing away.

The companies laying off half their workforce because "AI can handle it" are not being efficient. They're being short-sighted. They're optimising for a quarterly headcount number while their competitors figure out what happens when you give powerful tools to a full team of motivated, experienced people.

I saw this play out at a startup event in Zurich last month. Two companies in the same space. Roughly the same size, same market. Company A had cut their content team from twelve to four. Company B had kept all twelve and given them AI tools plus training. Guess which one was producing multilingual content in six languages, running experiments with new formats, and shipping weekly product updates to their knowledge base? (It wasn't Company A.)

What actually happens when you cut

Let me walk through what happens in practice when you replace a team of ten with one or two "AI-enhanced" super-workers.

Week one feels great. The remaining people are energised. They have new tools. They're producing a lot. Leadership is thrilled. The dashboard numbers look incredible relative to headcount.

Month two, the cracks appear. The one person responsible for documentation discovers that AI-generated content needs serious review. Not light editing. Deep review. Because the AI doesn't know your product nuances, your customer context, or the three things you changed last week that invalidated half of what was written. The review work alone eats the time that was "saved" by generating content faster.

Month four, institutional knowledge gaps emerge. Remember those eight people you let go? They didn't just write content. They had relationships with product managers. They understood customer pain points from years of support ticket patterns. They knew which documentation topics generated the most questions. That knowledge is gone. The AI certainly doesn't have it.

Month six, you're hiring contractors. Because the remaining people are overwhelmed, quality has dropped, and someone finally noticed that the knowledge base hasn't been properly updated in weeks. But contractors don't have context either, so you're paying more per hour for worse results.

I'm not making this up. I've watched this pattern repeat at three different companies in the last year alone.

The data says keep your people (and train them)

The World Economic Forum's Future of Jobs Report 2025 asked over 1,000 global employers about their workforce plans. The numbers tell an interesting story. Yes, 40% of employers plan to reduce staff where AI automates tasks. But 85% plan to upskill their existing workforce. And 70% expect to hire people with new skills, not fewer people.

The report projects net job growth of 78 million by 2030. That's after accounting for the 92 million displaced roles. The world isn't moving toward fewer workers. It's moving toward differently skilled workers.

And here's the one that should give every "let's cut headcount" CEO pause: 64% of employers identified supporting employee health and well-being as a key strategy for talent availability. Not "reduce costs." Not "automate everything." Support well-being. Because companies that burn through their people don't get to hire the good ones later.

Meanwhile, a BCG and Harvard Business School study found that when teams used AI for creative tasks, around 90% improved their performance, with output quality rising 40% above control groups. But the study also found something that should make every leader uncomfortable: the diversity of ideas among AI-assisted groups dropped by 41%.

Think about what that means. You fire seven people from your ten-person team. The three who remain use AI to produce the same volume. But the range of ideas, perspectives, and approaches shrinks by nearly half. Your output looks productive but gradually becomes homogeneous. And nobody notices until a competitor ships something genuinely creative and you can't figure out why your team isn't doing the same.

The mental load nobody budgets for

Microsoft's survey found that 68% of people struggle with the pace and volume of work, and 46% feel burned out. And this was the state of affairs before you told them they're now doing the jobs of their three former teammates.

Here's something that doesn't show up in productivity dashboards: the cognitive cost of being the last line of defence. When you're the only person reviewing AI output, you don't get to have an off day. When you're the sole owner of the knowledge base, every support question lands on your desk. When there's nobody to bounce ideas off because the team was "right-sized," every decision is yours alone.

I've been building Rasepi partly because I've seen this problem up close. When documentation teams shrink, the knowledge doesn't shrink with them. The amount of content that needs to exist, stay current, and be accurate across languages doesn't decrease just because there are fewer people maintaining it. If anything, it grows. (This is exactly the problem we're building Rasepi to solve, by the way: features like forced expiry dates and block-level translations are meant to make smaller teams genuinely more effective, not just more overwhelmed.)

But even the best tools don't fix a fundamentally broken staffing decision. You can't automate away the need for human judgment, context, and care. You can only make those humans more effective.

What smart companies actually do

The most impressive thing about the Microsoft data is what the "AI power users" look like. These are people who use AI multiple times a day and save over 30 minutes a day. They're 68% more likely to experiment with different ways of using AI. They don't just generate more output. They redesign how work happens.

And here's the kicker: they exist within organisations that invest in them. AI power users are 61% more likely to hear from their CEO about the importance of AI at work. They're 53% more likely to receive encouragement from leadership to rethink their entire function. They get tailored training, not just a ChatGPT login.

In other words, the most productive AI workers aren't lone survivors of a layoff. They're members of supported, invested-in teams.

Let me contrast that with what I see at companies that took the "cut headcount" route. Their remaining employees aren't power users. They're overwhelmed generalists desperately trying to keep things running. They don't have time to experiment with AI because they're too busy using it for survival. There's no rethinking the function because the function is just... them, alone, doing everything.

The knowledge problem nobody mentions

There's one more thing. And I don't hear anyone talking about it, which is odd because it should be obvious.

When you fire experienced knowledge workers, the knowledge leaves with them. It does not stay in the building. It's not in the wiki. It's not in the AI. It's in the heads of the people who built the processes, understood the edge cases, and knew which customers cared about which details.

You know what happens when you have great AI tools and no institutional knowledge? You get beautifully formatted, confidently delivered, completely wrong information. At scale.

I talked to a head of documentation at a fintech company last month (she didn't want to be named, which tells you something). After their team was cut from six to two, they started relying heavily on AI to maintain their developer docs. Within four months, they noticed a spike in support tickets. The docs looked fine. They were well-written, up to date on the surface. But they contained subtle errors that only someone with deep product knowledge would have caught. An API parameter description that was technically correct but practically misleading. A migration guide that missed a step everyone on the old team just knew about. Little things that AI can't know because AI doesn't attend your standups, doesn't read your Slack threads, doesn't hear the frustrated "oh, that doc is wrong again" from the support engineer at the coffee machine.

The real question

So here's what I think the conversation should actually be about.

Not: "How many people can we cut now that we have AI?"

But: "What becomes possible when we give AI to everyone we already have?"

Your ten-person documentation team with AI tools doesn't become redundant. It becomes a team that can maintain content in twelve languages instead of two. That can keep every piece of content current with automated freshness checks. That can experiment with new formats, run A/B tests on help content, build interactive guides, and still have time to think strategically about what customers actually need.
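To make "automated freshness checks" concrete, here's a minimal sketch of the idea (a hypothetical illustration, not Rasepi's actual implementation): every document carries a review-by date, and anything past due gets surfaced instead of quietly going stale.

```python
from datetime import date

# Hypothetical example data: each doc records when it must next be reviewed.
docs = [
    {"title": "API quickstart", "review_by": date(2025, 1, 15)},
    {"title": "Migration guide", "review_by": date(2026, 6, 1)},
]

def stale_docs(docs, today):
    """Return titles of docs whose review-by date has passed."""
    return [d["title"] for d in docs if d["review_by"] < today]

# On 1 March 2025, only the quickstart is overdue for review.
print(stale_docs(docs, date(2025, 3, 1)))  # → ['API quickstart']
```

The point isn't the ten lines of code; it's that the review date is enforced by the system rather than remembered by whoever survived the last layoff.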

Your ten-person marketing team with AI doesn't become five people doing the same work with more stress. It becomes ten people who can personalise campaigns at a scale that was previously impossible, test more creative variations, respond faster to market changes, and still have the cognitive bandwidth to come up with genuinely original ideas that the AI never would have generated.

That's not a cost. That's an investment with a return that compounds.

Where this ends up

The companies that win the next five years won't be the ones who cut the most heads. They'll be the ones who figured out how to make their existing teams genuinely more capable.

The question isn't whether one person can do the work of ten. The question is what happens when all ten can do the work of a hundred.

If you're a leader reading this, I'd ask you one thing. Before you approve that next round of "AI-enabled restructuring," talk to the people who stayed after the last one. Ask them how they're doing. Ask them what they've stopped doing because there's no time. Ask them what's falling through the cracks.

And then imagine what they could accomplish if, instead of carrying the load alone, they had a full team and the best tools available.

That's not a fantasy. For the companies willing to invest in their people instead of replacing them, that's the next twelve months.
