I survived the dot-com era, where every business plan acquired a “.com” the way ships acquire barnacles: automatically, without anyone deciding it was a good idea. I have survived, and occasionally not survived, more re-orgs than I can count, which is either a badge of honor or a warning sign. I watched entire engineering organizations migrate to the cloud because AWS existed.

Each time, the pattern was the same. The press release drops, the board asks a question, a competitor makes an announcement, and the organization attempts to move. Not because it has a reasoned view of what this means for its specific context, but because standing still feels like losing. I have moved in the wrong direction and experienced the consequences. I have also, on occasion, succeeded and helped others do the same.

So the AI transformation is locked, loaded, and ready to start blasting. To quote Jean-Baptiste Emanuel Zorg: “I know this music. Time to change the beat.” That’s not to say AI isn’t real or genuinely consequential; I think it is, more than most of the previous waves. But many of these AI transformation attempts are going to fail for the same reasons all the previous transformations failed: org dysfunction, process theater, and mistaking activity for progress. The AI part is almost incidental.

What follows is a list of failure modes I’ve either watched up close or participated in directly. I want to be part of the solution, not part of the problem, but mileage may vary. At least it’s documented, so two years from now I can either point and laugh at how wrong I was, or buff my “I told you so” clout.


1. Nobody defined the problem first

“We need to be an AI company” is not a problem statement.

Transformations that kick off before anyone has articulated what specific problem AI is supposed to solve are destined to measure activity instead of outcomes. You can’t know if it worked if you never said what working would look like.

This has a historical parallel. In the late nineties, adding “.com” to your business plan was treated as a strategy. Companies with no coherent answer to “what internet problem are you solving?” raised money, hired fast, and failed anyway. Allbirds, a shoe company and a good one, recently announced it was becoming an AI company. I have nothing against Allbirds. I own a pair. But I genuinely cannot tell you what problem their AI strategy is solving, and I suspect neither can they. “We’re an AI company now” is the same move in a different cycle. The label has replaced the argument.

2. You hired a Chief AI Officer and declared victory

Role creation is the oldest substitute for strategy. The new title signals seriousness to the board, absorbs the anxiety, and gives everyone permission to stop thinking about it. Meanwhile, nothing about how decisions get made has changed.

In Douglas Adams’s Hitchhiker’s Guide to the Galaxy series, the Golgafrinchans solved their useless-people problem by convincing their telephone sanitizers, middle managers, and marketing consultants that the planet was doomed and they needed to evacuate immediately on the B Ark. The trick worked perfectly. The B Ark launched. Everyone felt like something serious had happened. The Golgafrinchans who stayed behind were eventually wiped out by a disease contracted from a dirty telephone, which Adams presents as a reasonable outcome for everyone involved.

Hiring a Chief AI Officer has a similar energy. A visible, enthusiastic, expensive person is now in charge of the thing. The board is satisfied. The anxiety is absorbed. The Chief AI Officer may be this cycle’s Shingy: AOL’s “Digital Prophet,” attached to no measurable outcome. If your new CxO has started dressing like Sam Altman, that’s a tell.

[Image: David Shing, AOL’s “Digital Prophet,” speaking at a conference. Photo: Jarle Naustvik / CC BY 2.0]

3. Your pilot has been running for 18 months

Pilot purgatory: the initiative that’s always almost ready to scale. It succeeds enough to avoid cancellation, fails enough to avoid commitment. Organizations use pilots to feel like they’re transforming without actually having to.

Edward Yourdon wrote the book on this, literally. It’s called Death March, published in 1997. It describes the project where everyone involved knows it’s doomed, the timeline is fiction, and the definition of success shifts every quarter, but nobody calls it because calling it is worse than continuing. The pilot becomes its own justification. It has a Slack channel. It has a demo environment. It has a steering committee. It is, in every meaningful sense, a small civilization that has forgotten why it was founded.

I will say this for the Death March pilot: the engineers usually have a good time. The constraints are loose, the problem is interesting, and nobody is quite sure what done looks like, which means you get to play with genuinely cool technology for an extended period while calling it work. There are worse fates. Shipping is one of them, because then you have to support it.

At some point, a pilot that doesn’t ship is just a no that nobody had the courage to say out loud. The question is whether anyone has the standing to say it.

4. You automated your broken processes

Most processes in most organizations are not broken in the way a machine is broken. Duct tape holds them together: informal workarounds, tribal knowledge, and the institutional memory of people who know that you have to CC that one person on the approval email or the whole thing stalls, because of something that happened in 2019 that nobody fully documented. The process as written is not the process as practiced. The delta between them is the duct tape.

The Millennium Falcon is the canonical duct-tape system. It shouldn’t work. The official specs say it shouldn’t work. Han Solo will tell you it works fine while hitting a specific panel in a way that only he knows. Replace Han and Chewie with an automated system running off the official Corellian Engineering Corporation maintenance manual and you won’t make it out of the docking bay, let alone the Kessel Run.

AI doesn’t fix broken processes; it executes them faster with more confidence. What it actually automates, if you’re not careful, is the documented process: the clean, official version that nobody really follows. You strip out the duct tape, run the pristine workflow at scale, and discover approximately six weeks later that the duct tape was structural.

If your procurement workflow was a mess before, your AI-assisted procurement workflow is a faster, more confident mess. The failure mode is invisible until it isn’t.

5. Your AI surfaces insights nobody acts on

Before you deploy the AI dashboard, I’d ask one question: is anyone actually looking at the dashboards you already have? In my experience the answer is usually “sort of” at best and “we forgot that existed” at worst. The problem was never access to information. It was what happens, or doesn’t happen, after the information arrives.

You built the dashboards. You deployed the models. The insights are accurate and the reports are beautiful and nothing changes.

The control room at Chernobyl was full of instruments. The operators could see the readings. The data was there, accurate, in real time, telling them exactly what was happening. The problem wasn’t the information. It was that the organizational culture made acting on it nearly impossible. To question the reading was to question the process, which was to question authority. The insights sat there, unacted upon, until reactor four made its own contribution to the discussion.

You optimized the output without touching the decision-making structure. Information without accountability is decoration.

6. Your data was always a disaster

If the last failure was doing nothing with good information, this one is doing something with bad information. One is paralysis. The other is worse.

HAL 9000 wasn’t malfunctioning. He was operating with complete certainty on contradictory inputs: report information to the crew accurately, and conceal the true purpose of the mission from them. The inputs were irreconcilable. HAL computed a solution that was internally consistent and catastrophic in practice. “I’m sorry, Dave” isn’t a story about AI going rogue. It’s a story about what happens when the model is confident and the foundation isn’t there.

AI doesn’t fix data quality; it depends on it and amplifies the consequences of poor data. The optimism about AI capabilities quietly assumes clean, structured, well-governed data. Most organizations don’t have that. They have years of inconsistent entry, siloed systems that have never talked to each other, and fields that mean different things in different departments. They discover this approximately six months into the transformation, when the model starts confidently producing answers that are wrong in ways that are hard to explain to a steering committee.

7. Middle management is quietly killing it

Enthusiastic demos in the all-hands, zero uptake in the day-to-day. Middle managers aren’t malicious; they’re rational. Their job security, their team’s workload, their performance metrics: none of it changed. Why would they champion a tool that makes them feel replaceable?

This is, again, the Golgafrinchan problem. The telephone sanitizers on the B Ark weren’t bad at their jobs. They were perfectly adapted to the environment they were in. The environment just happened to be an ark with no destination and no particular reason to exist. They held meetings. They formed committees. They were rational actors in an irrational system, which is the most human thing possible.

Adoption theater is a rational response to a context where demonstrating AI’s value might accelerate your own redundancy. You can’t transform a workforce that has good reasons not to trust you. The problem isn’t the skepticism. It’s that the skepticism is earned. If you announced this transformation alongside a round of layoffs, you defined the frame yourself. Every subsequent AI initiative gets read through it, regardless of intent.

8. You let vendors and consultants define your roadmap

Two failure modes here, related but distinct.

The first is the consulting firm. McKinsey has an AI transformation practice. So does BCG. So do Deloitte, Accenture, and every boutique that pivoted in 2023. The deck will be polished. The framework will have a name and a two-by-two matrix. The roadmap will look authoritative. What it will not have is accountability for outcomes, because by the time the implementation unravels, the partners are on a plane to their next engagement and a fresh cohort of analysts is being onboarded to your account. The criticism of big consulting has been building for years: they have no skin in the game. They deliver the document, collect the fee, and are structurally insulated from the consequences. That dynamic does not change because the slides now say “AI” instead of “digital.”

The second is the software vendor. Every product you already own now has an AI feature. Some of it is genuine. A lot of it is a thin wrapper around an API call, a feature flag quietly renamed in the marketing copy, or an autocomplete function that existed three years ago and has been repackaged for the current moment. “AI-powered” is the new “e-” prefix: applied to everything, meaningful in a fraction of cases. Vendors have demos tuned for your anxieties, case studies selected for their best outcomes, and pricing designed to make switching expensive. Their incentives aren’t yours.

Your transformation strategy shouldn’t be a vendor’s sales motion with your logo on it, or a consultant’s framework with your name in the footer.

9. You imported a Silicon Valley timeline into an organization that doesn’t live there

The hype cycle comes out of a small geographic and cultural bubble: a handful of companies, investors, and journalists whose incentives run on urgency. They generate a persistent sense that you’re already behind, that failure to move immediately is existential. The urgency is real for them. It may not be real for you.

Think about Severance. The innies on Lumon’s severed floor have no access to the outside world: no concept of weekends, families, the passage of time, or any reality beyond the walls of their department. They operate with complete conviction inside a context that is self-contained and wholly disconnected from how the rest of the world works. Silicon Valley runs on severed time. The founders, investors, and journalists generating the hype cycle are innies: completely inside their own context, with no memory of what it’s like to operate a 40,000-person enterprise with a compliance department, a unionized workforce, and a procurement cycle that moves in years, not quarters. They aren’t being dishonest when they tell you to move faster. They genuinely don’t have access to your floor. They don’t know it exists.

Most organizations are not software companies in San Francisco. A mid-sized manufacturer, a regional bank, a hospital system with twelve regulatory bodies, a government agency: these have different risk profiles, different regulatory environments, different workforce compositions, and different definitions of what “moving fast” even means. The transformation appropriate for a 200-person startup is not the transformation appropriate for an organization where a software procurement decision requires three committees and a security review.

Lumon’s productivity metrics make perfect sense on the severed floor. Importing them into your organization and wondering why nobody is keeping up is the whole problem.

10. You treated it as a technology project

It’s a culture problem wearing a technology costume. The technology (actually getting the models running) is the solved problem. The unsolved problem is getting people to change how they work, what they measure, and how they make decisions. IT can ship the software. Nobody can ship the culture change from IT.

A people transformation requires leaders who are willing to transform themselves. That means changing how they spend their time, what they reward, what they tolerate, and what they model. Most don’t. They announce the transformation and exempt themselves from it. President Skroob announces the plan to steal Druidia’s air, Dark Helmet goes out to execute it, and Skroob is back at Spaceball City at no personal risk whatsoever. He is the transformation that flows downhill. Everyone else is combing the desert.

The Empire didn’t automate Darth Vader’s job. It automated the stormtroopers. The officers stood on the bridge giving orders, insulated from consequences, right up until the Death Star turned out to have a two-meter exhaust port that Grand Moff Tarkin’s technology-first thinking hadn’t accounted for. The people who built it, maintained it, and operated it all died with it. The people who designed the strategy did not build it.

The AI transformation that flows downhill, automating the work of people with the least power while insulating the people with the most, isn’t a transformation. It’s a restructuring with better PR. The tell is whether the efficiency gains show up in headcount reductions at the bottom and bonuses at the top. They usually do.


None of this is specific to AI. Digital transformation failed this way. Agile transformation failed this way. The organization that thrives won’t be the one that moved fastest; it’ll be the one that thought most clearly about what it was actually trying to do and why.

That’s slower. It requires admitting uncertainty. It looks less like leadership and feels less like progress.

The phrase, endlessly misattributed to Peter Drucker, has been on boardroom walls for two decades: “Culture eats strategy for breakfast.” Nobody seems to know who actually said it first. The culture ate the attribution. The technology is the breakfast. The culture does the eating.


Further Reading

Douglas Adams, The Restaurant at the End of the Universe (1980) — The Golgafrinchan B Ark story is in the second Hitchhiker’s book, not the first. The telephone sanitizers, middle managers, and marketing consultants are in chapters 33 and 34. The first book gets all the attention; this one has the better organizational satire.

Edward Yourdon, Death March (2nd ed., 2003) — Directly cited above. The definitive taxonomy of why projects that should be cancelled keep going. The second edition was revised after the dot-com bust, which gave Yourdon excellent source material. Still accurate.

Walt Bogdanich and Michael Forsythe, When McKinsey Comes to Town (2022) — An investigative account of McKinsey engagements across industries. Makes the no-skin-in-the-game argument with receipts. The AI consulting wave has arrived since publication; the dynamics described have not changed.

Luc Besson, The Fifth Element (1997) — Zorg (Gary Oldman) is in it. The quote is real. If you haven’t seen this movie you really should.