

The Reengineering Is Coming

Leadership Negligence, Not Technology Failure, Sets the Stage for AI's Most Expensive Correction

In 1987, Nobel laureate Robert Solow made an observation that still echoes: "You can see the computer age everywhere but in the productivity statistics."1

Tyson Heaton

Companies had spent a decade pouring money into personal computers. Every desk had one. Every department had a budget line. And the returns? Nowhere.

Thirty-eight years later, MIT's Project NANDA delivered a finding that should sound familiar: 95% of enterprise generative AI pilots have produced zero measurable impact on the P&L.2 RAND Corporation puts the broader AI project failure rate at 80.3%.3

It is a pattern. And if history holds, it will echo again.

Three Waves, One Pattern


Stanford economist Paul David traced this problem back to the 1880s. When electric dynamos arrived in factories, owners ripped out the steam engine and dropped in the new motor. Same floor, same layout, same management. Productivity barely moved. It took 40 years before managers figured out that electricity meant every machine could have its own motor, the factory could be redesigned on a single floor, and production could follow the flow of materials rather than the constraints of a central power source.4 When they finally did, manufacturing productivity surged. The technology had been available for decades.

The personal computer repeated the cycle. Companies automated their existing paper processes, digitized their filing cabinets, put their old forms on screens. Then came enterprise resource planning (ERP) systems, promising integration and productivity. More often than not, ERP encoded the existing dysfunction into expensive software: the handoffs, approval chains, departmental silos, information hoarding. Relatively little changed. The technology made the existing mess run faster.

By 1990, the frustration was acute enough that Michael Hammer wrote his famous Harvard Business Review article: "Reengineering Work: Don't Automate, Obliterate."5 Companies had spent years using technology to "mechanize old ways of doing business." He called it "paving the cow paths."

Hammer's solution was business process reengineering: radical redesign from the ground up. Ford's accounts payable went from 500 people to 125 by eliminating the invoice entirely. Mutual Benefit Life collapsed a 30-step, 19-person, five-department insurance application process into a single case manager with expert systems.6

The idea caught fire. By 1994, 69% of Fortune 500 companies had a reengineering initiative underway.7 By 1995, BPR had become a $51 billion consulting industry.8 And then it crashed. Studies established that 70% or more of reengineering efforts failed or made things worse.9 In 1993 alone, large US firms announced roughly 600,000 layoffs, 25% more than the year prior. Eighteen of the 25 biggest downsizings were at companies actively pursuing reengineering.

Starting in 1995, the three founders of BPR, Hammer, James Champy, and Thomas Davenport, all issued apologies. Hammer admitted to The Wall Street Journal that he "wasn't smart enough" and had been "insufficiently appreciative of the human dimension."10 They had, in their own words, forgotten about people.

The Real Diagnosis

The standard telling of this story focuses on technology lags. "It takes time for organizations to learn how to use new tools." That is true as far as it goes. But it lets leadership off the hook. It implies that the gap between technology investment and productivity is a natural phase, like winter before spring. Just wait it out.

Look closer, though, and the lag is not evenly distributed. Some organizations appear to move slowly with new technology, but unless you can see inside their operations, you won't know the difference between restraint and neglect until the results come in. The ones applying technology effectively and selectively end up further ahead precisely because they didn't rush to automate what they didn't understand. The organizations that struggle for years to extract value are not victims of a timing problem. They are organizations where leadership has lost touch with the work and, critically, with the people doing the work.


Think about what Hammer found at Ford. More than 500 people doing a job Mazda accomplished with five. Even adjusting for scale, Ford was five times the size it should have been.11 How does that happen? Not overnight. That is the accumulation of years of nobody in leadership looking at the work, understanding why 14 data items needed to match across three documents before a payment could issue, or talking to the people who spent their days chasing mismatches they knew were preventable.

Hammer had a devastating sidebar in that article: "Why Did We Design Inefficient Processes?" His answer: we didn't. The founder delegated a chore to Smith. Smith improvised. The business grew. Smith hired his clan. They all improvised. The hodgepodge was passed from one generation to the next. "We have institutionalized the ad hoc and enshrined the temporary."12

This is what leadership negligence looks like. Not dramatic. Not a single decision. The slow abdication of responsibility for understanding the work and knowing the people who do it. Direct engagement gets replaced by KPI dashboards, accounting reports, and specialized support departments. Leaders manage from the scoreboard and lose sight of the field. Then, when things get bad enough, they call in consultants who arrive as the proverbial whipping boy, there to take the blame for cuts and trauma that years of neglect made inevitable.

When organizations drift this far from their own operations, no technology saves them. It just makes the dysfunction more expensive. And when the correction finally comes, it lands on the people who were never the problem.

But it didn't have to be this way. Hammer was rigorous about understanding processes end-to-end. Yet he was a computer scientist and an IT consultant, and his solution reflected it: radical redesign enabled by technology, carried out by expert teams. He pointed to Japanese competitors as proof that dramatically better performance was possible, but he never examined deeply what made them better. In 1990, the same year as his article, researchers at his own institution published the answer. MIT's International Motor Vehicle Program released The Machine That Changed the World,13 documenting Toyota's system in detail: an organization that developed its people as problem solvers and its leaders as cross-functional integrators who understood the end-to-end flow of value, and that oriented everything toward the customer. Toyota didn't need BPR. It was the counterexample hiding in plain sight.

The Human Cost of Deferred Improvement

Here is what BPR revealed, and what the AI correction will reveal again: the people closest to the work know the nature of the work. They know how it flows through their hands, what slows it down, what breaks, and what they spend their time on. What they may not know is which of those steps add value and which don't, because nobody has helped them see the end-to-end picture or how it ties to top-level objectives.

That is leadership's job. Not to do the work, but to create the conditions where the people doing the work can see how it connects to customer value, can distinguish between what matters and what is legacy friction, and can participate in improving it. When leadership abdicates that responsibility, and nobody is helping the workforce see the whole, the knowledge of how the work operates stays locked inside the heads of individuals, still invisible to the people making decisions and disconnected from the value it is supposed to create.


When the reckoning finally arrives, it arrives as surgery: outsiders mapping processes that insiders could have described years ago, then recommending cuts that years of steady improvement could have prevented. The 600,000 layoffs in 1993 were not an inevitable consequence of technology or competition. They were the accumulated cost of leadership failing to improve continuously, failing to involve the workforce in that improvement, and failing to grow the value of what the organization delivers.

This is the part that gets lost in every retelling. Organizations that improve incrementally don't just avoid mass layoffs. They grow. When leadership does the work to challenge and improve the system, something shifts. The capacity and energy that people used to spend responding to instability, cleaning up after messy processes, and compensating for broken handoffs get redirected toward delivering more value to the customer. But it's not just the efficiency that changes. The nature of the work itself changes. People who used to chase mismatches and file unused reports can be developed into people who solve real problems, who innovate, who create value that didn't exist before.

As the organization becomes more responsive to customer value, something else happens: the cost of distributed innovation drops. Teams can experiment, integrate, and iterate faster because the underlying system supports it. This is how large organizations create the preconditions to innovate at scale. Lean practice doesn't just protect margin; it builds the agility and adaptability that make innovation possible across the entire delivery system.

BPR inverted this. It treated people as cost to be eliminated rather than capability to be developed. It's no accident that 70% of those efforts failed. You cannot redesign work effectively when you've excluded the people who understand it. And you cannot sustain any redesign when the people left standing view the organization as something that discards them the moment a consultant arrives with a process map.

Current State

Consulting firms have poured over $10 billion into AI initiatives since 2023. BCG went from zero to $2.7 billion in AI revenue in two years. McKinsey deployed its internal AI tool to over 7,000 consultants.14 Every major firm has an AI transformation practice they'd love to sell you.

Meanwhile, 42% of companies missed their revenue targets in 2025, up from 32% in 2024. Sixty percent of firms lack the data foundation to scale AI. And 91% of executives still believe they'll hit their 2026 targets.15

McKinsey's own 2025 data contains a buried finding: the companies seeing real impact from AI are three times more likely to have fundamentally redesigned workflows.16

Prediction

Within two to four years, a critical mass of organizations will acknowledge that their AI investments have failed to produce meaningful returns. The executives who presided over this will look for explanations outside themselves. "Everyone was having pilot failures." "The technology wasn't mature." "The vendors overpromised."

Then the consultants will arrive. Not with AI solutions this time. With process reengineering. It will be called "AI-enabled business transformation" or "intelligent process redesign" or something designed to sound new. But the work will be the same: outsiders mapping processes that leadership hasn't examined in years, identifying inefficiencies that accumulated during decades of neglect, recommending dramatic restructuring. And, once again, the people doing the work will bear the cost of leadership's failure to stay connected to it.

McKinsey, Bain, BCG, and the Big Four will have a second heyday. After selling the AI that didn't transform, they will sell the transformation the AI was supposed to enable.

The Divide That Matters

The divide that will define the next decade is not between companies that adopt AI and those that don't. Everyone will adopt AI. It is between organizations that understand their value streams and their people deeply enough to apply AI to real problems, and those that spray AI at processes they haven't examined in years, hoping technology will do what leadership would not.

The first group will use AI to accelerate improvement, surface problems faster, and extend human capability. They won't need dramatic interventions because they never let the gap grow into a chasm. Their people will be partners in the transformation, not casualties of it.

The second group will cycle through pilots, declare premature victories, bury failures, and fall further behind. When the reckoning comes, they will face the same choice companies faced in the 1990s: radical surgery by outsiders on an organization that doesn't understand itself, with all the human damage that implies.

What This Means Right Now

If you are a leader reading this, the question is not whether your AI pilots are succeeding. The question is whether your leadership team can walk your end-to-end workflows and explain what is happening. From customer inquiry to sale. From sale to program launch. From order to delivery. From customer request to feature release. Can they tell you where problems are surfacing, where handoffs break down, where dependencies create drag? Do they know what good looks like at the endpoints, or are they managing from the scoreboard?

Lean goes deeper than tools and methods. It is a leadership orientation toward the work, the people, and the relentless pursuit of value. Organizations that hold this orientation don't need periodic revolutions. They evolve, and the people inside them evolve with them.

The reengineering is coming. The organizations that see it coming have time to choose a different path. That path requires something harder than buying technology or setting up a new department. It requires leaders who will go to the work, know the people, and take responsibility for improving both.


  1. Robert Solow, "We'd Better Watch Out," review of Manufacturing Matters by Stephen S. Cohen and John Zysman, New York Times, July 12, 1987.
  2. Sheryl Estrada, "MIT report: 95% of generative AI pilots at companies are failing," Fortune, Aug. 18, 2025, citing MIT Project NANDA, The GenAI Divide: State of AI in Business 2025, July 2025.
  3. RAND Corporation, 2025 analysis of AI project outcomes, compiled with MIT, Gartner, and S&P Global data in Pertama Partners, "AI Project Failure Rate 2026."
  4. Paul A. David, "The Dynamo and the Computer: An Historical Perspective on the Modern Productivity Paradox," American Economic Review 80, no. 2, May 1990.
  5. Michael Hammer, "Reengineering Work: Don't Automate, Obliterate," Harvard Business Review, July-August 1990.
  6. Ibid.
  7. Bala Balachandran, "Business Process Reengineering: Its History, Promises, and Problems," Kellogg School of Management, 1999, citing CSC Index, State of Reengineering Report, 1994.
  8. Art Kleiner, "Revisiting Reengineering," strategy+business, July 1, 2000.
  9. Ibid.
  10. Cari Tuna, "Champion of 'Re-Engineering' Saved Companies, Challenged Thinking," The Wall Street Journal, Sept. 6, 2008.
  11. Michael Hammer, "Reengineering Work: Don't Automate, Obliterate," Harvard Business Review, July-August 1990.
  12. Ibid.
  13. James P. Womack, Daniel T. Jones, and Daniel Roos, The Machine That Changed the World (Rawson Associates, 1990).
  14. Marin Ivezic, "2026 Consulting's AI Revolution Update: Billions Spent, But the Old Pyramid Persists," Consulting and AI, January 25, 2026.
  15. Bain & Company, 2026 B2B Growth Agenda, March 30, 2026.
  16. McKinsey & Company, "The State of AI in 2025: Agents, Innovation, and Transformation," November 5, 2025.

About the Author

Tyson Heaton

Org Strategy

Tyson Heaton spent 15 years running operations in manufacturing at JBS, Schreiber Foods, and Greencore, learning lean on the floor before moving into enterprise technology leadership at O.C. Tanner. The frustration he found there became a throughline: organizations that had mastered continuous improvement on the shop floor were treating their technology layer as a black box, a dependency to manage rather than a capability to master. He now leads LEI's LeanTech/AI initiative, working with organizations that are done accepting that tradeoff. The work is moving faster than most can keep up with, and he'd rather help sharpen that thinking than watch from the sidelines.
