The Boundary Is Moving: AI Changes What We Expect from Team Leaders, Managers, and Executives
About a year and a half ago at O.C. Tanner, I was doing a shop-floor visit when a night shift team leader showed me something that changed how I think about AI and manufacturing. He'd taught his iPhone to spot defects in aluminum finish.
Paint defects were a recurring problem; issues regularly made it past team members. We'd discussed vision systems but always dismissed them as too expensive or as requiring specialized integration support.
This team leader had been in the role for maybe six months and was relatively new to the company. He had moved through the ranks quickly but was still learning the operation. Yet in weeks, not months or years, he'd built a proof of concept for technology I'd written off as not applicable, demonstrating what might take experienced operators 12 months of observation to develop. Technology held the complexity that used to live in human expertise.
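To give a sense of how low that barrier has become, here is a minimal sketch of what a "good vs. defect" classifier proof of concept might look like in Python, assuming a folder of labeled photos of finished parts. It is an illustration of the general pattern, not the team leader's actual build; the folder names and hyperparameters are placeholders.

```python
# Minimal transfer-learning sketch for a "good vs. defect" finish classifier.
# Assumes a directory layout like:
#   finish_photos/good/*.jpg
#   finish_photos/defect/*.jpg
# (folder names and hyperparameters are illustrative placeholders)

import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Standard preprocessing for an ImageNet-pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

dataset = datasets.ImageFolder("finish_photos", transform=preprocess)
loader = DataLoader(dataset, batch_size=16, shuffle=True)

# Start from a pretrained backbone and retrain only the final layer.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# A few passes over a small labeled set is often enough for a rough
# proof of concept; rigor (validation, lighting control) comes later.
model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch + 1}: last batch loss {loss.item():.3f}")
```

On a phone, the same pattern shows up as on-device tooling that fine-tunes a pretrained vision model on a handful of labeled photos. The point is not the specific stack; it's that a few dozen lines and a few hundred images now get you something that used to require a vision-system vendor.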
Since then the tools have only gotten better. The boundary is moving. Not someday. Now. And having tested similar boundaries for over a decade, I can say the implications for organizational structure are more profound than most people expect.
At O.C. Tanner, an employee recognition software and custom awards company, we spent a decade systematically rotating manufacturing leaders across operations. Distribution to assembly to CNC milling to casting. Deliberately crossing boundaries most manufacturers treat as uncrossable.
The goals: Develop leaders who could improve any operation, not just the one they'd spent 20 years mastering. Build lean capability that transferred across contexts.
O.C. Tanner gave us a natural laboratory for this. We ran one of the most diverse manufacturing operations you'll find in one company: specialized CNC mills, electrocoating with precise chemistry control, metal alloy refining, cast trophy production with complex heat-treat requirements, high-volume distribution, laser engraving, and complex personalization. Simple to complex. Standard to custom. Forgiving to unforgiving.
Here's what we learned about leader rotations:
Rotations didn't work everywhere. In distribution and simpler manufacturing (fast feedback loops, forgiving errors, well-documented standards), rotations worked beautifully. In die and tooling, casting, and e-coat (slow feedback loops, catastrophic error consequences, high tacit knowledge), they were much harder. Three variables explained the difference: how fast the feedback loops were, how costly errors were, and how much of the knowledge was tacit rather than documented.
We had advantages most manufacturers don't: a 15-year average tenure meant the team could carry a new manager through the learning curve, and a psychologically safe culture meant leaders could say, "I don't know this yet, teach me," without losing credibility.
When rotations broke — when humility wasn't present, when a manager tried to fake expertise — team morale suffered. Quality slipped. The rotation became a tax everyone else paid.
These three variables showed us where complexity lived.
Over the last five years, we watched the complexity barrier drop. We used Confluence (a collaborative workspace for teams to organize information) to give team members and team leaders better access to knowledge. We invested in RPA (robotic process automation) to offload complexity. The rotation experiment showed us where we were lacking, and we systematically addressed the gaps. Technology started holding more of what used to require human memory and expertise.
And now, with generative AI, RAG databases, and plain-language translators, I'm watching that barrier drop faster than I ever expected. Remember those three variables that predicted rotation success? AI shifts the equation on all three.
The zones don't disappear. Casting is still harder than distribution. But the boundary moves. Operations that required five years of technical depth might now work at three years. Tasks that needed 18 months of tacit knowledge and recall might work at six months with AI augmentation.
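As one concrete illustration of what "technology holding complexity" can look like, here is a minimal sketch of the retrieval half of a RAG setup in Python: a team member asks a plain-language question and gets back the most relevant standard-work snippets, which a language model would then turn into an answer. The documents, question, and library choice are assumptions for illustration, not the stack we ran.

```python
# Minimal retrieval sketch: plain-language questions over process documentation.
# A production RAG setup would use embeddings and a language model to compose
# the answer; this shows only the retrieval step, with made-up snippets.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Stand-ins for standard work, one-point lessons, and troubleshooting notes.
documents = [
    "E-coat bath conductivity should stay in the specified range; drift above "
    "it usually means rinse contamination.",
    "Casting heat-treat cycle: ramp to temperature, hold, then air cool. "
    "Do not quench trophy alloys.",
    "If laser engraving depth is inconsistent, check lens cleanliness and "
    "re-run the focus calibration before adjusting power.",
]

question = "engraving depth keeps varying between parts, what should I check first?"

# Index the documents and the question in the same vector space.
vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(documents)
question_vector = vectorizer.transform([question])

# Rank documents by similarity to the question and return the best matches.
scores = cosine_similarity(question_vector, doc_vectors)[0]
ranked = sorted(zip(scores, documents), reverse=True)

for score, doc in ranked[:2]:
    print(f"{score:.2f}  {doc}")
```

The point isn't the library; it's that a newer team leader can now pull up, in plain language, knowledge that used to live only in a twenty-year veteran's head.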
This is the pattern: AI isn't replacing expertise — it's lowering the threshold of complexity where expertise becomes irreplaceable.
When technology can hold complexity, the question becomes: what do we need humans to do?
If AI lowers complexity barriers, the implications ripple through every level of the organization.
Team leaders become the linchpin.
In most organizations, the team leader is a firefighter and first responder, a line worker wearing an extra hat, or an admin clerk. That's if the organization has invested in the role at all. In a world where the organizations that evolve fastest survive, this has to change.
A team leader's ability to observe work firsthand, spot emerging practices, and socialize them across the team becomes the core driver of performance. The focus becomes studying and improving the work itself.
When AI enables rapid localized change (individual team members building their own workflows, solving problems in their own way) it becomes imperative to have someone studying and socializing those changes. The work increases: more variation to track, more methods to evaluate, more decisions about what to standardize. Their role becomes critical: observing work firsthand, ensuring people use the best methods, socializing standards so team performance improves and methods don't diverge.
Middle management shifts to cross-functional facilitation.
Middle management loses much of its historic role as summarizer and interpreter of technical knowledge. AI can aggregate information, spot patterns across data, and generate technical summaries.
What AI can't do, however, is navigate the social systems that enable change, build trust across functions, or accelerate the adoption of effective practices across value streams.
Middle management's value shifts entirely to cross-functional facilitation. They become the people who can look end-to-end across value streams, not domain protectors who aggregate reports upward. They work on relationship management, social system stability, and helping practices transfer from one area to another.
At O.C. Tanner, we'd intentionally added layers of middle management (not to aggregate reports, but to facilitate cross-functional improvement work). After 26 years on a lean journey, we knew local optimization wasn't enough. We needed people looking end-to-end across value streams.
AI lowers the barrier to doing that facilitation work well. The manager who understands the operation can now generate polished communications without needing a translator, stay connected to the work while meeting documentation requirements, and step into cross-functional facilitation with confidence.
Executives get exposed.
Technical support and functional departments shift from guarding specialized knowledge to enabling the integrated flow of value. AI democratizes technical knowledge: you don't need the legal expert to tell you what the regulation says, and you don't need the compliance specialist to interpret the rule. The technical experts who built their value on "only I know this" lose their moat. What remains is the work AI can't do: applying judgment in ambiguous situations, navigating the social systems that enable change, and helping the organization learn.
Executives who sit on top of those technical kingdoms get exposed. The ones who relied on being rulers of specialized domains have nowhere left to hide. Their value has to shift entirely to actual strategic work: setting enterprise-wide direction, building shared purpose, making choices about where to compete and how to win. The executives who relied on "knowing more than you" rather than "seeing farther than you" won't survive.
Silos that were originally created to protect technical expertise come under pressure. Social capability — navigating relationships, building shared purpose, leading change through doing — rises in value.
Governance has to adjust to reward those who drive cross-functional flow, not just those who manage within their domains.
Here's where it gets harder: in knowledge work, divergence is invisible.
On the manufacturing shop floor, when you leave an operation and come back three weeks later, you can see what's drifted just by observing. Without the proper leadership routines, things shift: team members come up with creative "hacks" to keep the line running, they adjust equipment to cover the slop that gets passed to them, and new tools and techniques emerge. And you can see it.
In knowledge work, you can't. Just this morning, someone asked where our video assets were stored. We discovered seven different storage locations. Files had migrated over the years — partly based on where content was created, partly based on people's preferences, partly due to simple drift.
Ask a team what tools they use to complete the same editing task, and you'll hear Grammarly, ChatGPT, Claude, a human editor, or combinations of these in different orders. All different. All invisible unless you ask.
On the sales side, the techniques people use to capture insights from calls are all over the board. Different tools, different methods, different levels of detail. No one knows until you observe the actual work firsthand, and even then you might have to ask about connections and thinking.
Here's the problem: in manufacturing, I can walk the floor and spot tool modifications. In knowledge work, I'd have to pair with people to see how they actually work. I'd need to dig into the prompts they're using, the instructions they've saved, and the workflow automations running in the background. The work methods are hidden.
And with AI, this divergence accelerates. Everyone now has access to powerful tools that let them build their own workflows, create custom solutions, develop personal systems. The speed at which methods can diverge — and the sophistication of those divergent methods — is unprecedented.
Tyson Heaton leads Lean Tech at LEI, exploring the intersection of lean thinking and technology transformation. Previously VP of Manufacturing at O.C. Tanner.