Why L&D Should Stop Building Learning Programs, and Start Building Operating Systems
Enterprise L&D doesn't need another platform. It needs an intelligent layer that connects what we already have. Enter: AI agent ecosystems...
For most of its institutional history, enterprise L&D has operated like a service desk. Someone identifies a skill gap. L&D designs a program or identifies the right content. People sit through it. A survey goes out. The cycle repeats. The alignment of that activity to business strategy and priorities is now table stakes.
We've gotten very good at this. And it's no longer enough.
AI is transforming work faster than we can reskill for it. Skills go stale in months, not years. The workforce is more volatile than it's ever been. None of this demands a better training program. The business is asking for something that enables its workforce to shift at the speed of the business and the market. That means infrastructure that doesn't just respond to learning needs but anticipates them.
What I think that infrastructure looks like: not a learning management system, but a learning operating system.
Some quick framing before we go further. By "operating system" I mean an architectural metaphor, not a product you buy. I'm describing how the tools, data, and workflows you already have can connect deeply enough to behave as a unified system, even if no single vendor built it that way. More on what that actually looks like in a moment.
The Problem with "Programs"
Programs have a beginning, a middle, and an end. They're built for a moment: a priority skill gap, a product launch, an onboarding cohort. That's not inherently wrong. But they can't be executed at speed: they consume enormous resources to produce, deploy slowly, and are often obsolete by the time they reach the learner. In some cases the employee has long since moved past the original gap and has a more pressing need by the time our "program" finally arrives.
More fundamentally, programs are discrete. They don't talk to each other. They don't talk to the business. They don't talk to the learner's actual work context. They exist in a silo, usually inside an LMS most people avoid until their compliance training hits their transcript.
This isn't a content quality problem. Many L&D teams produce genuinely excellent content. It's an architecture problem. Even when we get behavior change metrics and connect our programs to real results, the cycle still moves slower than the business needs to. The framing that shifted my thinking: what jobs need to get done, and could an AI agent do them? Not just do them, but do them at a scale and speed that we as practitioners cannot.
When L&D operates as a ticketing system, it invites the business to treat it that way.
Think in Agents, Not Applications
That simple question, what jobs need to get done, was an unlock for me. When you look at enterprise L&D through that lens, the work breaks down into a surprisingly clear set of discrete, repeatable jobs.
Learners need the right content at the right moment, not a catalog with sixty thousand options. Leaders need to see actual behavior change on the job, or new skills being applied to the work. We need objective measures of whether learning actually built capability and closed skill gaps. Some of this exists today in dashboards and survey data, but it rapidly disconnects from new needs as they arise.
But here's what agents can't do on their own: understand business strategy well enough to know which skills matter most right now. Exercise the judgment and discernment to sift through the noise. L&D needs to point the system in the right direction. If everyone wants to learn basket weaving, someone needs to tell the agents this is not a business priority.
An agent-first ecosystem doesn't make L&D practitioners irrelevant; it elevates them.
The L&D practitioners who will matter most in the next five years are the ones who can look at those jobs and translate them into agent design. Leaders who understand both what good learning looks like and how technology can help us do it better at scale. I'm not suggesting every L&D leader become an AI engineer, but understanding what these tools can do and how they can 5x our impact will be crucial.
Learning can finally be part of work, not an addition to it.
A quick note on where this sits in the broader AI transforming L&D functions conversation. Marc Ramos recently published an important and extensive piece on the L&D Operating Model for the Agentic Era. It's a framework for redesigning the function across five layers: capability, process, governance, resource, and interface. If you haven't read it, start there, it's the most comprehensive treatment of how the function itself needs to change. What I'm describing in this piece sits alongside that framework. If the operating model is how you design the function, the agent ecosystem I'm describing is what the technical and intelligence capability inside that model actually looks like when you build it and by extension helps connect the dots beyond Talent Development to other People functions. Both layers matter. This article is about the second one.
What This Actually Looks Like in Practice
This operating system is not what delivers experiences; it makes them possible. It's invisible infrastructure. People are already choosing YouTube and TikTok over corporate content libraries, and honestly, can you blame them? Corporate L&D needs to raise its game, delivering learning seamlessly so employees build skills without even noticing.
In practice, this means three things: sensing learning needs continuously from work and business signals, matching the right content and modality to the moment, and measuring whether capability actually changed on the job.
This proposed shift is not without challenges. If you've ever tried to get two systems to share data cleanly inside a Fortune 500, you know the gap between this vision and today's reality. That said, the architecture is now possible in a way it wasn't two years ago, and the organizations that start building toward it now will have a significant head start.
Why Modality Matters
I want to challenge the idea that microlearning is the future of enterprise learning. Don't get me wrong: I firmly believe micro content has many advantages and a prominent place in enterprise L&D. It's fast to produce (or even instant via AI), and it's easy to deliver in chat as text or video. But if we're honest, that's a narrow slice of how people actually learn.
A new manager delivering their first difficult performance conversation doesn't need a quick nudge. They need to simulate it in a safe place to get the tone, pacing, and confidence right. That might mean an AI-generated roleplay with real-time coaching. It might mean a cohort-based experience where real learning happens between humans, facilitated by a real human being.
The content agent in an agent ecosystem needs to be modality-intelligent. It should be able to assess the learning need and match the optimal format. Yes, sometimes that will be quick hitting video or text based micro content. In other cases that will be longer form programs with thoughtful instructional design. It's not about one modality being better than another, it's having an ecosystem that can balance instant delivery with the kind of deep practice that actually changes behavior.
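What modality-intelligent routing could look like in miniature: a sketch in Python, where the need categories and routing rules are entirely hypothetical and hand-coded (a real content agent would learn or continuously refine them from data), but the shape of the decision is the one described above.

```python
from dataclasses import dataclass

# Hypothetical need profile; field names are illustrative, not from any product.
@dataclass
class LearningNeed:
    skill: str
    complexity: str       # "low" | "medium" | "high"
    behavioral: bool      # does success require practicing a behavior?
    time_sensitive: bool  # is the learner blocked right now?

def choose_modality(need: LearningNeed) -> str:
    """Toy routing rule: match the learning need to a delivery format."""
    if need.time_sensitive and need.complexity == "low":
        return "micro-content in the flow of work"
    if need.behavioral:
        return "AI roleplay simulation with real-time coaching"
    if need.complexity == "high":
        return "cohort-based program with human facilitation"
    return "curated self-paced pathway"

# The new-manager example from above routes to deep practice, not a nudge.
print(choose_modality(LearningNeed("difficult feedback", "medium", True, False)))
```

The point of the sketch is only that the routing logic lives in one place: the same agent that knows the need also picks the format, instead of every need defaulting to whatever the platform delivers best.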
What About The Humans?
If agents are doing the sensing, drafting, delivering and the measuring, what exactly are the humans doing? Everything that matters.
While rooted in strategic need, everything we're asking the agents to do is largely administrative: collecting needs, drafting and administering content, reporting. It's all tactical, task-level work. Another advantage? Agents scale better than people. They don't get tired, and they don't have time zone dependencies. But an agent can't tell the nuanced difference between a request for AI upskilling and a broken business process that won't be fixed by AI alone.
That's our job. It's always been our job. The difference is that in an agent-first ecosystem, we stop spending 80% of our time on logistics, content production, and reporting and start spending it on the work that requires a human brain. We understand the context. We understand the business strategy. We can ask more probing follow-up questions and ultimately exercise judgment on what matters and what is just noise.
An agent ecosystem doesn't eliminate L&D roles; it eliminates the parts of L&D roles that have kept us from doing the real work. Practitioners who thrive will be the ones who step into the space that opens up. Strategists. Curators of organizational intelligence. Partners who help business leaders unlock capability.
The Build / Buy / Bot Question
As with most big questions, the honest answer is: it depends. It depends on your resources, your timeline and who you can find to build with.
The first path is to build. Most large organizations already have significant AI infrastructure in place, Microsoft 365 tooling, Copilot licenses, cloud AI capacity. One approach is to extend that existing infrastructure. Tap into the existing scale and build an orchestration layer yourself. Tools like Claude, Lovable, Replit etc. can rapidly get you to a prototype of what's been described in this article. Doing it yourself can also help you avoid being locked into one vendor's vision.
The second path is to buy. The technical "plumbing" required to connect all these systems in a real way is complex and requires significant technical resources. If there's a vendor whose roadmap genuinely aligns with where you're headed, and who is willing to co-create your envisioned future state, buying may actually get you there faster. Calling this "buying" undersells what it actually is: no vendor platform can deliver this seamlessly off the shelf. The real version of this path is a genuine co-creation partnership. The vendor space is catching up fast. Josh Bersin's recent research on dynamic enablement documents how AI-native platforms now offer capabilities ranging from autonomous content generation to digital twins of subject matter experts.
The third path is to "bot" a solution: low/no-code helpers that get you closer to this vision faster than a full-blown agent ecosystem. This can also be a helpful starting place as you begin to conceptualize what an autonomous agent would do in these spaces. Bots, when directed with purpose, can do some of this sensing or creating for you. Think about giving Copilot a set of instructions for how you want storyboards created, then prompting it with your learning objectives and subject matter. The bot isn't the end state, but it's a legitimate first step.
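The storyboard-bot idea above is mostly prompt engineering, and it can be made repeatable with a few lines of code. A minimal sketch, assuming nothing about Copilot's actual API: the instruction text and function below are entirely illustrative, showing only how a reusable instruction set plus per-request objectives might be assembled into one prompt.

```python
# Hypothetical standing instructions for a storyboard-drafting bot.
# The wording is illustrative; a real team would tune this to their standards.
STORYBOARD_INSTRUCTIONS = """\
You are a storyboard assistant for enterprise learning content.
For each learning objective, produce: a scene description, on-screen text,
a narration draft, and one knowledge-check question.
Keep each scene under 60 seconds of narration.
"""

def build_prompt(objectives: list[str], subject_matter: str) -> str:
    """Combine standing instructions with this request's objectives and topic."""
    objective_lines = "\n".join(f"- {o}" for o in objectives)
    return (f"{STORYBOARD_INSTRUCTIONS}\n"
            f"Subject matter: {subject_matter}\n"
            f"Learning objectives:\n{objective_lines}")

prompt = build_prompt(
    ["Deliver constructive feedback", "Set clear expectations"],
    "New manager fundamentals",
)
print(prompt)
```

The value isn't the code; it's that the instructions become a versioned asset the whole team reuses, which is exactly the habit an eventual autonomous agent would formalize.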
It's less a "build or buy" debate than a question of where the intelligence needs to live and how fast you need to move. For some organizations, building an orchestration layer and buying what's available will be most effective. For others, a bot-first starting point that matures into something more is the right pace. For still others, a genuine co-creation partnership is the only way there. This is not a one-time platform decision; it's about what can evolve as the technology continues to advance. The rate of change will only increase.
The north star has to be building organizational capability to evaluate, integrate and iterate continuously, all in service of employee learning and development.
What This Requires of L&D Leaders
Regardless of how you get there, this shift will require something uncomfortable: L&D leaders need to start thinking like product managers and system architects, not just learning designers. By extension, much of our workforce also needs to begin thinking like product managers, but that's a topic for another day.
What this means is understanding API integrations and platform ecosystems, not just ADDIE and Kirkpatrick's model. It means making business cases in the language of workforce ROI and skills coverage, not learning hours. It also means building deeper relationships with our IT and HR Technology teams as partners, not approvers.
We need to come to terms with where our function has been complicit in its own marginalization. Our responsiveness to asks from the business has been admirable, but it has also perpetuated the idea of a ticket-fielding function. We earn strategic relevance by being ahead of the need, not behind it, and by delivering outcomes the business can actually feel.
Beyond L&D: Talent Intelligence Ecosystems
Everything discussed to this point has been about how an agent ecosystem could transform L&D. But here's the real thesis and where I think this conversation needs to go next: L&D is just the starting point. The agent ecosystem architecture I've been describing doesn't stop at the edge of the learning function. Where this model becomes genuinely transformative is when you extend the principle and agents across the entire talent lifecycle.
Open the aperture. Imagine a workforce planning agent that continuously monitors headcount, attrition patterns, and business forecasts. That agent identifies a critical capability gap, say, PyTorch, TensorFlow, and MCP server building, that will constrain IT's ability to deliver on the organization's AI roadmap over the next 18 months. Instead of hearing about it when we have a talent crisis, the workforce agent works with the needs analysis agent to get in front of the challenge now, delivering targeted development to those best positioned to close the gap.
Now layer in employee listening. An always-on listening agent doesn't rely on quarterly or annual survey deployments. It looks at manager pulse checks, learning program results and engagement patterns to identify dips or peaks in real time. It connects to learner needs data to surface where employees feel under-supported. It detects declining engagement in a team, cross references with the skills agent and notices an unaddressed capability gap. The content and delivery agents address the gap automatically. The organization is now responding to employee experience in real time.
One more example: performance management agents. They see objectives set in the HCM and cross-reference them against available skill data and needs analysis trends. The needs analysis agent flags capability gaps, and the content and delivery agents proactively serve up development resources. This feeds back to the manager via the analytics and feedback agent so they can ground their performance conversation in real data, not recollection. The power is in the connections: every agent making every other agent smarter. Talent acquisition, talent development, performance management, succession, workforce planning: all sharing data and insights with each other, continuously unlocking workforce capability at a scale and speed that siloed functions never could.
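The "agents making each other smarter" pattern above is, architecturally, publish/subscribe: each agent reacts to events other agents emit, with no agent calling another directly. A minimal sketch in Python, where the agent names, event types, and payload fields are all hypothetical and the "agents" are plain functions standing in for far more capable components.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal publish/subscribe bus connecting illustrative talent agents."""
    def __init__(self) -> None:
        self.subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        self.subscribers[event_type].append(handler)

    def publish(self, event_type: str, payload: dict) -> None:
        for handler in self.subscribers[event_type]:
            handler(payload)

bus = EventBus()
log = []  # records each agent's action so the chain is visible

# Skills agent reacts to a gap flagged by the listening agent...
def skills_agent(event: dict) -> None:
    log.append(("skills_agent confirmed gap", event["skill"]))
    bus.publish("gap.confirmed", event)

# ...and the content agent reacts to the confirmed gap.
def content_agent(event: dict) -> None:
    log.append(("content_agent built development path", event["skill"]))

bus.subscribe("gap.detected", skills_agent)
bus.subscribe("gap.confirmed", content_agent)

# A listening agent detects a capability gap behind declining engagement.
bus.publish("gap.detected", {"team": "IT", "skill": "MCP server building"})
print(log)
```

The design point: adding a new agent (performance, succession, workforce planning) means subscribing to existing events, not rewiring every other agent, which is what lets the ecosystem grow function by function.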
This is how L&D earns strategic relevance that sticks. Not by delivering better programs, but by being the function that generates the intelligence the business needs to inform its talent strategy.
The Opportunity in Front of Us
I've never been more optimistic about the potential of this function than I am right now. Not because AI makes our jobs easier. It genuinely complicates them. But because it makes the right version of our jobs finally achievable at scale.
The L&D leaders who move now to build intelligent infrastructure, who begin thinking in agents and ecosystems rather than platforms, libraries and programs, will define what this function looks like for the next decade. The ones who don't will be delivering marginally better content in slicker platforms.
The choice feels obvious to me. But I'm curious whether it feels obvious to you, or maybe the legacy constraints of the function make it feel impossible.
Here's what I genuinely want to know: if you lead an L&D function, what's the single biggest barrier between where you are now and building something like this? Is it technology? Budget? Executive buy-in? Something else entirely? I have some theories but would love to hear yours.