From the mid-’90s through the mid-’00s, I worked as what was then called an eLearning and knowledge management consultant for what were then called training and development departments. It was an interesting time in the sense that computer-based training was just making the transition to the web, and systems to author and deliver that training at scale were starting to come onto the market. That work enabled me to transition from classroom teaching to EdTech. I learned a lot of craft. I experimented with designing interconnected systems of “reusable learning objects” back when that term first started gaining currency. I designed a learning content management system back before that term existed. It was interesting.
For a while.
The truth was that, while we were learning a lot about using new tools to scale up training, the output was of relatively low value to the organizations (as evidenced in part by the fact that training and development was always the first department to get hit when budget cuts came).
There were several problems. First, we weren’t good at developing truly flexible and effective training, or at matching that training to organizations’ changing needs. Out-of-the-box content usually didn’t match the specific needs of a particular organization, while reusable learning objects didn’t turn out to be terribly reusable in practice. In 2002, a couple of years before I started e-Literate, I published an article called “How to Design Recyclable Learning Objects.” It still holds up today. The whole approach of assembling content Lego blocks is fatally flawed. It’s not useless, but it often fails to yield enough value to justify the investment.
Second, very often providing somebody with training wasn’t the right solution. Maybe they’d learn the right thing but would forget it by the time they needed it. Maybe they were learning to work around a broken process or broken software when the real fix was to change the process or software. And the training content was not keeping up with the changing business processes and knowledge needs within the organization.
So consultants like me invented various adjacent approaches. Knowledge Management. Electronic Performance Support. Performance Analysis. Business Process Analysis. These methodologies were often pretty labor-intensive and were not widely taken up by businesses.
I left the corporate world partly because I felt it was not evolving and I couldn’t make an impact until it started to change. More recently, with all the buzz and investment in what is now often called “workplace learning,” I’ve been curious. What’s changed? Has this space gotten interesting again? Luckily for me, I subscribe to GSV N2K, which is one of very few daily link newsletters that I still follow. (My rule of thumb is that if I don’t click on an average of at least two links a day, then I unsubscribe.) So I’ve been taking a peek at how much this space has evolved in the last two decades.
While I can’t claim to have a good grasp of the space at this point, my initial impression is that it sadly hasn’t evolved nearly as much as I had hoped.
Industry needs to learn to listen to industry
There’s a mantra these days that higher education needs to get better at listening to industry so it can better prepare students for work. And while there is definitely some truth to that, it assumes that “industry” knows what it needs its workers to know. Former HP CEO Lew Platt once famously said, “If only Hewlett Packard knew what Hewlett Packard knows, we’d be three times more productive.”
In other words, a lot of vital know-how is locked up in pockets within the organization. It doesn’t reach either the training folks or the HR folks. So how are either universities or EdTech professional development companies supposed to serve an invisible need?
A recent Forbes article entitled “Going From Learning Provision To Performance In L&D” suggests that “industry” isn’t much closer to solving this problem now than it was when I left.
Here’s the first key passage:
The problem is that many L&D leaders are choosing a slightly different approach to what’s always done. They’re looking to vendors with their silver-bullet solutions, and only now they’re searching for “smart” ways of matching generic content to the most commonly required skills across the entire workforce. On the surface, this might seem to make sense. With millions of items of learning content and only 50,000 employees, the algorithm will find something for everyone, right? That’s what we’ve bet the house on.
Shouldn’t the same generic content sold to so many different organizations still work in your unique culture? I’m sorry to say it likely won’t since all organizational expectations are not the same. The lack of engagement from employees is a clue because they know it too. It’s not that they don’t know how to learn or they don’t like to learn online. It’s because their experience tells them that their valuable time spent “learning” might not equate to actual skills development.
Going From Learning Provision To Performance In L&D
This is the “Netflix of education” canard or, if you’re slightly older, the “smart playlist” fantasy.
AI doesn’t make learning objects any more reusable because, as I illustrated in that 2002 article, the key missing ingredient is context. The same lesson may not work in two different workplaces because the examples are wrong. Or because the content needs to be connected to the previous and subsequent bits of content in ways that help the learner make sense of what they’re being taught. As the Forbes article puts it,
The crux of this is that L&D’s preoccupation with more and better “learning” is a dead end if it doesn’t fully incorporate the work context and what employees are trying to do. The development of a learning provision cannot predictably and reliably affect performance if it’s off-the-shelf or only loosely tailored to an organization’s culture. When put in the context of the skills gap, if solutions (programs or content) don’t reflect how the work is done in your organization then they can’t positively affect capability and performance.
Going From Learning Provision To Performance In L&D
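To make that concrete, here’s a minimal sketch of what the “smart playlist” approach boils down to. (The content library, tags, and function here are my own hypothetical illustration, not any vendor’s actual system.) Notice what the relevance score is computed from and, more importantly, what it isn’t:

```python
# Hypothetical sketch of a tag-matching content recommender -- the
# "smart playlist" pattern -- not any vendor's actual system.

CONTENT_LIBRARY = [
    {"title": "Negotiation Fundamentals", "tags": {"negotiation", "communication"}},
    {"title": "Data Literacy 101", "tags": {"data", "analytics"}},
    {"title": "Giving Feedback", "tags": {"communication", "management"}},
]

def recommend(required_skills: set, library: list, top_n: int = 2) -> list:
    """Rank library items by how many required skill tags they share."""
    scored = [(len(item["tags"] & required_skills), item["title"]) for item in library]
    # The score is pure tag overlap. The learner's actual tasks, the
    # organization's processes, and how lessons connect to one another
    # never enter the calculation -- the missing "context."
    scored.sort(reverse=True)
    return [title for score, title in scored[:top_n] if score > 0]

# A skills-gap report says employees need "communication" and "negotiation":
print(recommend({"communication", "negotiation"}, CONTENT_LIBRARY))
# -> ['Negotiation Fundamentals', 'Giving Feedback']
```

Everything the Forbes author is pointing at (the learner’s actual tasks, the organization’s processes and culture, the sequencing that helps a lesson make sense) is invisible to a scoring function like this one, no matter how many millions of content items you run it over.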
So what’s the answer? Needs analysis? Nope:
One of the stock-in-trade tools in L&D has been the learning needs analysis. This has helped L&D find out what training (or learning) is needed to further bolster the learning provision and determine how we spend our budgets. But it rarely relates to actual jobs: the very tasks, interactions, roles, expectations and outcomes employees are measured against. The learning needs analysis helps aggregate common needs so we can develop or purchase standardized programs and content that covers all bases without doing much analysis of the work context. The logic is: If we add more courses and buy big enough suites of generic content then there must be something for everybody, right? Surely? Except this doesn’t stand up to any level of scrutiny.
Doing the course or completing the content does not mean reskilling. We know this, but we often ignore it. The big learning content vendors can give us the credits and produce certificates, but completion doesn’t mean competence. Not even close. It just means you were there.
Going From Learning Provision To Performance In L&D
This is essentially the same criticism leveled at higher education. The degree, the credential, the certificate of completion doesn’t mean much even if the educational experience it represents is built to address a needs assessment conducted by trained professionals inside the company.
A quick skim of this CLO Magazine article suggests that the Forbes author is right about the continuing blind spot. The author, who is the Senior Director of People Growth and Enterprise Skills Strategy at Warner Bros. Discovery, conducted a study of training interventions designed to take input from employees, as well as of how employees go about using the training resources at their disposal. The main conclusions, from my perspective, were (1) employees are more engaged in training if you ask them what they want, and (2) employees decide what training they need when they need it and will be as likely to go to YouTube as to that library of AI-curated content that the employer paid for.
So what is the author’s answer? Performance analysis:
Performance analysis means seeking the answers to questions that relate to what employees are expected to achieve and that they are not able to easily or effectively. What is actually being observed and what are the implications of things not working the way they should? Recognize those responsible for the work and the expected deliverables. Completely articulate how things should be working. L&D professionals have been too quick to interpret performance needs into learning needs and completely distort reality in order to develop courses. But an exploration of the way the work is done and the results expected will help us move from perpetual anxiety about our value and impact to actually affecting the way the work is done.
Going From Learning Provision To Performance In L&D
This is…the same idea we were advocating for (and failing to get traction on) twenty years ago. It’s hard. It’s labor-intensive. It requires specialized skills. And it constantly needs to be refreshed.
Waiting for the next EdTech crash
Right now, companies that sell libraries of training content are hot. I’m not saying these products are bad. I’m not saying they’re useless. I am saying that there is an element of faddishness contributing to their valuations and ability to raise capital. And I’m saying that the same tired solutions—either buying libraries of whole courses or buying libraries of lesson bits—are not going to become game changers just by sprinkling some AI fairy dust on them. You can’t fix a fundamentally limited approach to constructing effective learning experiences with a matching algorithm.
There are ways to approach these difficult, thorny problems. We’re trying out several of them in my day job. But my main point for the purpose of this article is that there seems to be an assumption that building a workplace learning EdTech company is easier than building one for higher education. I will grant you that selling to corporations is less painful than selling to universities (though it is still very, very painful). But I think investors are going to discover that the fads and crashes in the corporate space are no less dramatic than they have been in higher ed. Just because something is selling well now doesn’t mean that it is solving a real problem and will remain hot.
If you want to develop lasting product/market fit with a learning product, regardless of the sector that it’s in, you’ll need to look deeper than the buying patterns of the moment. This is a complex space whose ongoing success cannot be predicted by purchases, initial usage, or even the first couple of years of repeat purchases. It takes a while to figure out if learning interventions are working, especially if you’re not studying their impact (which most organizations are not).