You know how, when you’re going on holiday for 5 days, you pack 10 days’ worth of socks? Because more of the same thing isn’t going to hurt anyone. Learning’s different.
Instead, imagine each sock takes a reasonable amount of time and effort to put on. And, even though some of the socks are useful, you don’t know which. And anyway you’re really too busy to spend all your time picking out the specific sock you needed. In the end you just wear flip-flops to save yourself putting your foot in it.
To move away from socks - the point is that, if they’re achieving the same thing, two pieces of content are much worse than one. But most businesses have multiple bits of learning where one would do. It creates content overload and puts learners off more than anything else.
So, one of the most important capabilities for an L&D team is judging what content to cut. This is a complicated process that has to be done at scale - you’re unlikely to be spending your entire job umpiring content playoffs.
However, to drill down to the basics of what should make you keep or lose some learning, it’s helpful to imagine that you just have to choose between two pieces of content. If you can get clarity on this, you can expand these fundamentals to the scale of your business’s curation.
You’ll have your own criteria and expertise, tailored to your skills framework and business needs. But, here’s a checklist of some key things to take into account when deciding between two pieces of content:
1 - Three stages of relevance
The first, and most important, deciding factor in which content should stay and which should go is relevance. Even if it’s gloriously insightful, if learning’s not tied to at least two of these three relevance factors, it’s not going to have the impact you want.
A - Business relevance
Corporate learning is for corporations. And making learning business-relevant isn’t just excluding musical instruments and juggling. It’s getting rid of anything that’s not actively contributing to your current business goals.
This means keeping an eye on corporate strategy documents, current initiatives, and industry-wide whitepapers to make sure that every bit of content is dovetailing to a singular purpose.
B - Personal relevance
The other side of the coin: if learners can’t see how they’ll be able to use learning, they won’t do it of their own accord. To understand what learners want, we use skills signatures, activity data, employee surveys, and career descriptions.
Of course, you won’t be choosing between two pieces of content for just one person. This is where the crucial second stage of curation, personalisation, comes in.
C - Immediate relevance
Some content is evergreen (or at least long lasting) but sometimes the thing that makes a bit of content better than the rest is its immediate relevance. This might be to a particular launch that’s just happened in your company, a particular innovation in your industry, or even a cultural/social hot topic.
Leveraging this relevance will encourage learners to connect their immediate priorities to what they can get out of the learning.
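The two-of-three rule above is simple enough to sketch in a few lines. This is an illustrative model only - the function name and boolean inputs are invented here, not part of any real curation system:

```python
# Hypothetical sketch: keep a piece of content only if it scores on at
# least two of the three relevance factors (business, personal, immediate).

def passes_relevance(business: bool, personal: bool, immediate: bool) -> bool:
    """Return True if at least two of the three relevance factors hold."""
    return sum([business, personal, immediate]) >= 2

# An evergreen piece tied to strategy and a learner's role passes,
# even without a topical hook.
keep = passes_relevance(business=True, personal=True, immediate=False)
```

A topical-but-irrelevant piece (only one factor) would fail the same check, which is the point: one kind of relevance on its own isn’t enough.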
2 - Value-per-word
Corporate learning time is maddeningly limited. So, if two bits of content cover the same subject, it won’t hurt to err on the side of brevity. Of course, if a longer piece is better, learners will prefer it. But we take a lot of time to distinguish between stuff that’s genuinely relevant and content that’s just interesting. And you can do this at a library level too, using a metric such as Cost per Qualifying Asset (CPQA).
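The article names CPQA without defining it, so the sketch below shows one plausible reading: total spend on a library divided by the number of assets that clear your relevance bar. The function and all the numbers are hypothetical:

```python
# Illustrative only: one plausible reading of Cost per Qualifying Asset.
# Total library cost divided by the count of assets that qualify
# against your relevance criteria. Figures are invented.

def cpqa(total_cost: float, qualifying_assets: int) -> float:
    """Cost per asset that actually clears the relevance bar."""
    if qualifying_assets == 0:
        return float("inf")  # a library with nothing usable is infinitely expensive
    return total_cost / qualifying_assets

# Two libraries at the same price: the one with more qualifying
# content is cheaper per useful asset.
print(cpqa(10_000, 250))  # 40.0
print(cpqa(10_000, 50))   # 200.0
```

The useful property of a metric like this is that it punishes bloat: adding a thousand mediocre assets to a library raises its cost without lowering its CPQA.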
3 - Play to your business’s learning style
Our research has shown that certain businesses learn in certain ways. That can be the time of the day they learn, but also what format and medium is preferred.
So, if you can find out through data analysis or simple research how your business prefers to learn, prioritise content that fits that. This also means it’s worth prioritising accessible content. If videos have subtitles and articles/courses are mobile friendly, more people are going to have the opportunity to learn.
4 - Popular sources are a safe bet
This is probably the least fair criterion that might affect your choice. But, in terms of learning content, blockbuster generally beats indie-darling. Not only is it a shorthand for quality for you (which really comes into play at scale) but your learners also trust it.
One caveat though: if you’re going for a really niche topic, there might not be a big, reputable article talking about it in full depth. In that case, it’s worth picking the on-the-point stuff, even if it’s from a smaller name.
5 - Pick the more viral-looking one
On the other hand, there’s a reason that fluff articles get so much traction. Articles with buzzfeedy headlines and popping visuals have a power to pull in readership, for better or worse. And if you’re controlling the content, you can afford to tangle with the clickbait.
This doesn’t mean we’re recommending putting “We Can Guess Which Season You Were Born In Based On The Snack Mix You Make” on your LXP (I got summer btw). But, if the quality of content is there in both, we’d pick an attention-grabbing article over a dry one.
6 - Pick the path of least resistance
One of the driving forces behind “learning in the flow of work” was how happy learners are to give up. So, if you have a choice, pick the content that makes users do the least work (aside from the actual work of learning).
This is a calculation of clicks and dead time. The more boxes users have to close, ads they have to see, or loading screens to wait through, the less they can be bothered about the end result. And this can have a knock-on effect. They might choose to ignore the next learning insight off the back of the frustrating memories of the last.
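The “calculation of clicks and dead time” can be made concrete with a crude friction score. The weights and field names below are hypothetical - tune them to your own data:

```python
# A rough sketch of scoring the hurdles between a learner and the content.
# Weights are invented for illustration; higher score = more friction.

def friction_score(clicks_to_content: int, popups: int, load_seconds: float) -> float:
    """Weighted count of the hoops a learner jumps through before learning."""
    return clicks_to_content * 1.0 + popups * 2.0 + load_seconds * 0.5

# Given two otherwise-equal pieces, prefer the lower-friction one.
a = friction_score(clicks_to_content=2, popups=0, load_seconds=1.0)
b = friction_score(clicks_to_content=4, popups=3, load_seconds=6.0)
prefer_a = a < b
```

Popups are weighted heavier than clicks here on the assumption that interruptions frustrate more than navigation does - that weighting is a guess, not a finding.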
7 - Go full Sherlock on the evidence
Sometimes, if you actually trace the links in an online article, they end up going nowhere. There’s no oversight process, so often the “evidence” used traces back through a chain of similarly opinionated pieces to nowhere. Or occasionally to Wikipedia.
And if they do flow back to something reputable/academic (sorry Wikipedia), articles with ulterior motives often cherry-pick hastily read studies to justify arguments they’ve already decided to make.
Either way, this is another reason we prioritise well-known publications; they’re accountable if they release light-on-evidence content. Otherwise, it’s worth a thorough search before recommending anything from the wild west of googlable advice.
8 - What’s new?
Learners feel patronised if they’re given content they already know. So, when choosing, we make sure content is new to them. This is a potential issue in two ways:
Firstly, be wary of content overlap. Inconsiderately, learning content rarely matches the nice separation of competencies in your skills framework. So, before recommending content, double check it’s not repeating anything you covered in a different section of your framework.
Secondly, it’s worth taking the time to measure where each learner currently is. If they’re getting recommended stuff they’ve already nailed down, they’re likely to get complacent and assume your learning’s for the less experienced. One solution we use is clustering relevance scores for learning content based on learner seniority.
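The idea of clustering relevance by seniority can be sketched as a lookup: the same asset gets a different score per seniority band, so senior learners aren’t shown material they’ve long since mastered. The asset names, bands, and scores below are all invented for illustration:

```python
# Minimal sketch: per-seniority relevance scores for each asset.
# Everything here is hypothetical example data.

RELEVANCE_BY_SENIORITY = {
    "intro-to-negotiation": {"junior": 0.9, "mid": 0.5, "senior": 0.1},
    "advanced-deal-structuring": {"junior": 0.2, "mid": 0.6, "senior": 0.9},
}

def recommend(assets: dict, seniority: str, threshold: float = 0.5) -> list:
    """Return asset ids whose relevance for this seniority clears the threshold."""
    return [aid for aid, scores in assets.items() if scores[seniority] >= threshold]

print(recommend(RELEVANCE_BY_SENIORITY, "senior"))  # ['advanced-deal-structuring']
print(recommend(RELEVANCE_BY_SENIORITY, "junior"))  # ['intro-to-negotiation']
```

In practice the scores would come from activity data or a skills assessment rather than being hand-entered, but the shape of the decision is the same.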
9 - Pick the one you could add value to
In a dead heat, don’t forget the part that you, as a learning professional, can play in getting learning learned. It’s going to go through your organisational framework and your learning system to learners that you understand better than anyone else. So don’t underestimate the impact you can have on getting something read/watched/completed.
If there’s a particular way that you can promote a specific piece, or translate its business relevance to your learners, pick that one over something you might otherwise just share.
10 - Aaand repeat…
It’s the last step, but half the battle. Even after all the work and expertise that goes into getting your learners the right content, the only real metric of its success becomes apparent after the fact. That’s why we value learning data so highly.
The more insight you can get, the better you’ll be equipped to optimise what’s already there and tailor anything you add. To get the best chance, we analyse as much as we can: usage rates, email click through rates, the points at which users onboard, drop off, and pick back up.
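One of the measurements above - the points at which users drop off - can be sketched from raw usage data by counting the last step each learner reached. The event shape here is hypothetical:

```python
# Sketch: find drop-off points by counting the last step each learner
# reached in a course before going quiet. Journey data is invented.
from collections import Counter

def drop_off_points(journeys: list) -> Counter:
    """Count the final step recorded in each (non-empty) learner journey."""
    return Counter(journey[-1] for journey in journeys if journey)

journeys = [
    ["intro", "module-1", "module-2"],
    ["intro", "module-1"],
    ["intro", "module-1"],
    ["intro"],
]
print(drop_off_points(journeys).most_common(1))  # [('module-1', 2)]
```

Here most learners stall after module-1, which flags that module-2 (or the transition into it) is where the friction lives - exactly the kind of after-the-fact insight that tells you what to optimise next.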
Once you’ve got your principles down, the next step is more complicated: finding a way to enact and tweak them at scale. This involves two main things: a watertight skills framework and technology capable of getting an almost-human understanding of learning content.
But these, too, are a means to an end. For the majority of businesses, all this effort, technology, and data is coming together to give you one thing: less.