Sifting through the masses of online content to make good decisions about what to read, listen to or learn next is difficult. Sometimes, next to impossible. One way to make sense of the constant bombardment of content is to curate it. At Filtered, we’ve hand-curated over 800 assets across multiple platforms: our demo product, our L&D version and client projects.
We use this system of manual curation to build a training set for content intelligence, our technology that automatically sources and tags content according to its business usefulness. We’ve developed our own 6-step process: Understand, Source, Evaluate, Publish, Maintain, Analyse.
1. Understand: You curate for an audience, you don’t curate for yourself. The most important part of content curation is understanding the needs of the audience. This requires time, thought and empathy. Invest time in familiarising yourself with the learning needs and context of your audience. In our demo product, we curated over 500 assets for modern knowledge workers. Some important skills are common to all knowledge workers: productivity, writing, email, Excel and the Microsoft Office package, and communication. This is the base of our user profile: if they need to build these skills and are also short of time and under pressure, what content should we deliver to fill those gaps?
2. Source: Set up a system that works for you. I use email newsletters from organisations like HBR, McKinsey and Company, Brain Pickings and Fast Company. Twitter lists are a useful way to pick up ideas from thought leaders and writers. Pocket is great for storing content, and it also has a decent newsletter of well-curated articles. Google Alerts can be useful if the search terms you pick are refined enough.
3. Evaluate: Amalgamating content can be automated, but curation is manual; it requires a considered human opinion. Refer to the profile you’ve created and evaluate content against it. Is the quality high enough? Does it support the learning needs of your target audience? If your learners are time-pressed, is the asset the right length? If they’re mostly on mobile, is the asset mobile-ready?
We use seven criteria to evaluate the quality of a learning asset. Ideally, each one is:
1. Business-useful (applicable in real business)
2. Visually appealing
3. Persuasive
4. Enjoyable, succinct (good value per word)
5. Independent: no in-content agenda (eg selling a product)
6. Serviceable: available, mobile-friendly, easy to navigate, not full of clickbait/ads
7. As a selection: a good mix of length and format-type
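A rubric like this can be expressed as a simple checklist score. The sketch below is purely illustrative: the field names and the “meets a few” threshold are our assumptions for the example, not Filtered’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical representation of the seven criteria as boolean checks.
# Field names are illustrative assumptions, not a real schema.
@dataclass
class AssetChecklist:
    business_useful: bool       # applicable in real business
    visually_appealing: bool
    persuasive: bool
    enjoyable_succinct: bool    # good value per word
    independent: bool           # no in-content agenda
    serviceable: bool           # available, mobile-friendly, low-clutter
    good_mix_fit: bool          # fits the selection's mix of length/format

def green_light(checklist: AssetChecklist, threshold: int = 4) -> bool:
    """An asset needn't meet every criterion, just 'a few' --
    here modelled as a simple count against an assumed threshold."""
    score = sum(vars(checklist).values())
    return score >= threshold

asset = AssetChecklist(True, True, False, True, True, False, True)
print(green_light(asset))  # 5 of 7 criteria met -> True
```

In practice a human reviewer makes the call; a score like this only primes the decision, it doesn’t replace it.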
We judge our content against these criteria. A given asset needn’t meet all of them to get the green light, but it will need to meet a few, and the criteria serve as useful priming concepts as we go down a list of candidate assets.
4. Publish: Present your content in a way that’s meaningful to users. Promote the really juicy stuff (eg well-chosen TED Talks, for a lot of organisations) to attract your workforce to the learning system you use. Elevate it by giving it meaning through context. Tagging it to a competency will indicate the skill the content will build. We have 40 base competencies which you can learn more about here.
5. Maintain: Your content needs to stay clean and relevant. We have systems that check for broken or migrated links. Each asset is date-stamped (where possible) so outdated content can be replaced by newer pieces if the subject area needs it. The fields of AI, cybersecurity and cryptocurrency move quickly, so content there dates fast.
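The date-stamp idea can be automated with a per-topic shelf life. The sketch below is a minimal illustration; the topics, shelf-life values and function names are our assumptions, not Filtered’s actual maintenance policy.

```python
from datetime import date, timedelta

# Illustrative shelf lives in days -- assumptions for the example only.
SHELF_LIFE = {
    "ai": 365,             # fast-moving fields date quickly
    "cybersecurity": 365,
    "cryptocurrency": 365,
    "excel": 365 * 5,      # stable skills stay relevant longer
}
DEFAULT_SHELF_LIFE = 365 * 3

def is_stale(topic: str, date_stamped: date, today: date) -> bool:
    """Flag an asset for review once its date stamp exceeds
    the shelf life assumed for its topic."""
    limit = timedelta(days=SHELF_LIFE.get(topic, DEFAULT_SHELF_LIFE))
    return today - date_stamped > limit

print(is_stale("ai", date(2016, 1, 1), date(2018, 1, 1)))     # True
print(is_stale("excel", date(2016, 1, 1), date(2018, 1, 1)))  # False
```

A nightly job over the catalogue could surface stale assets alongside broken-link reports, so curators review both in one pass.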
6. Analyse: Find out what is and isn’t working for your audience. Learning Locker from HT2 Labs stores learning data in xAPI format. We use it to gather and analyse all learning events on our platform: which assets were launched or favourited, which format is most popular, which departments use the tool most. Analysing this data shows you which gaps you need to fill, what isn’t getting traction and what you need more of. And of course, aggregating and providing insights at department or firm level is more powerful still.
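An xAPI statement is, at its core, an actor–verb–object record. The sketch below shows that shape and a toy aggregation by verb; the sample statements, actor and object names are invented for illustration, and a real store like Learning Locker holds full xAPI statements with many more fields.

```python
from collections import Counter

# Minimal xAPI-style statements (actor / verb / object).
# The data here is made up for the example.
statements = [
    {"actor": "alice", "verb": "http://adlnet.gov/expapi/verbs/launched",
     "object": "excel-shortcuts-video"},
    {"actor": "bob", "verb": "http://adlnet.gov/expapi/verbs/launched",
     "object": "email-etiquette-article"},
    {"actor": "alice", "verb": "http://activitystrea.ms/schema/1.0/favorite",
     "object": "excel-shortcuts-video"},
]

def counts_by_verb(stmts):
    """Aggregate learning events by verb, e.g. launches vs favourites,
    using the last segment of each verb IRI as a readable label."""
    return Counter(s["verb"].rsplit("/", 1)[-1] for s in stmts)

print(counts_by_verb(statements))
# Counter({'launched': 2, 'favorite': 1})
```

The same grouping idea extends to objects (which assets get traction) or to actor metadata (which departments use the tool most).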
And then…repeat. Use your analysis to once again understand the (possibly new) needs of your audience. And begin the process again.
Curation never ends because there will always be a constant stream of new content. But at Filtered we’ve got a system to make some sense of it all.
How do you, personally or at your organisation, manage content overload?