We recently got the good news that the US Patent Office will grant our application for a patent protecting our Filtering algorithm. As it was the first patent we’d applied for, both as a company and as individuals, it has been an interesting learning experience. I wanted to share some of that with you in case you are on a similar path.
Next month, we’ll be releasing a technology hosting a small library of resources exclusively for L&D (and HR) professionals. It’s an online recommender system of high-quality learning to read, watch, practise and apply for our industry: globalfilter for L&D professionals. We’ve spent the past two years developing a patent-protected technology to get the right learning to the right learner. This is the version for our industry. It consists of a conversational UI (chatbot), 125 human-curated learning assets, and a recommendation system to prioritise and personalise it all.
Lori Niles-Hofmann begins her new ebook, Data-Driven Learning Design, with what may well be my favourite metaphor for traditional learning & development departments…
L&D, she says, is ‘the aging elephant on the Serengeti surrounded by hungry lions and poachers. The elephant may be wise, but it is slow and cumbersome’. It’s an appealing image: paying respect where it’s due and offering solace and consolation, but ultimately reminding L&D of a stark duality…
How many roads does a man walk down? The short answer is 4.715 x 10^284, if he’s a student of our Excel course. A more useful answer is: eLearning user journeys can be non-linear. This is usually a good thing – students can access training in a pattern that suits them, personalising not just the pace of training but also its emphasis and direction in a way that would be unaffordable with conventional, face-to-face training. But the degrees of freedom available to a student make it difficult to control or even understand the user experience.
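As an aside, a number on the order of 10^284 is exactly what combinatorics predicts for non-linear journeys: the count of distinct orderings of n freely-sequenced lessons is n!, and 160! happens to land at roughly 4.7 × 10^284. The figure of 160 lessons here is my own illustrative assumption, not a detail from the course itself; the sketch below just shows how quickly journey counts explode.

```python
import math

# Hypothetical reconstruction: if a course had about 160 lessons that a
# student could visit in any order, the number of distinct complete
# journeys would be 160! -- a number on the order of 10^284.
journeys = math.factorial(160)
print(f"160! is about {journeys:.3e}")
print(f"digits: {len(str(journeys))}")
```

The point is not the exact figure but the scale: even a modest number of freely-ordered lessons yields more possible journeys than could ever be enumerated, which is why understanding the user experience requires measurement rather than inspection.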
Our training courses are based on what we call ‘filtered learning’. This is the idea that our students will learn best if their training consists only of material they need to learn – skills that will be valuable in their work, and which they currently lack. Our online platform asks each user simple questions that enable us to select content for them, giving them just what they need to learn.
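The core selection step can be sketched in a few lines. This is a minimal illustration of the filtering idea, not Filtered's actual algorithm or schema: the field names, skills and lessons below are all invented for the example.

```python
# Minimal sketch of 'filtered learning': keep only the lessons that teach
# a skill the learner needs for their work and does not yet have.
# Lesson titles, skill tags and function names are illustrative only.

LESSONS = [
    {"title": "VLOOKUP basics",    "skill": "lookups"},
    {"title": "Pivot tables",      "skill": "pivot_tables"},
    {"title": "Charting",          "skill": "charts"},
    {"title": "Advanced formulas", "skill": "formulas"},
]

def filter_lessons(lessons, needed_skills, known_skills):
    """Select only content that is both relevant and not yet mastered."""
    gaps = set(needed_skills) - set(known_skills)
    return [lesson for lesson in lessons if lesson["skill"] in gaps]

# A learner who needs lookups and pivot tables, and already knows lookups,
# is shown only the pivot-tables lesson:
todo = filter_lessons(LESSONS, ["lookups", "pivot_tables"], ["lookups"])
print([lesson["title"] for lesson in todo])  # ['Pivot tables']
```

The simple questions the platform asks each user are, in effect, a way of estimating the two inputs to that function: which skills their work demands, and which they already have.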
We think there are some obvious reasons that this is an effective approach to training: time isn’t wasted on material that isn’t valuable, and our students are more engaged with the content because of its relevance. To put it another way, the most significant cost of training is often the time for which employees are diverted away from their work; focusing only on what staff need to learn, and delivering the training efficiently online, means this time and cost are minimised. We also hoped that this focused training would be more effective than following an unselected course – that it would make a bigger (as well as faster) difference to the learner.
We have been working hard on making the Filtered approach really robust – we’ll be launching our new Filter algorithm this November. But before we embarked on this project we wanted to be sure that our hunch was right – that filtering content really did make the training more effective for students. So earlier this year we conducted an analysis of 3,000 of our real-world users who had signed up for our Excel course on our old (pre-Filtered) platform. The study measured student performance in tests before and after training, and found that users who trained on filtered material improved their test performance by 26% more than users who trained on unfiltered material. With our sample size, we are more than 95% confident that learning filtered material has a greater learning impact than following an unfiltered course.
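For readers curious what "more than 95% confident" means mechanically: with two cohorts of before/after score improvements, you can test whether the filtered cohort's mean improvement exceeds the unfiltered cohort's. The study's underlying data aren't published, so the sketch below runs the test on synthetic data; the cohort sizes, means and spreads are invented, and with ~3,000 users a normal approximation to the two-sample test is reasonable.

```python
import random
import statistics
from statistics import NormalDist

def two_sample_z(a, b):
    """Two-sample z-statistic for difference in means (normal
    approximation; appropriate for large samples)."""
    mean_a, mean_b = statistics.fmean(a), statistics.fmean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    se = (var_a / len(a) + var_b / len(b)) ** 0.5
    return (mean_a - mean_b) / se

def one_sided_p(z):
    """Probability of a difference at least this large if the two
    cohorts really improved by the same amount (the null hypothesis)."""
    return 1.0 - NormalDist().cdf(z)

# Synthetic stand-in for the study: per-user improvement in test score,
# with the filtered cohort improving more on average.
random.seed(42)
filtered_gain = [random.gauss(12.6, 8.0) for _ in range(1500)]
unfiltered_gain = [random.gauss(10.0, 8.0) for _ in range(1500)]

z = two_sample_z(filtered_gain, unfiltered_gain)
p = one_sided_p(z)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```

A p-value below 0.05 is what licenses a "more than 95% confident" claim; with large cohorts, even a modest per-user advantage for filtered material clears that bar comfortably.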
Our new platform allows us to monitor the relationship between our users’ learning patterns and the improvement they show. So we intend to carry on measuring the effect of filtering content, in particular to understand how it affects adoption of and engagement with training, as well as ultimate impact.
Algorithms can be complex things, and many of us probably conceive of them as deeply mathematical beasts far removed from reality. Although it's true that algorithms are usually complex to a layman, they are not at all removed from reality. On the contrary, as Kevin Slavin's TED talk below argues, our world is increasingly designed for and run by algorithms. Moreover, many of these algorithms, like Frankenstein's monster, have mutated into things beyond the understanding of their creators.