‘2018 was the year the dam broke in terms of technology’, said Toby Harris, Vice Chair, introducing the ELN Connect 2018 conference in London. SCORM is on the way out, ‘click-next’ elearning is history, and learning professionals are turning their minds to producing quality live experiences instead, with the use of new technologies such as VR growing. Developments in natural language processing, the area of Artificial Intelligence most directly relevant to the learning world, have progressed to the point where AI-assisted off-the-shelf tools are now available, bringing down the costs and barriers to entry of deploying AI. Learning skillsets are changing to embrace a new data-driven world, importing techniques from marketing and behaviour change as well.
After this optimistic vision, so boldly and eloquently stated, the rest of the day’s talks – focusing on the nitty-gritty of implementing innovative technologies, and delivered by an array of formidably expert and experienced speakers – could hardly be anything other than a series of qualifications and correctives.
For Donald Clark it had not been such a wonderful year. Delivering a keynote on AI-powered design and delivery, he described 2018 as ‘odd’, and talked of a clash of cultures: ‘almost all the learning that I see is media production sprinkled with tests – and some really awful gamification … we’re in a rut and we’ve got to get out of it.’
Here are a few of the things that have to change, in his view, to get us out of the rut.
Bad games. In our industry ‘nobody really does serious gaming,’ said Clark. He cited a demolition by popular vlogger PewDiePie – who has 17 million subscribers to his YouTube channel – of a gamified US government course on cybersecurity. Learning people should either embrace gaming seriously or leave it alone.
Bad tests. It’s not widely known that the optimum number of options for a multiple-choice question (MCQ) is three. Anything more just adds cognitive load – ‘and the fourth option is always rubbish’. Better still, ditch MCQs altogether.
Bad data. ‘On average humans have one testicle.’ Clark sees a rising tide of people making ‘stupid inferences’ from limited data sets in training and education, and urges wariness of such projects. The reality is that we don’t have ‘Big Data’ as such in our field: the datasets are too small.
Bad science. ‘63% of children entering primary school will do jobs not invented yet’, ‘47% of jobs will be automated…’: statistics with no scientific basis are being used to scare people about the future of AI and we have to stop propagating this stuff, says Clark.
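Clark’s ‘bad data’ point can be made concrete with a toy sketch (my own illustration, not from the talk): an average over a small or lopsided sample can be arithmetically true while describing no one in it.

```python
from collections import Counter

# Toy sample of 100 people: roughly half have two testicles, half have none.
population = [2] * 50 + [0] * 50

mean = sum(population) / len(population)
print(mean)  # 1.0 -- the 'average human' here matches no actual human

# A simple frequency count describes the data far better than the mean does.
print(dict(Counter(population)))  # {2: 50, 0: 50}
```

The inference ‘humans have one testicle on average’ is exactly the kind of conclusion a mean invites when the underlying distribution is ignored.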
He called for a more clear-eyed view of AI that would neither underestimate its potential power to help human beings nor underplay its serious shortcomings at the current stage of development.
AI, the ‘idiot savant’
AI is not the panacea for all problems it is sometimes touted as, but an ‘idiot savant’: excellent at a particular narrow task, with no lateral or contextual awareness. He gave the example of his Roomba, an AI-powered robot vacuum cleaner that operates without human assistance. It does a great job – apart from the one occasion when a dog left a deposit on the sitting-room floor, which the Roomba dutifully, and with scrupulous thoroughness, smeared into every corner of the house!
Notwithstanding such occasional accidents, AI is improving by leaps and bounds. In a recent experiment among 350 students, one of a number of human online assistants was swapped for a bot and nobody noticed any difference – apart from the fact that responses came back much faster. This faster turnaround was experienced as a considerable benefit, given how slow email responses from human tutors generally are.
AI is now omnipresent in the environment. If you use a mobile phone, search on Google, interact on Facebook or Instagram, or watch Netflix, you are embedded in the AI world. Practically everything is mediated by AI – except learning.
Donald Clark ran through a variety of ways in which AI can help learning.
Language learning. He praised the language app Duolingo, which not only remembers where you are in your learning but also knows you will forget things when you are away for a while, and so will drop you back a step or two when you return. He also revealed that he has his Alexa set up to answer in German, to aid his learning of that language. ‘It’s like having German in your kitchen’.
Content creation. Already, AI is generating a lot of journalistic written content, and the world’s first AI news anchor was recently unveiled in China. AI-generated content is growing fast. Having spent much of his life building online learning content, which he describes as really hard and time-consuming, Donald is now associated with a company, Wildfire, that uses AI to build online courses directly from source materials.
Adaptive learning. Big in the US (though less so in the UK as yet), adaptive learning uses AI to create personalised learning paths through a structured body of knowledge on the fly; ‘a satnav for learning’. In fields like maths especially, where learning so often fails through lack of underpinning knowledge (e.g. you can’t do logarithms if you haven’t nailed multiplication first) adaptive can show impressive results.
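The ‘satnav for learning’ idea can be sketched as prerequisite-gated topic selection: given a prerequisite graph and per-topic mastery estimates, recommend the first unmastered topic whose prerequisites are all in place. The graph, mastery scores, and threshold below are invented for illustration and don’t reflect any specific adaptive-learning product.

```python
# Toy prerequisite graph echoing Clark's maths example:
# you can't do logarithms if you haven't nailed multiplication first.
prerequisites = {
    "counting": [],
    "addition": ["counting"],
    "multiplication": ["addition"],
    "logarithms": ["multiplication"],
}

def next_topic(mastery: dict, threshold: float = 0.8):
    """Return the first unmastered topic whose prerequisites are all mastered."""
    for topic, prereqs in prerequisites.items():
        if mastery.get(topic, 0.0) >= threshold:
            continue  # already mastered, move on
        if all(mastery.get(p, 0.0) >= threshold for p in prereqs):
            return topic
    return None  # everything mastered

learner = {"counting": 0.95, "addition": 0.9, "multiplication": 0.4}
print(next_topic(learner))  # multiplication -- logarithms stay locked for now
```

A real adaptive system would update the mastery estimates continuously from assessment data; the gating logic above is the ‘satnav’ part, rerouting each learner to whatever their own gaps require.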
Bots. Learnbots, mentorbots, chatbots … the future is all about automating conversations, with new dialogue skills needed perhaps more often associated with scriptwriting than prose. Bots can also be a way of improving the LMS experience by making it more ‘invisible’ and embedding it in workflow.
Wellbeing. On the subject of bots, Donald commended Woebot. In studies it performed no better – but no worse – than a human therapist.
Proctoring. Face-recognition technology is already being used to verify the identity of learners taking online exams, and in the future could be used for everyday educational functions such as taking the register.
Summing up, Clark said that AI brings with it the need for new technology, new skills, new processes, retraining for some – and a new mindset for all.
Key takeaway / parting shot: ‘Resistance is absolutely futile!’