Learning analytics: 8 opportunities for L&D
By John Helmer
Learning analytics offers L&D a wealth of new opportunities to increase the effectiveness of training and to be better aligned with organisational goals in today’s data-driven business environment. Our Think Tank delegates identified eight key opportunity areas.
Lumesse Think Tank events are held with an invited group of L&D leaders, who discuss issues in learning under the Chatham House Rule. Contributing to this debate were delegates from the worlds of finance, logistics, FMCG, mining, pharmaceuticals, professional services and commodities trading.
Download a highlights report of the whole discussion.
And for a deep dive into the section on opportunities, read on as we address the following question:
Where are the biggest opportunities for L&D in learning analytics?
Eight key opportunities
- Drawing on and bringing together a wider, more diverse range of data sources and triangulating them to give a firmer basis for decision-making
- Using data to foster conversations between departments responsible for different stages of the employee lifecycle
- Using data from customer interactions to improve the design of learning
- Using real-time analytics on webinars and virtual classroom events to measure and improve engagement
- Learning from how marketers use analytics
- Using xAPI to track learning beyond the LMS
- Using induction/on-boarding for cohort analysis
- Using real-time analytics to feed back within a programme how the individual’s responses compare or contrast with peers
Opportunities in learning analytics
Some of the biggest opportunities in learning analytics will come from bringing together different data sources to gain insights. The example was given of using focus group data to confirm or drill down into questions raised by LMS data, leading to more confident judgements. This process was characterised as ‘triangulation’.
‘The more data you have, the more opportunity you have to triangulate … which gives you a much richer picture.’
While bringing together different data sets was seen as an opportunity, so too was bringing together different perspectives within the organisation on shared datasets. An opportunity was seen in fostering conversations between departments responsible for different stages of the employee lifecycle to tackle particular issues. For example, a problem of seasonal churn was addressed by bringing together recruitment, on-boarding and learning data and the relevant departmental heads – making a link between competency framework, performance management data and learning.
Another delegate talked about using data to make a stronger link between frontline operations and learning design. Now that many customer service interactions are digital, it is possible to feed back data gathered from customer interactions and use the insight gained to drive the design of the learning provided for those employees. For example, if an issue has been observed in the way customers are dealt with, the induction process for frontline staff can be remodelled to head off that issue before it ever arises.
Learning departments are already seeing opportunities to use real-time learning analytics in areas such as the virtual classroom, although they might not consciously see it in this way. Some webinar and virtual classroom platforms include engagement icons that tell the presenter of an online event who is engaged and who has tuned out, giving the presenter the opportunity to change the content in real time or provide additional support where needed. These generally work by indicating when a user no longer has the webinar/VC as her primary screen but has gone off to answer emails, etc. There is some dispute about whether this is a real measure of engagement; people often multitask, after all. However, research tends to show that multitasking is something of a myth: people who appear to be multitasking are actually switching their attention sequentially from one task to another, and are therefore liable to miss things.
… And on the subject of engagement, looking at how marketers use analytics was also seen as an opportunity in finding the best way to engage audiences of learners.
‘There are lots of things you can learn from marketers for learning …’
At a practical level, for example, learning departments might want to know the best time of day or week to send out communications about a piece of learning. Marketers have heuristics, and their own specific data sets, that tell them when the audience is most likely to be responsive. They will also routinely look at Google Analytics and other analytics packages to see what devices are used, the times of day pages are accessed, and other factors that point to what is happening with their audience and help build up a picture of ‘the context of use’.
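As a minimal sketch of this kind of analysis: given a set of page-access timestamps (which, in practice, would be exported from an analytics package rather than typed in by hand), finding the most active hour is a simple aggregation. The timestamps below are invented for illustration:

```python
from collections import Counter
from datetime import datetime

# Illustrative access-log timestamps; in reality these would come
# from an analytics export rather than a hard-coded list
access_times = [
    "2024-03-04 09:15", "2024-03-04 09:40", "2024-03-04 13:05",
    "2024-03-05 09:20", "2024-03-05 17:45", "2024-03-06 09:02",
]

# Count visits per hour of the day
hours = Counter(datetime.strptime(t, "%Y-%m-%d %H:%M").hour for t in access_times)
best_hour, hits = hours.most_common(1)[0]
print(f"Most active hour: {best_hour}:00 ({hits} visits)")
```

The same pattern extends to day-of-week or device breakdowns: group the raw events by the dimension of interest, count, and rank.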
This analytics-driven mindset is characteristic of marketing as a discipline and forms the bedrock of how marketers work to engage audiences. One of our delegates shared how he had seen this type of skillset applied very directly to learning.
‘A guy moved from Marketing into Learning and Development, and within three or four months he had transformed their whole operation. He took all the lessons he had learned from being a marketer for so many years and applied them to learning …. and the thing he really managed to do in a big way was engage the audience.’
On a more technological theme, xAPI (formerly known as ‘Tin Can’), the tracking specification that succeeds SCORM, seemed to have at least one confirmed fan among our delegates:
‘Personally, I think xAPI is a massive opportunity for learning analytics … by 2020, the learning management system will not exist: it will be the learning record store.’
However, some participants had never heard of xAPI, and nobody in the group had an active implementation. It is a moot point whether the slow take-off of this technology in the learning world has more to do with the technical capabilities of L&D or with issues in the standard itself. But it is worth noting that other updates to SCORM over the years have similarly struggled to gain traction.
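To make the ‘tracking beyond the LMS’ idea concrete: an xAPI statement is a small JSON record of the form actor–verb–object that any application (a video player, a mobile app, a simulation) can send to a Learning Record Store. A minimal sketch in Python, where the email address, activity URL and activity name are invented for illustration:

```python
import json

def make_statement(actor_email, verb, activity_id, activity_name):
    """Build a minimal xAPI statement: actor - verb - object."""
    return {
        "actor": {
            "mbox": f"mailto:{actor_email}",
            "objectType": "Agent",
        },
        "verb": {
            # Verb IDs are URIs; this pattern follows the ADL verb registry
            "id": f"http://adlnet.gov/expapi/verbs/{verb}",
            "display": {"en-US": verb},
        },
        "object": {
            "id": activity_id,  # any URI identifying the activity
            "definition": {"name": {"en-US": activity_name}},
            "objectType": "Activity",
        },
    }

# Record a learning event that happened outside the LMS,
# e.g. watching an onboarding video (details are hypothetical)
stmt = make_statement(
    "jane.doe@example.com",
    "experienced",
    "https://example.com/videos/onboarding-intro",
    "Onboarding intro video",
)
print(json.dumps(stmt, indent=2))
```

In a live implementation this statement would be POSTed to an LRS endpoint; the point of the standard is that the record lives in the LRS, not in any one learning system.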
Induction and on-boarding were seen as good opportunities for gathering and using analytics. Induction often involves cohorts passing through a common process, with fixed time markers, making it ideal for certain types of analytical study. Three examples were given:
- Analysing how members of a cohort feel about the organisation as they pass through induction using surveys
- Doing the same kind of exercise as above, but using sentiment analysis of social channels
- Using a control group given no induction at all to see if they still find what they need – i.e. testing the efficacy of a formal induction process
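The second example above, sentiment analysis of social channels, can be sketched very simply. This is a deliberately toy lexicon-based scorer; the word lists and sample comments are invented, and a real implementation would use a proper NLP library over actual social-channel data:

```python
# Toy sentiment analysis over induction-cohort comments.
# Word lists and sample comments are illustrative only.
POSITIVE = {"great", "welcome", "helpful", "clear", "supported"}
NEGATIVE = {"confusing", "lost", "overwhelmed", "slow", "unclear"}

def sentiment(comment):
    """Score a comment: +1 per positive word, -1 per negative word."""
    words = comment.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def cohort_average(comments):
    """Average sentiment across a cohort's comments."""
    return sum(sentiment(c) for c in comments) / len(comments)

# Hypothetical comments from the same cohort at two points in induction
week1 = ["Felt really welcome and supported", "A bit overwhelmed by systems"]
week4 = ["Processes are clear now", "Team has been helpful"]
print(cohort_average(week1), cohort_average(week4))
```

Because induction cohorts pass through the same fixed time markers, scores like these can be compared week on week or cohort on cohort, which is exactly what makes induction attractive for this kind of analysis.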
Lastly, another opportunity was seen in real-time (or near-real-time) analytics around quizzes within elearning programmes. Increasingly, it is possible to feed quiz results back into elearning solutions to provide a benchmark, so that individuals can see how their peers answered a given question and compare their own response.
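The peer-benchmarking idea above amounts to a simple aggregation: collect all responses to a question, compute the distribution of answers, and show the individual where their own choice sits. A sketch with invented responses and option labels:

```python
from collections import Counter

def peer_benchmark(responses, my_answer):
    """Return the percentage of peers choosing each option,
    and the share who chose the same answer as the individual."""
    counts = Counter(responses)
    total = len(responses)
    distribution = {opt: round(100 * n / total) for opt, n in counts.items()}
    my_share = distribution.get(my_answer, 0)
    return distribution, my_share

# Invented responses to a single multiple-choice quiz question
responses = ["B", "B", "A", "C", "B", "A", "B", "D"]
dist, mine = peer_benchmark(responses, "A")
print(dist)
print(f"{mine}% of peers answered the same as you")
```

In a live programme the `responses` list would be fed from the quiz engine as results come in, which is what makes the benchmark real-time rather than a post-hoc report.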
In the next post from our Think Tank we look at the other side of the coin – challenges and barriers. Watch this space!
If you’d like to have a summary of the key learnings from all three parts, download the Highlights Report.