Is L&D ready for learning analytics?

By John Helmer

Learning professionals are reaching out beyond their traditional data sources and methodologies to embrace a new world of learning analytics. However, innovation is sporadic and held back in many organisations by a historical culture of not evaluating effectively (if at all).

This was just one of a number of fascinating insights that arose from our latest Think Tank dinner.

We assembled an invited group of L&D leaders to discuss these issues in a three-part discussion held under the Chatham House Rule. Contributing to the debate were delegates from the worlds of finance, logistics, FMCG, mining, pharmaceuticals, professional services and commodities trading.

Download a highlights report of the discussion.

But for those who want a deep dive into the first part of the discussion, read on as we address the following question:

Part 1: What examples can we see of organisations using learning analytics and insights in new ways?

Key points from the discussion

  • Learning departments today are using a huge variety of methods to gather insights about the impact of the learning they provide
  • Opinion is divided about the value of learning analytics provided by ‘traditional’ LMS reporting
  • New techniques of data visualisation (e.g. dashboards) are becoming a powerful tool for providing actionable insights from data
  • Organisations with large and highly structured data sets (e.g. professional associations that administer exams-based qualifications) are currently best placed to take advantage of sophisticated new tools available for data analysis and presentation
  • Return on Investment (ROI) is starting to be seen as a less important measure than return on engagement (ROE)
  • It is questionable whether learning departments are active enough in reaching out for commercial data from the business
  • The perception within L&D tends to be that HR teams have a more analytics-driven culture
  • It is possible (though hard to prove) that the recent move to shorter, more ‘chunked’ digital learning has been at least partly driven by greater access to data and analytics within learning departments

‘Digital is taking over everything: you can track by far more things’

Learning departments today are using a huge variety of methods to gather insights about the impact of the learning they provide.

These encompass both traditional and non-traditional methods, including focus groups, surveys, sentiment analysis of social media channels, Amazon-style star ratings, Net Promoter Score (NPS) and more. L&D is beginning to look beyond traditional data sources such as LMS data: two of the companies represented had done some sentiment analysis on social media channels to gauge feeling within the workforce. However, use of such innovative methods is sporadic, and could be held back by capability gaps.
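
Of these measures, NPS is the most mechanical to compute. As a purely illustrative sketch (the 0–10 scale and the promoter/detractor cut-offs are standard NPS conventions, not something from the discussion, and the sample responses are invented):

```python
# Minimal NPS calculation, assuming post-course survey responses
# on the standard 0-10 "how likely are you to recommend?" scale.
def net_promoter_score(ratings):
    """NPS = % promoters (scores 9-10) minus % detractors (scores 0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Invented sample responses, for illustration only
print(net_promoter_score([10, 9, 8, 7, 6, 9, 10, 3, 8, 9]))  # -> 30.0
```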

Although social collaborative working platforms (e.g. Slack, Jive) are widely used in companies, learning departments do little analysis of what is said in all these conversations.

Opinion is divided about the value of learning analytics provided by ‘traditional’ LMS reporting.

Some think the information that comes out of learning management systems is ‘a waste of time’. Others say that you turn your back on it at your peril, or that in combination with other data sources, it can provide rich insights. ‘You’ve got this broad, shallow LMS data that can nudge us in the right direction, but it is that deep dive into a small audience [through focus groups] that really gives us the rich stuff when we are thinking about [learning] design’.

New techniques of data visualisation (e.g. dashboards) are becoming a powerful tool for providing actionable insights from data.

Presentation of data is not a trivial or cosmetic issue. With the amount of data available to businesses growing exponentially, it becomes increasingly difficult to draw meaningful insights from raw data. Visualisation techniques such as dashboards help enormously in providing such information in a digestible and understandable form that makes the information actionable.

Learning departments don’t necessarily have access to data analysts to interpret data for them, nor do they necessarily have data analysis skillsets in their teams. So presentational tools of this kind, which contain a certain amount of baked-in analysis, are especially useful.
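
As a minimal illustration of the idea (the course names and completion figures below are invented, and a real dashboard would more likely live in a BI tool than a script):

```python
# Sketch of a dashboard-style view built from a hypothetical LMS
# export: pre-digested visuals rather than raw report rows.
import matplotlib.pyplot as plt

courses = ["Compliance", "Onboarding", "Leadership", "Data skills"]
completion = [92, 74, 48, 31]  # % of enrolled learners completing (invented)

fig, ax = plt.subplots()
ax.barh(courses, completion)
ax.set_xlabel("Completion rate (%)")
ax.set_xlim(0, 100)
ax.set_title("Course completion at a glance")
plt.tight_layout()
plt.show()
```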

‘Historically, we haven’t measured anything … The learning team don’t have the first clue about any kind of data mining whatsoever.’

Organisations with large and highly structured data sets are currently best placed to take advantage of sophisticated new tools available for data analysis and presentation.

Much of the running in learning analytics has been made in the education sector rather than in organisational learning. One can speculate that this is because, in education, exam data forms a large, mission-critical and highly structured pool of data that provides an obvious focus for improvements in analytical techniques. In addition, there has been considerable investment in edtech from large publishers (e.g. Pearson) focused on this area.

In most organisational learning, by contrast, evaluation has not had the same central importance. This can be a source of frustration for those who wish to be more evidence-based but find that the learning departments they work with have no history of evaluating effectively. Many organisations simply don’t evaluate the impact of their training. ‘The truth is that nobody measured anything, and if they measured anything they didn’t know what they were looking at.’

There are of course exceptions to this generalisation – e.g. professional associations that administer exams-based qualifications – and it is here, where exams play a central part, that organisations are perhaps best placed to capitalise on advances in learning analytics technology.

Return on Investment (ROI) is starting to be seen as a less important measure than return on engagement (ROE).

ROI is too often associated with cost-cutting, and the contribution of learning is difficult to pick apart from other inputs. Of more interest now is ROE: how engaged people are with the learning and what effect it is having on their behaviours. The new analytics provide measures (such as sentiment analysis) that make this easier to assess.
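
To make the idea concrete, here is a deliberately toy sketch of sentiment scoring over learner comments. Real sentiment analysis would use an NLP library or a vendor service; the word lists and comments below are invented:

```python
# Toy lexicon-based sentiment tally (illustrative only).
POSITIVE = {"useful", "engaging", "clear", "relevant", "enjoyed"}
NEGATIVE = {"boring", "confusing", "irrelevant", "slow", "pointless"}

def sentiment(comment):
    """Count positive words minus negative words in one comment."""
    words = comment.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

comments = [
    "really useful and engaging session",
    "boring and far too slow",
    "clear examples, enjoyed the exercises",
]
scores = [sentiment(c) for c in comments]
print(f"mean sentiment: {sum(scores) / len(scores):+.2f}")  # -> +0.67
```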

It is questionable whether learning departments are active enough in reaching out for commercial data from the business.

In measuring the effect of training on behaviours and performance it can often be essential to get data from the business. One of our delegates felt that his peers did not do enough of this reaching out: ‘L&D needs to get out of its fluffy HR world and smell the coffee’.

The perception within L&D tends to be that HR teams have a more analytics-driven culture.

Certainly, many HR solutions (including talent software) produce impressive, good-looking reports. Whether HR teams are actually where they want to be with analytics, however, is another question, as one of our delegates hinted.

More generally, although organisations are seen to be ‘desperate to invest’ in data analytics, in practice it is hard for many of them to make the strides they would like.

It is possible (though hard to prove) that the recent move to shorter, more ‘chunked’ digital learning has been at least partly driven by greater access to data and analytics within learning departments.

In one interesting example, related by one of our delegates, insights from data about the consumption habits of staff led to a change of content strategy. While elearning programmes were an hour or more in length, 79% of this organisation’s learners spent less than 30 minutes on them, and 56% less than 15 minutes. Based on this information, the learning department changed its approach, ‘chunking down’ learning programmes to bite-size pieces and making more use of YouTube videos.
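
The underlying analysis is simple threshold counting over session durations. A minimal sketch, using invented session data rather than the delegate’s actual figures:

```python
# Share of learners whose session time falls under given thresholds,
# assuming per-learner durations (in minutes) from an LMS export.
sessions = [12, 45, 8, 22, 65, 14, 28, 9, 33, 11]  # invented sample

def share_under(minutes, threshold):
    return 100 * sum(m < threshold for m in minutes) / len(minutes)

print(f"under 30 min: {share_under(sessions, 30):.0f}%")  # -> 70%
print(f"under 15 min: {share_under(sessions, 15):.0f}%")  # -> 50%
```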

Whether or not learning analytics are the cause of this move to shorter pieces of learning content, it is certainly the case that more and more organisations are moving in this direction.

In part two of our Think Tank we look at opportunities and challenges in learning analytics. Watch this space!

If you’d like to have a summary of the key learnings from all three parts, download the Highlights Report.
