L&D skills gaps the chief barrier to progress in learning analytics

By John Helmer September 21, 2016

Capability gaps, and a historical culture of not evaluating training, are seen as major barriers to success in learning analytics for L&D. Our Think Tank found that learning analytics offers L&D a wealth of new opportunities, but delegates identified nine key challenge areas.

Lumesse Think Tank events are held with an invited group of L&D leaders, who discuss issues in learning under the Chatham House Rule. Contributing to this debate were delegates from the worlds of finance, logistics, FMCG, mining, pharmaceuticals, professional services and commodities trading.

Download a highlights report of the whole discussion.

And for a deep dive into the section on challenges, read on as we address the following question:

Where are the biggest challenges/barriers for L&D in learning analytics?

Nine key challenges/barriers

  • L&D capability gaps around handling and applying data
  • Potential exposure of L&D’s weak analysis skills as their analytics become more visible within the organisation
  • Lack of ability to present data in a way that will engage other parts of the business
  • Tough data protection legislation in certain jurisdictions
  • ‘Analysis paralysis’ from suddenly having a lot more data
  • Deterrent effect on other learners of making low star ratings visible
  • L&D’s reluctance to expose learners to the demotivating effects of being seen to fail, which prevents rigorous assessment
  • Test questions that are primarily designed to show the values the company wants rather than to test knowledge rigorously
  • New types of digital data not yet established as trusted forms of measurement

Challenges/barriers in learning analytics

Perhaps the biggest challenge for L&D around learning analytics lies in capability gaps around handling and applying data. Many learning departments ‘have a background of simply not measuring anything’. So although L&D is being exhorted to be more evidence-driven, and surveys show that L&D professionals aspire to be so, many will be starting from a low base, which can hamper attempts to become more evidence-based.

‘I put a strategy together to become an insight-led L&D function – but I quickly realized that we don’t have any insights.’

And while a big opportunity was identified by our Think Tank for L&D to align itself better within today’s increasingly data-driven organisations, starting to use a new language can be very exposing if you are less than word-perfect. Making learning analytics more visible within the organisation can expose weak analysis skills. ‘As soon as you have more data and are using it to make cases to the business, there are more people trying to blow holes in it’.

There can be problems with engaging other parts of the business with data if it is not presented in the right way — especially when it comes to the top team. One delegate recounted an incident where an initiative failed to gain traction with his C-suite for this very reason: ‘Just the way we presented the data meant that what we were saying was rejected’. Presentation of data, he concludes, is ‘incredibly important’.

Data protection can pose a challenge for those seeking to engage more with analytics, particularly in jurisdictions where data legislation is tougher. A delegate whose company has a German parent had found that the company’s interpretation of data law, and the cultural norms that had grown up around it, hindered him when it came to using test scores from the LMS for analytics. This was not a complete blocker to analytics, but it meant the emphasis became much more about engagement and less about performance measures.

Another problem with data comes from the sheer amount of it available. Having more of the stuff brings with it the risk of ‘analysis paralysis’. ‘Every time you present data, it raises more questions. Then you get the answer to those questions and then there are more questions …’ Having more data might well enrich your decision-making, but it won’t actually make the decision for you: ‘at some point you just have to make a decision and roll the dice!’.

Making data visible within the organisation brings the risk that what it shows might not be good: you risk exposing your failures as much as your successes. Even something as simple as making star ratings visible to the learner audience can come with challenges, since a one-star rating could be a big deterrent to anyone else wanting to take the course.

And if this threat of exposure is true for L&D, it is equally true for learners. The learner’s fear of failure, and L&D’s unwillingness to expose them to the demotivating effects of being seen to fail, can prevent rigorous assessment of learning and skew data.

Where test questions are primarily designed to show the values the company wants rather than to test knowledge rigorously, the exercise will not yield meaningful data. An example was given of a leadership course where it was actually quite difficult to get any questions wrong. This is not to say that such exercises are not useful learning interventions, but simply that the aims of the testing have to be factored in when drawing conclusions from test results.

A further difficulty comes with the very newness of many of the data sources now available to L&D. Many new types of digital data are not yet established as trustworthy measures, which can make them difficult to use. Best practice has to catch up with these innovations.

In the final post from our Think Tank we give you seven practical recommendations for improving your use of learning analytics. Watch this space!

If you’d like to have a summary of the key learnings from all three parts, download the Highlights Report.
