Why learning analytics?
The problem with learning analytics is that it can be as much of a trap as a saviour. While it tops the poll of future trends in online learning, it suffers from ‘dashboard dogma’. Many presentations around learning analytics are full of dashboards with no end of pie charts and donuts, but this is the confectionery of data analytics. If your end point is a dashboard, you’re merely looking through the window of the bakery. I write in detail about this in my new book 'AI for Learning', including types of learning data, LXPs, LRSs and using data to create a learning organisation, but here's an introduction to my thoughts on learning analytics.
Trends
Learning Analytics leapt in at Number 1 on Donald Taylor’s International Survey. What’s interesting is the next four positions, as they are all related:
1 Learning Analytics
2 Personalised/adaptive learning
3 Collaborative/social learning
4 LXP (Learning Experience Platforms)
5 Artificial Intelligence
I’ve been working solely on this cluster of things for the last six years and want to tease out a simple point – they have to be seen as a whole as they are all intimately related:
1 fuels 2/3/4/5
4 is the delivery platform for 1/2/3/5
5 really matters
Dashboard trap?
Beyond numbers of people taking courses, literally ‘bums on seats’, which is to measure the wrong end of the learner, learning has never been particularly analytic. Few collect detailed data describing learner behaviour, let alone apply even basic analysis. Fewer still delve deeply into that data for insights to inform, predict or prescribe future decision making and action. It is not clear that dashboards improve the situation much. They are still a descriptive ‘bums on seats’ approach to data. Visualisation, in itself, means little. One visualises data for a purpose: in order to make a decision. A dashboard often masquerades as doing something useful when all it is actually doing is acting as a cul-de-sac.
The dashboard on your car is there primarily so that you can regularly monitor your speed. A secondary and less regular use is to monitor fuel. Rarer still are the signals that indicate when the temperature rises or the oil or tyre pressure needs attention. In fact, most of the data in a car is used invisibly, on your engine, brakes and other systems. In other words, the car automates the use of data. As cars have become more sophisticated they have taken tasks away from you and used data to automate processes. This will reach its zenith with self-driving cars. In learning, no one dies, so we can move towards automation much more quickly.
The learning world attracts ‘people’ people, with an interest in the development of others, rather than people from a scientific or analytic background with an interest in systems and data. This, in a sense, pushes learning professionals away from learning analytics. We must overcome this reliance on qualitative perceptions and judgements, including the old and laboured Kirkpatrick schema, which is statistically naive.
It is time to move towards a more serious, data-led approach. But relax, this does not mean becoming an expert in data or statistics, as the technology does all of the computation and heavy lifting on the maths and stats. Learning professionals will not be analysts but consumers of data and data-driven automation. The analysis will largely be done for them.
Decision making
In the end this is all about decision making. What decisions are you going to make on the back of insights from your data? Storing data away for future use may not be its best use, and the least efficient use of all is storing it in pots with dials on the front. Perhaps the best use of data is dynamic: to create courses, provide feedback and automatically deliver adaptive learning.
The world is becoming more data-driven, organisations are becoming more data-driven, and even at the level of the individual, a personalised service is now the expectation. Almost everything you do online is data-driven (search, all social media, Amazon, Netflix and so on), yet learning remains stubbornly resistant. It is time the learning world responded, treating data as the fuel, and algorithms as the rocket, that will allow us to boldly go to places we have never been before.
Data is everywhere. You are a mass of data points: your face is data for face recognition, your body a mass of data points for healthcare, your behaviours are data points for online organisations, the people and information you network with online are all data points, your car is a data point for GPS. You are, and live in, a sea of data. This is also true of learning: what you know, when you learnt it, how well you know it, how you perform. Like it or not, we are all masses of data points. This doesn’t diminish your humanity; it informs decision making and can make life easier and more productive.
If you have an LMS (Learning Management System) you will, most likely, have been gathering data under the SCORM specification. Unfortunately, this is an old and severely limited specification, as it focuses on who did what, when, and whether they completed courses. If you have been using Kirkpatrick, you will most likely have been gathering the wrong data with little analysis. It is time for a rethink.
Moving beyond this specification, xAPI has been defined by the same people who gave us SCORM. It is a new specification better suited to the current landscape of multiple sources of learning and a more dynamic view of how people learn, along with the need for many more types of data than in the past. The same goes for the shift from the LMS to the LXP; that is why LXPs and learning ecosystems have appeared. The world has moved on, organisations have moved on, the technology has moved on, and so learning professionals should move on.
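To make this concrete, here is a minimal sketch, in Python, of what recording activity as an xAPI statement in an LRS might look like. The endpoint URL, credentials, learner and course identifiers are all placeholders; a real implementation would follow your LRS vendor's documentation.

```python
# Minimal sketch: recording an xAPI 'completed' statement in an LRS.
# The endpoint, credentials and IDs below are placeholders, not real systems.
import requests

LRS_ENDPOINT = "https://lrs.example.com/xAPI"  # hypothetical LRS
AUTH = ("lrs_key", "lrs_secret")               # hypothetical credentials

statement = {
    "actor": {"objectType": "Agent",
              "name": "Jane Doe",
              "mbox": "mailto:jane.doe@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "https://example.com/courses/loan-risk-101",
               "definition": {"name": {"en-US": "Loan Risk 101"}}},
}

response = requests.post(
    f"{LRS_ENDPOINT}/statements",
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()
print(response.json())  # the LRS returns the id(s) of the stored statement
```

The point is that xAPI captures actor-verb-object statements about almost any learning experience, not just course completions inside an LMS.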
This means leveraging data to be more focused, efficient and aligned with your organisation’s strategy. It should lead to better decision making, more action, more automation and provable impact on the business – not just dashboards.
How to start?
Don’t think just dashboards. They trap you in a world of reading what IS the case, rather than deciding what SHOULD be the case. We need to derive an OUGHT from an IS, pushing beyond dashboards to decisions, actions and the automation of process.
What we need is a strategic view of data use. Here’s the good news: a schema has existed for a long time in data science, classifying data use into five areas:
5 Levels of data use in learning:
Level 1: Describe
Level 2: Analyse
Level 3: Predict
Level 4: Prescribe
Level 5: Innovate
This allows you to think ambitiously about data, moving beyond mere description (dashboards) towards using it to improve and shape teaching and learning.
Level 1: Describe
What does learning data tell us about what is happening?
Descriptive data tells us what is the case: it describes learners, their behaviour and the technology they use. That’s what dashboards do. This is the simple world of tracking and visualisation. Don’t get stuck with dashboards only – they are merely descriptive.
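As an illustration, descriptive use of data can be as simple as counting events. A toy sketch in Python, with invented, simplified statements:

```python
# Minimal sketch of Level 1 (describe): counting completions per course
# from simplified xAPI-style statements (toy data).
from collections import Counter

statements = [
    {"actor": "jane@example.com", "verb": "completed",   "object": "loan-risk-101"},
    {"actor": "sam@example.com",  "verb": "experienced", "object": "loan-risk-101"},
    {"actor": "ali@example.com",  "verb": "completed",   "object": "gdpr-basics"},
]

completions = Counter(s["object"] for s in statements if s["verb"] == "completed")
for course, count in completions.most_common():
    print(course, count)  # this is essentially all a dashboard shows you
```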
Level 2: Analyse
What does learning data tell us about why it’s happening?
Analysis gives you deeper insights into your data, evaluation, business performance and ROI, and may, even at a simple level, provide useful insights for informing decisions and action. Don’t worry, the software should do the analysis for you – you don’t have to become a data scientist!
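A small, hypothetical example of the kind of analysis the software might do for you: comparing completion rates and time-on-task across departments to suggest why completion differs. The data below is invented.

```python
# Minimal sketch of Level 2 (analyse): asking *why* completion differs,
# by comparing completion rate and time-on-task per department (toy data).
import pandas as pd

records = pd.DataFrame({
    "department": ["Sales", "Sales", "Ops", "Ops", "Ops", "Finance"],
    "completed":  [True,    False,   True,  True,  False, True],
    "minutes":    [35,      12,      40,    28,    9,     31],
})

summary = records.groupby("department").agg(
    completion_rate=("completed", "mean"),
    median_minutes=("minutes", "median"),
)
print(summary)  # low completion paired with low time-on-task hints at disengagement
```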
Level 3: Predict
What does learning data tell us about what is likely to happen?
Predictive data tells you what organisational, group or individual learner performance is likely to be: it can predict performance, predict dropout and recommend action. Recommendation engines drive most of what you do online (Google, social media, Amazon, Netflix). This allows you to deliver personalised learning.
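As a hedged illustration only, a dropout predictor can be as simple as a logistic regression over a few engagement features. The features and numbers below are invented; a real model needs far more data and proper validation.

```python
# Minimal sketch of Level 3 (predict): a toy dropout-risk model.
from sklearn.linear_model import LogisticRegression

# features: [logins in first week, fraction of modules opened, avg quiz score]
X = [
    [1, 0.10, 0.40],
    [5, 0.80, 0.75],
    [0, 0.05, 0.20],
    [7, 0.90, 0.85],
    [2, 0.30, 0.55],
    [6, 0.70, 0.65],
]
y = [1, 0, 1, 0, 1, 0]  # 1 = dropped out, 0 = completed

model = LogisticRegression().fit(X, y)
new_learner = [[2, 0.25, 0.50]]
print(model.predict_proba(new_learner)[0][1])  # estimated dropout risk
```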
Level 4: Prescribe
What does learning data tell us should happen?
This is where data makes things happen. Nudges and other push techniques can be executed, spaced practice scheduled, and personalised, adaptive learning delivered. The software literally uses data to enact something for real. This is how data is actually used in the real world: to automate processes.
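A minimal sketch of prescription: a crude expanding-interval rule for spaced practice that turns a recall score into an action (a nudge). The intervals and thresholds are illustrative, not a validated schedule.

```python
# Minimal sketch of Level 4 (prescribe): data triggers an action (a nudge).
from datetime import date, timedelta

def next_review(last_review: date, recall_score: float) -> date:
    """Shaky recall -> review soon; strong recall -> wait longer."""
    if recall_score < 0.6:
        interval = timedelta(days=1)
    elif recall_score < 0.85:
        interval = timedelta(days=4)
    else:
        interval = timedelta(days=10)
    return last_review + interval

def nudge_due(learner: dict, today: date) -> bool:
    return today >= next_review(learner["last_review"], learner["recall_score"])

learner = {"last_review": date(2024, 5, 1), "recall_score": 0.5}
if nudge_due(learner, date(2024, 5, 3)):
    print("Send reminder: time to practise loan-risk scenarios")
```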
Level 5: Innovate
How can learning help us innovate?
Beyond the first four levels lies the use of data for more innovative purposes in learning, such as sentiment analysis, content creation, curation and chatbots. There is a wide array of data-driven techniques that can be used to bring learning into the 21st century.
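As a toy illustration of just one of these, sentiment analysis of free-text course feedback can start as simply as a word-list score. A real system would use a trained model; this only shows the shape of the idea, with invented comments.

```python
# Minimal sketch of Level 5 (innovate): crude lexicon-based sentiment scoring
# of course feedback. Word lists and comments are illustrative only.
POSITIVE = {"useful", "clear", "engaging", "relevant"}
NEGATIVE = {"boring", "confusing", "long", "irrelevant"}

def sentiment(comment: str) -> int:
    words = set(comment.lower().replace(",", " ").split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

feedback = [
    "Clear and relevant, really useful scenarios",
    "Too long and a bit boring in the middle",
]
for comment in feedback:
    print(sentiment(comment), comment)
```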
False idols
One can decide to let the data simply expose weaknesses in the training. This requires a very different mindset, where the whole point is to expose weaknesses in design and delivery. Is it too long? Do people actually remember what they need to know? Does it transfer? Again, much training will be found wanting. To be honest, I am somewhat doubtful about this. Most training is delivered without much in the way of critical analysis, so it is unlikely to happen any time soon.
One could also look for learning insights into ‘how’ people learn. I’m even less convinced on this one. Recording what people just ‘do’ is not that revealing if the courses are clickthrough affairs requiring little cognitive effort. Just showing them video, animation, text and graphics, no matter how dazzling, is almost irrelevant if they have learnt little. This is a classic GIGO problem (Garbage In, Garbage Out).
Some imagine that insights are buried in there and that they will magically reveal themselves - think again. If you want insights into how people actually learn, set some time aside and look at the existing research in cognitive science. You’d be far better looking at what the research actually says and redesigning your online learning around that science. Remember that these scientific findings have already gone through a process of controlled studies, with a methodology that statistically attempts to get clean data on specific variables. This is what science does – it’s more than a match for your own harvested data set.
Business relevance
Learning departments need to align with the business and business outcomes. Looking for correlations between, say, increases in sales and completed training gives us a powerful rationale for future strategies in learning. It need not be just sales: whatever outcomes the organisation has in its strategy need to be supported by learning and development. This may lift us out of the constraints of Kirkpatrick, cutting to the quick, which is business or organisational impact. We could at last free learning from the shackles of course delivery and deliver what the business really wants.
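A minimal sketch of that correlation exercise, with invented regional numbers. Correlation is not causation, but it starts a business-aligned conversation.

```python
# Minimal sketch: does regional training completion correlate with sales uplift?
# Numbers are invented for illustration.
from scipy.stats import pearsonr

completion_rate = [0.45, 0.60, 0.72, 0.80, 0.55, 0.90]  # per region
sales_uplift    = [0.02, 0.04, 0.06, 0.07, 0.03, 0.08]  # per region

r, p_value = pearsonr(completion_rate, sales_uplift)
print(f"r = {r:.2f}, p = {p_value:.3f}")
```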
Another model is to harvest data from training in a diagnostic fashion. To give a real example, the employees of a global bank were put through simulation training on loan risk analysis, and the problems turned out not to be what had been imagined - handing out risky loans. In fact, in certain countries, they were rejecting ‘safe’ loans - being too risk averse. This deep insight into business process and skills weaknesses is invaluable. But you need to run sophisticated training, not clickthrough online learning. It has to expose weaknesses in actual performance.
Business
You may decide to just get good data and make it available to whoever wants to use it, a sort of open data approach to learning. But be careful. Almost all learning data is messy. It contains a ton of stuff that is just ‘messing about’ – window shopping. In addition to the paucity of data from most learning experiences, much of it sits in odd data structures and odd formats, is encrypted, spread across different databases, old, or even useless. Even if you do manage to get a useful, clean data set, you have to go through the process of separating ‘personal’ data from ‘observed’ data (what you observe people actually doing), ‘derived’ data (deductions made from that data) and ‘analysed’ data (the results of applying analysis to it). You may have to keep it anonymised, and the privacy issues may be difficult to manage. Remember, you’ll need real expertise to pull this off, and that is in very short supply. An LRS (Learning Record Store), such as Learning Locker, is a good start.
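One small, practical step is pseudonymising personal identifiers before any analysis, keeping ‘personal’ data apart from ‘observed’ data. A sketch, with a placeholder salt; this alone does not make a data set privacy-compliant.

```python
# Minimal sketch: pseudonymise learner identities before analysis.
# The salt is a placeholder and must be kept secret under your privacy policy.
import hashlib

SALT = "replace-with-a-secret-salt"

def pseudonymise(email: str) -> str:
    return hashlib.sha256((SALT + email.lower()).encode()).hexdigest()[:16]

raw_record = {
    "email": "jane.doe@example.com",   # personal
    "course": "loan-risk-101",         # observed
    "minutes": 42, "score": 0.78,      # observed
}

clean_record = {
    "learner_id": pseudonymise(raw_record["email"]),
    "course": raw_record["course"],
    "minutes": raw_record["minutes"],
    "score": raw_record["score"],
}
print(clean_record)
```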
Conclusion
There are plenty of learning technologists saying their new strategy is data collection in 'learning record stores' and 'learning analytics'. On the whole, this is admirable, but the danger lies in spending time and effort without asking ‘Why?’ Everyone’s talking about analytics, but few are talking about the actual analysis and how it will increase the efficacy of the organisation. Some are switched on and know exactly what they want to explore and implement; others are like those people who never throw anything out and just fill up their home with stuff – they’re not sure why.
Learning analytics is too often seen as just the first of these levels, when the full ladder runs:
Level 1: Describe - Dashboards
Level 2: Analyse - Insights
Level 3: Predict - Foresight
Level 4: Prescribe - Action
Level 5: Innovate - Innovative actions
Of course, all of the above is fine in theory. In practice, organisations have different capabilities. As usual with new paradigms, there is a maturity curve, although that involves a wider set of criteria, including:
People/culture
Systems/processes
Technology/resources
Having spent the last few years doing all of the above, I think we are about to enter a new era, where smarter software (AI and data-driven) will deliver smarter solutions. I now see real clients using data not just to produce dashboards but to drive engagement, learner support, content creation, curation, assessment, sentiment analysis, chatbots and so on. My book ‘AI for Learning’ is now available on Amazon. Happy to help with any of this stuff… DM me on Twitter or contact me on the form here…