Friday, May 18, 2018

A/B testing shows that Pavlovian gamification does not work

A/B testing
One of the benefits of the data revolution is that new data techniques can be used to give insights into what works and what does not work in learning. A/B testing is one such technique. It is widely used in digital marketing and something the world’s largest tech companies routinely use – Google, Facebook, Twitter, Amazon, Netflix and so on. You try two things, wait, measure the results and choose the winner. It only works well when you have large numbers of users, and therefore data points, but it allows for quick comparative testing and evaluation. We are now seeing it being used in education, and one of the first results is surprising.
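To make the mechanics concrete, here is a minimal sketch in Python of how such a comparison might be analysed. The group names and scores are invented for illustration only; a real test would run over far more learners.

# Minimal A/B comparison sketch: two groups of learners get different
# lesson plans, each sits the same short post-test, and we compare means.
# The scores below are invented for illustration only.
from scipy import stats

group_a = [62, 71, 58, 66, 74, 69, 63, 70, 65, 68]  # hypothetical: lesson plan A
group_b = [55, 60, 64, 52, 61, 58, 66, 57, 59, 62]  # hypothetical: lesson plan B

# Welch's t-test: does the difference in mean scores look real,
# or could it plausibly be noise?
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)

print(f"Mean A: {sum(group_a)/len(group_a):.1f}")
print(f"Mean B: {sum(group_b)/len(group_b):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests the lesson plans really do differ in effect;
# with only a handful of learners the test has little power, which is
# why A/B testing needs large numbers of users.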
A/B testing on lesson plans
Benjamin Jones, at Northwestern University, wanted to know which lesson plans were more successful than others, so he randomly assigned different lesson plans in a series of A/B tests and waited for the results. His EDU STAR platform delivered the plans and harvested the scores from short tests, to see which lesson plans produced better learning. One of his first A/B tests was on the teaching of fractions, using gamified versus non-gamified lesson plans. One group did a straight 'Dividing Fractions' lesson, the other a 'Basketball Dividing Fractions' lesson. This was an exciting experiment, as many thought that gamification was literally a game changer, a technique that could significantly raise the efficacy of teaching, especially in maths. So what happened?
Ooops!
In came the results. The gamified lesson plan fared worse than the non-gamified lesson plan. There are many possible reasons for this: the extra cognitive load required for the mechanics of the game, loss of focus on the actual learning, time wasted and so on. Interestingly, the kids spent more time in the gamified lesson (on average 4.5 minutes longer) but learnt less, suggesting that the extra engagement may be outweighed by shallower processing and poorer learning. But all we need to know at this point is that gamification fared badly when compared to more straightforward teaching methods. Interesting.
Primitively Pavlovian
There is a growing body of evidence pointing towards ‘gamification’ not being the pedagogic silver bullet that many imagine. The intuitive and popular appeal of computer games, combined with overactive marketing from some vendors, may be doing more harm than good. I have pointed towards negative results in previous articles and suspect that the primitive, Pavlovian techniques commonly employed – leaderboards, rewards and badges – are of less use than the more deeply structural techniques, such as levels, allowing for failure and simulation. Unfortunately, the Pavlovian stuff is easier to implement. This is a complex area that requires unpacking, as ‘gamification’ is a broad term that includes many techniques.
A word on research…
A/B testing may be the one saviour here, in that educational techniques can be tested individually, quickly and cheaply. Traditional research takes ages and is costly: schools need to be contacted, students selected, administration completed – all of which takes time, lots of time. Randomised online experiments, by contrast, can be quick and cheap, so online learning has lots to gain. A/B testing can improve interface design and lower cognitive load, but it can also quickly identify efficacious interventions. Adding a simple ‘Learn More’ button increased sign-ups to Obama’s campaign website – something identified through A/B testing.
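The same logic applies to conversion-style tests like the ‘Learn More’ button. A rough sketch of comparing two sign-up rates, again with invented counts and variant labels:

# Sketch of a conversion-rate A/B test: compare sign-up proportions
# between two page variants. All counts are invented for illustration.
from scipy.stats import norm

signups_a, visitors_a = 520, 10000   # hypothetical: original page
signups_b, visitors_b = 610, 10000   # hypothetical: page with 'Learn More' button

p_a = signups_a / visitors_a
p_b = signups_b / visitors_b
p_pool = (signups_a + signups_b) / (visitors_a + visitors_b)

# Two-proportion z-test under the pooled null hypothesis of no difference.
se = (p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
z = (p_b - p_a) / se
p_value = 2 * norm.sf(abs(z))

print(f"Sign-up rate A: {p_a:.1%}, B: {p_b:.1%}")
print(f"z = {z:.2f}, p = {p_value:.4f}")
# With tens of thousands of visitors, even a small lift in sign-up rate
# becomes detectable – which is why the big web platforms run these tests constantly.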
Bibliography
Stephens-Davidowitz, S. (2017) Everybody Lies. p. 276.

Jones, B. (2012) Harnessing Technology to Improve K-12 Education. Hamilton Project Discussion Paper.
