Tuesday, February 20, 2018

We don’t need no stinkin’ badges? Why the badges movement has literally run its course

I’d have loved the idea of learning badges to work – a motivational dynamo, more fine-grained rewards and accreditation. The inconvenient truth is that the idea has failed. This is not for want of trying but a classic case of supply not matched by demand. To put it another way, we built it and they didn’t come to the party. Sure, you’ll find some localised examples of success but overall, as a significant movement, it has run its course – few are now interested.

1. Lack credibility
The main problem has been credibility. When accreditation is not anchored in a major accreditation body with quality and standards, it carries no real weight in the real world. You are up against recognised accreditation with branding, marketing, frameworks, objective assessment and longevity. Overbadging and weak badging have added to the problem. Badge projects are here today, gone tomorrow – mosquitoes, not turtles.

2. Lack objectivity
A lack of objectivity, in terms of recognition in the real world, has plagued their progress. What happens when you take your badges outside your institution or course, and no one has heard of them or cares? Simply badging content is a mistake. This is about real people finding them useful, not lapel badges. If your currency is not recognised in the currency exchange, you’re left with useless paper.

3. Motivationally suspect
They were always motivationally suspect. Extrinsic rewards should always be treated with suspicion. And there is something suspect about badges for online, but not offline, learning. You can’t slice and dice learning by mode of delivery. The ‘overjustification effect’ shows that intrinsic motivation decreases when external rewards are given merely for completing a particular task or doing minimal work. This is not to say that all extrinsic motivation is useless, only that superfluous extrinsic motivation is damaging to learning. The failure to escape this trap is a major problem for most badge schemes.

4. Not really gamification
The idea that they are a great gamification feature is misleading. Zsolt Olah of Amazon says, “It was an easy target for shallow gamification (look, here’s a lot of points to take useless courses to see yourself on the leaderboard and show off your badge) on the LMS. Folks, people don’t give blood because they get a sticker. It’s the other way.” Pavlovian rewards have a limited effect on learning, which is why so much Pavlovian gamification – leaderboards, collecting badges and so on – runs out of steam. Real gamers are intrinsically motivated by the game, its reputation, their experiences of games, their peers’ views of games and so on. They do not buy and play games because of the scoring system or badges. Bad learning games and gamification techniques are often just a pale imitation of massively popular gaming.

5. Badges don't travel
When your badges get stuck in a proprietary system, repository or e-portfolio, with little in the way of interoperability, they’re effectively imprisoned. Badges are often rendered useless by their failure to escape the bounds of their small ecosystems, technical and cultural. Mozilla have, since 2011, tried to provide a framework and structure. I applaud their efforts but the early paper “Open Badges for Lifelong Learning” was hopelessly utopian. A more achievable vision was needed. The most successful badge system I’ve seen is in IBM – but it is in IBM – that’s it. They tend to remain stuck and siloed inside the organisation that promotes them. Badges don’t travel well.
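
To be fair to Mozilla, the Open Badges specification did at least define a portable format. Roughly, a hosted badge assertion is a small piece of JSON like the sketch below (written here as a Python dict, with field names recalled from the 1.x spec and invented URLs, so treat it as illustrative rather than definitive). Portability was never really the technical problem; the problem was that nobody outside the issuing silo asked to see it.

# Minimal Open Badges 1.x-style assertion - sketched from memory of the spec,
# with hypothetical URLs; illustrative only.
badge_assertion = {
    "uid": "abc-123",
    "recipient": {
        "type": "email",
        "hashed": True,
        "identity": "sha256$<hash of the earner's email>",
    },
    "badge": "https://issuer.example.org/badges/clinical-audit.json",    # BadgeClass metadata (hypothetical)
    "verify": {
        "type": "hosted",
        "url": "https://issuer.example.org/assertions/abc-123.json",     # where this assertion is hosted (hypothetical)
    },
    "issuedOn": "2018-02-20",
}
print(badge_assertion["verify"]["url"])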

6. Awful branding
Another problem was branding. Making your badges look like silly clip-art stickers makes the whole thing look amateurish. For badges to work they needed serious marketing and design – Mozilla tried, but what we got was almost no marketing and sometimes comically bad design. In addition, it always had that boy scout, girl guide feel – something suitable for earnest young people but not adults. Perhaps the word ‘badge’ itself was the mistake – something with almost trivial connotations.

7. Mis-measurement
When people started to get badges for simply attending conferences, I got worried. The motivation for conference attendance is not always learning; it is often the extrinsic reward of travel and time off. How do you measure the usefulness of that attendance? We could ask: did you tweet out the sessions, blog and distribute your findings to your fellow employees, write a paper suggesting new implementations based on what you learnt? Badges for just turning up don’t wash with me. A real problem here is that badges often don’t match real learning and are rarely measured in terms of impact.

Conclusion
We need less, not more, credentialism. Badges were always a bit childish and tacky. Employers don’t ask for them, people don’t care about them and they’ve become meaningless artefacts in systems that put the artefacts of learning above actual learning. Whether you see badges as motivational devices, credentials, actual assessments or even evaluation tools, if they don’t catch on they’re of little use. And they haven’t caught on – in short, they’re dead in the water.

Sunday, February 18, 2018

Tyranny of time – why learning often wastes time...

Our learning world, at all levels, in schools, HE and the workplace, offline and online, suffers from one obsession that leads to massive waste and low productivity: an obsession with time. In these Covid times, we would be well advised not to carry over our bad timetabling habits into the online world.

The metrics almost universally cost ‘teaching and learning’ like sausages, by the pound or kilo: face time, contact time, fixed-length courses, and hours of learner time in online learning. All are metrics that work against efficient delivery. The tyranny of time is the disease that disables the learning world, and by altering how we see ‘learning time’, a lot of time, money and wasted effort by teachers and learners could be saved.

Higher Education
The one-hour lecture, that pedagogic staple in HE, is an hour long simply because the Sumerians had a base-60 number system – hence the ‘hour’. It bears no relation to the psychology of attention or efficient pedagogy. It is quite simply slavish adherence to a fossilised method of delivery that is easy for faculty to timetable. Even then, attendance is often appalling (40% don’t turn up, even at Harvard), and often not recorded, rendering even rudimentary attempts at measurement meaningless. University terms still adhere to a mixture of old professional timetables (primarily the law profession), religious festivals and an 18th-century agricultural calendar, with long holidays that are periods of forgetting. Fixed three- and four-year degree courses with only one start date per year take no account of actual needs. Oh, and let’s build and market Masters degrees so that we can add yet another year. Nowhere is the tyranny of time more crude and obvious than in Higher Education.

Schools
Similarly in schools, which mimic universities, as the two must be kept in sync. This is another form of tyranny, as schools are now their feeders, despite the fact that the majority of young people do not take that route. The ‘period’ in schools mimics the ‘lecture’: millions of young people pack up, stand up and shuffle through crowded corridors to another identical room, where they have to unpack, sit down and settle again. The waste of time is immense. Imagine running a company where all employees have to rise on the hour and move somewhere else. Can’t teachers walk? And again the tyranny of an ancient calendar, where unhealthy doses of forgetting punctuate the year, determines the rhythm of learning, which should be steady... not full on, nothing, full on, nothing.

Workplace
An obsession with padded-out ‘courses’, from compliance to whatever fad arises (Emotional Intelligence, NLP, Mindfulness and so on), means days of wasted time on courses that have little or no effect on performance. Get people to travel from all over, then batch them through dull rooms with round tables, bowls of mints, coloured pens and some half-baked attempt at collaboration: throw out a vague question, discuss at the table, feed back on flipchart paper, which gets pinned on the wall, then the promise that the results will be sent to you – they never are. These courses are always delivered by the half-day, the full day or, worse, days on end, and when it comes to impact, adherence to a ridiculous mode of evaluation (Kirkpatrick) means very little is meaningfully measured.

Online learning
Just as bad is online learning, bought and sold by the ‘learner hour’, mimicking the university and school model. Rather than focus on value and the idea that this really can save time, it encourages vendors to over-deliver so that they can charge more. The net result is overdesigned content, with oodles of meaningless, illustrative graphics, thinly punctuated by multiple-choice questions, and maybe some Pavlovian gamification (so that a premium price can be charged). Even MOOCs were foolishly designed to match university semesters, with a drip feed of content over up to 10 weeks – and they wonder why people fell by the wayside.

What to do?
So the tyranny of time comes in many guises: the lecture, the period, the semester, the term, the course, clickthrough online learning and the degree. Some make it worse by recommending lifelong learning in the form of going back to college – life as one long course. No thanks. Life is far too short for that nonsense. By and large, all of these take too long, as they suffer from the following flaws:

1. Fixed form of delivery
Most ‘teaching and learning’ is shaped by pre-existing, fixed modes of delivery: the lecture, period, term, module, course and so on. This is ass before elbow – delivery should be shaped by the type of learning, the needs of learners and the resources available, not the other way round. The solution is to imagine that the learning experience doesn’t exist, take it back to a blank slate and re-design. Match modes of delivery to the typology of learning, learning needs and resources. Look to make everything shorter and more efficient for the learner. Some call this Blended Learning – but that doesn’t mean a bit of online bolted on to a bit of classroom; let’s call that Velcro Learning. And don’t confuse Blended ‘Learning’ with Blended ‘Teaching’, where you simply slice and dice a bit of your old and new delivery methods and call it a ‘Blend’. Escape the tyranny of time and focus on value.

2. Sheep dip
Most teaching is a one-off event. It is ridiculous not to record lectures, even if you think the lecture is a poor form of pedagogy (which I do). Denying learners a second and third bite of the cherry makes no sense – they may be ill, miss points, not understand at first pass, have trouble taking notes, or be learning in a second language. Above all, the psychology of learning shows that repeated access, for reinforcement and retrieval through revision, is necessary for efficient learning. There is a strong argument for doing the same in schools. I’ve seen this work magnificently in an Italian school, yet few have ever thought about doing it.

3. Forgetting
Let’s not forget that single, fixed, timetabled events ignore a well-known principle in learning – that the brain forgets almost everything it’s taught. Ebbinghaus showed us this in 1885, and the learning world has studiously ignored the principle that learners need not mere repetition but retrieval and deliberate practice. Learning needs to be repeatedly accessible, from recorded lectures right through to spaced-practice techniques such as topping and tailing, note taking and repeated testing, up to algorithmically determined, personalised deliberate practice. Deliberate, effortful, spaced practice frees learners from the tyranny of single-event, sheep-dip learning.
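
Ebbinghaus’s curve is usually modelled as an exponential decay of retention, R = e^(-t/S), where t is time since study and S the strength of the memory. A minimal sketch of the sort of spacing algorithm hinted at above – illustrative only, with arbitrary factors, not any particular product’s method – simply widens the review interval each time an item is successfully retrieved:

import math

def retention(days_since_review, strength):
    """Ebbinghaus-style forgetting curve: R = e^(-t/S)."""
    return math.exp(-days_since_review / strength)

def next_review_interval(interval_days, recalled):
    """Widen the gap after a successful retrieval, shrink it after a lapse.
    The factors 2.0 and 0.5 are arbitrary, illustrative choices."""
    return interval_days * 2.0 if recalled else max(1.0, interval_days * 0.5)

# Example: reviews at 1, 2, 4, 8, 16-day intervals while recall holds.
interval = 1.0
for session in range(5):
    predicted = retention(interval, strength=3.0)
    print(f"session {session}: next review in {interval:g} days, "
          f"predicted retention by then {predicted:.2f}")
    interval = next_review_interval(interval, recalled=True)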

4. Batching
Courses tend to batch learners who have to go through the linear course at the same pace. In any group you will have a distribution curve, where you only hit those in the middle. There will be tails of learners who find the experience too slow or too fast. Personalised delivery, now possible through adaptive, online learning, allows you to deliver learning to an individual, informed by their progress and aggregated data from all who took the course before. This results in increased attainment and lower dropout.
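
As a sketch of the idea (not any vendor’s actual algorithm), ‘adaptive’ can be as simple as keeping a running mastery estimate per topic and always serving the learner’s weakest topic next, instead of marching everyone through the same sequence at the same pace:

class AdaptiveSequencer:
    """Toy adaptive sequencer: serve whichever topic the learner currently knows least well.
    Mastery is a crude running average of recent correctness - illustrative only."""

    def __init__(self, topics):
        self.mastery = {topic: 0.0 for topic in topics}

    def next_topic(self):
        # Pick the topic with the lowest mastery estimate.
        return min(self.mastery, key=self.mastery.get)

    def record(self, topic, correct, weight=0.3):
        # Nudge the estimate towards 1 on a correct answer, towards 0 on an error.
        target = 1.0 if correct else 0.0
        self.mastery[topic] += weight * (target - self.mastery[topic])

seq = AdaptiveSequencer(["dosage", "contraindications", "escalation"])
seq.record("dosage", correct=True)
seq.record("contraindications", correct=True)
seq.record("escalation", correct=False)
print(seq.next_topic())   # -> 'escalation', the weakest topic comes next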

5. Less is more
In designing learning experiences, the ‘garbage in, garbage out’ rule is not taken seriously enough. I’ve seen far too many long compliance documents and over-engineered courses throw far too much detail at learners. Lecturers pad out lectures to fit their ‘hour’. Course designers fill out a timetable with unnecessary content and activities. The net result is lower learning, retention and recall. Cognitive overload means less, not more, is retained. Research from large data sets has shown that engagement with learning video tends to fall off a cliff at around six minutes. The consequence is that videos should be that length or shorter.

The psychology of learning screams ‘less is more’ at us. Cut down documents until they bleed, then cut them down again, so that the content is learning-ready. There are few courses I’ve seen that couldn’t have 25-30% cut out – all the padding. There is no doubt that lecturers pad out to the hour, and the same goes for classroom teachers and organisational trainers. Rather than plan to fill the time, as if the learner were an empty vessel to be topped up, look at making the learning experience as short as possible. Think about what learners ‘must’ learn, not what they generally ‘could’ learn. Of all the techniques to free learners from the tyranny of time, this one is by far the most productive. This is precisely what we’ve been doing with WildFire – using AI to cut down on content creation time and learner time with crisp, open-input, online learning.

6. Failure to chunk
Chunking is a pretty basic piece of learning theory – our working memory is limited, so throwing overlong learning experiences at the learner is counterproductive. Yet it happens all the time. We teach people to write essays by repeatedly getting them to ‘write essays’ rather than breaking the task down into its constituent parts. Whole-word teaching was an almost perfect example of this approach, and it resulted in catastrophic failure in reading in UK schools. Learning experiences have to have focus.

7. Digital by default
Rethinking learning around, not existing modes of delivery and fixed timetables, but more flexible methods that suit the type of learning, the learners and the resources, is badly needed. More often than not this means 365/24/7 availability by being online. Being digital by default, wherever practical, turns timetabled learning experiences into anytime learning. Asynchronous often makes more sense than synchronous, even if it’s just recorded lectures and resources. Switch away from a dependence on courses to an on-demand model.

Conclusion

In practice, as you get older and become a more self-sufficient learner, you realise that freedom from the tyranny of time is the real trick to learning. You ‘learn’ how to learn through measurement, focus, rehearsal and retrieval – by avoiding the waste of time that courses and degrees so often are. That’s lifelong learning. Life is short, and it’s made even shorter by wasting so much time learning and not living.

Tuesday, February 13, 2018

Woe is me… my 10 days being counselled by a chatbot (woebot)…

Woebot is a counselling chatbot. I’m not big on mentors and counsellors, preferring the “get a life not a coach” approach. What I liked most was the anonymity of the experience. I’m pretty sure most people don’t actually want to go to a parent, teacher, faculty member or a stranger with their problems and would relish an anonymous service. The clinical paper on Woebot suggests that this is the case. So I gave it a go, for research purposes only, you understand…

Day 1
Started with a series of friendly exchanges, where you have little choice in options, but that’s fine – it sets the tone. A couple of things I liked about the first exchanges:

Sorted out a technical issue seamlessly – rerouting me to messenger.com – that was nice. It also linked to the Stanford clinical trial on the bot, comparing it with a non-bot intervention – although the sample size is small, it’s impressive. It is also honest about the limitations of a bot – it doesn’t overpromise.

You do get sucked into thinking it has human agency, even though it’s just coding, pre-scripting and maths. What’s strange is that most of the exchanges are single button presses – not dialogue at all, but quite interesting, as they flip the counsellor and counselled roles around. You are the one giving the open prompts, such as ‘How’, ‘Tell me more…’, ‘Oh’, ‘Sure’, ‘No doubt’, ‘Absolutely’.

Emojis are dropped in for variety and are useful (at last), as they really are asking for an emotional response – that’s interesting and not easy to do F2F. The unlocked padlock emoji is nice, as is the little sapling for hope and progress – sounds hokey, but it’s not.

What’s nice is that the interface is so simple and natural. You focus on what’s being said and asked, and in a context where you’re asked to think and reflect on your own feelings and behaviour, that’s useful. The dialogue is natural, easy and seems very human.

The up-front promise of absolute anonymity is also good and I can see why this would appeal to people (I’d imagine the majority) who want help but are too shy or embarrassed to come forward. To be honest, I don’t want some random person counselling me… I want the distance.

The first lesson from woebot was to avoid the language of extremes – “all good”, “all bad”, “always” and to adopt a more measured language. All good… ooops!

One small thought here, I’d have liked this as audio. I’m working with a tool that allows learners to input answers by voice – it’s neat.

The first session was 74 small exchanges, then she said “Bye. Speak again tomorrow.”

Day 2
Prompted me at 10.53, when I was active on Facebook, and asked politely if I wanted to continue. This time we’re onto multiple-choice questions about ‘all or nothing thinking’ and ‘should’ statements. I quite like the upbeat tone and lively feedback – it seems appropriate in a session like this. I’m typing in more, rather than just accepting responses – it feels more like dialogue. Just 5 mins – short but sweet. I could get used to this.

Day 3
Had two days in London, so no time to do anything, but Woebot was patient.

“No worries, talk soon.” You have the option of continuing, rescheduling or waiting for the daily prompt. This, of course, is one of the great advantages of online counselling, indeed of online anything: it’s 365/24/7. You do it when you feel like doing it, not when an expensive counsellor timetables you into their practice.

Day 4
Starts by asking me about my mood (emoji input from me). Gives me options:
‘Work on stuff’, ‘Teach me’ or ‘Curated videos’ – not sure about these – I don’t want to ‘work on stuff’, a ‘teacher’ or a ‘curator’ – the first really dissonant point. However, I fancied a video…

OK then.. here are some of my fav's:
1. Emotion Stress and Health (Crashcourse)
2. David Burns, MD TED
3. A video to help with sleep
4. Language is Important (featuring Me!)
5. Overcoming negative voices
6. Don't trust your feelings!
8. The worlds most unsatisfying video
9. Funny cats!
10. The importance of flattery

This led me, weirdly, to Reggie Watts – I know him – hilarious and talented, but this is a tangent… or maybe not… I felt like some fun…

Actually Reggie will really mess with your mind… he’s way out there… so I’m not sure how suitable that was for someone who really is on the edge…

Now a quick reflection: a real, human therapist can’t easily do this – direct you to something really, really interesting – you’re sort of stuck in dialogue.
Woebot says see ya tomorrow – an odd session, but fun.

Day 5
The whole thing is very upbeat and chatty… Then it came up with SMART objectives – getting a bit jargonish – not sure about this. It actually popped in a joke today – quite funny. But SMART objectives – really? I’m getting a sense of CBT being a bit flakey – a bag of bad management-technique marbles.

Day 6
That was good - tracking my mood…

Oh no it’s on to ‘mindfulness’ – but in for a penny, in for a pound of bullshit…
“Mindfulness is the opposite of mindlessness,” it says, breaking its previous advice not to fall for the language of extremes… I tried disagreeing with Woebot here, but it was having none of it – clearly not listening; in short, not mindful.

Now a breathing exercise – 10 mindful breaths.

Day 7
Long quiz – not sure about this – far too long
Feedback – “Your greatest strength is your love of learning! You are just like Hermione Granger from ‘Harry Potter’”
That was hopeless – trite and I hate Harry Potter….

Day 8
Got a bit technical with ‘should statements’ – not so sure that this area of CBT is entirely clear – seems a bit simplistically linguistic.

Day 9
Asked me to talk about labels I use about myself – a reasonable question – and promised research tomorrow. I didn’t like the way it cut this short – it should allow me to go on if I want.

I think I prefer chatbots on-demand, like Replika, which you just tap on your phone to speak to. Replika is famous for teasing out the most intimate of thoughts from its 1.5 million users. It uses ‘cold-reading’ techniques from magicians, who claim to read minds.

Ellie’s another, created for DARPA. Designed to help doctors at military hospitals detect post-traumatic stress disorder, depression, and other mental illnesses in veterans returning from war, but is not meant to provide actual therapy, or replace a therapist. There is good evidence that people are more likely to open up to a bot than a person.

Day 10
Today is about adopting a Growth Mindset. Good to see something a little more solid, as it reduces my general scepticism about therapeutic techniques, which seem to be a mixed bag of populist methods almost thrown together…

Woebot wants to tell me a story to explain; I say yes… A story about Woebot being told it was smart, believing it was smart, but not really being smart. This led to the wrong mindset – unable to cope with setbacks and failure. Fixed mindsets are bad, so open yourself up to always learning and developing – be more open and fluid in your thinking. Be more accepting of setbacks and mistakes. Get out of the polarised ‘smart v stupid’ labels. It then gave a link to a Carol Dweck video – these video links are good. A good session.

Conclusion
It has its limitations and oddities, but it’s good to chat to something that doesn’t judge you and has a few surprises up its sleeve. Woebot is a bit of fun. Then again, I don’t feel I’m in need of help; many do, and if I found it interesting, they are far more likely to get something out of the experience. You always have the chance of accepting, rescheduling or saying no to Woebot – which is useful. I’m often too busy or not in the mood for therapy, but the fact that it is ‘pushed’ out to you is a real plus. I rather like its daily prompts – a bit reassuring and a bit of fun. Try it – you just might learn something – even about yourself.

Thursday, February 01, 2018

Online healthcare learning - in minutes not months

Healthcare is a complex business. There are so many things to learn, so much new knowledge to constantly master. The sector is awash with documents, from compliance to clinical guidelines, all with oodles of detail and never enough time to train, retain and recall. As it is patients’ health, even lives, that matter, there is little room for error. Yet so much training is still delivered via lectures and PowerPoint in rooms full of professionals who are badly needed on the front line. There must be a better way to deliver this regulatory and clinical knowledge.
Online learning is part of the solution, but traditional online learning takes months to produce, and even one 50-page clinical guideline is often prohibitively expensive. With this in mind, rather than use tools where most of the budget goes on graphics and not interaction, AI is producing tools that do this for you. One of those tools is WildFire, a service that creates high-retention online learning in minutes, not months, at a fraction of previous costs.
Sources
So far we’ve delivered a lot of content to a range of organisations, from pharmaceutical companies and a Royal College to the NHS. The content originated as:
·      Documents
·      PowerPoints
·      Podcasts
·      Videos
Easy input
With a modest amount of preparation, one takes the text file (or automatically created transcripts from podcasts and video) and cuts and pastes it into WildFire, which identifies what it thinks are the main learning points. Taking our lead from recent research in cognitive science, well summarised by the researchers behind Make It Stick, we focus not on multiple-choice questions (see weaknesses here) but on open input, even voice, if desired. Open input is superior to MCQs as it results in better retention and recall.
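As a rough illustration of the idea (a toy sketch, not WildFire’s actual engine, and the sentence below is invented for illustration), identifying learning points and turning them into open input can be pictured as blanking out key terms from the signed-off text and asking the learner to type them back in:

import re

def make_open_input_items(text, key_terms):
    """Toy version of turning source text into open-input questions:
    each key term found in the text is blanked out and becomes the expected answer."""
    items = []
    for term in key_terms:
        pattern = re.compile(re.escape(term), re.IGNORECASE)
        if pattern.search(text):
            items.append({"prompt": pattern.sub("_____", text, count=1), "answer": term})
    return items

sentence = "Always record the batch number before administering the vaccine."
for item in make_open_input_items(sentence, ["batch number", "vaccine"]):
    print(item["prompt"], "->", item["answer"])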
Frictionless
Note that healthcare documents are often highly regulated, and the fact that we take the original document means we are not breaking that covenant. It also means almost no friction between designers and subject matter experts. The content has already been signed-off – we use that content in an unadulterated form.
Effortful learning
The learner has to literally type in the correct answers, identified by our AI engine. But we do much more. We also get the AI to identify links out to supplementary content. This is done automatically. This works well in healthcare, as the vocabulary, definitions and concepts can be daunting.
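A minimal sketch of marking typed answers (the matching rule and threshold here are assumptions for illustration; the post doesn’t describe how WildFire scores input) would accept small typos but reject wrong terms:

from difflib import SequenceMatcher

def mark_open_input(typed, expected, threshold=0.85):
    """Accept a typed answer if it is close enough to the expected term.
    The 0.85 threshold is an illustrative choice; real systems tune this carefully."""
    typed, expected = typed.strip().lower(), expected.strip().lower()
    return SequenceMatcher(None, typed, expected).ratio() >= threshold

print(mark_open_input("anticoagulant", "anticoagulant"))   # True - exact match
print(mark_open_input("anticoagulent", "anticoagulant"))   # True - minor typo tolerated
print(mark_open_input("antibiotic", "anticoagulant"))      # False - wrong term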
Chunking
We break the content down into small 10-15 minute learning experiences. This is necessary for focus as well as frequency of formative assessment. So a large compliance or clinical guideline document, such as a NICE Guideline, can be broken down meaningfully and accessed, as and when needed.
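One simple way to picture the chunking (a sketch under assumed numbers, not the actual rule used) is to size each module by an estimated study pace:

def chunk_document(text, words_per_minute=100, minutes_per_chunk=12):
    """Split a long document into chunks sized for roughly 10-15 minute sessions.
    100 words per minute is an assumed pace for effortful, question-driven study."""
    words = text.split()
    chunk_size = words_per_minute * minutes_per_chunk
    return [" ".join(words[i:i + chunk_size]) for i in range(0, len(words), chunk_size)]

# A 6,000-word guideline comes out as five 12-minute modules under these assumptions.
print(len(chunk_document("word " * 6000)))   # -> 5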
Competence
At the end of each pass through one of these short modules, your knowledge is assessed as Green (known), Amber (nearly known) or Red (not known). You must repeat the Ambers and Reds until you reach full 100% competence. This matters in healthcare. Getting 70% is fine but the other 30% can kill.
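The traffic-light logic can be pictured like this (the Amber cut-off is an assumption for illustration; the post only specifies that anything short of 100% must be repeated):

def rag_status(score):
    """Classify knowledge of a module as Green, Amber or Red.
    Green means 100%; the 70% Amber cut-off is an illustrative assumption."""
    if score >= 1.0:
        return "Green"
    if score >= 0.7:
        return "Amber"
    return "Red"

def modules_to_repeat(scores):
    # Ambers and Reds must be repeated until everything is Green.
    return [name for name, score in scores.items() if rag_status(score) != "Green"]

print(modules_to_repeat({"dosage": 1.0, "contraindications": 0.8, "escalation": 0.4}))
# -> ['contraindications', 'escalation']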
Curation
We don’t stop there. At the end of each module you can add curated content (again using AI) by searching for material directly related to the module at hand, based on the selected learning points. This guided curation increases relevance. This is the stuff you could know, as opposed to the stuff you should know.
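Guided curation can be as simple as scoping a search around the learning points already extracted from the module (a sketch only; the post doesn’t say which search service WildFire uses):

from urllib.parse import quote_plus

def curation_links(learning_points, source_title):
    """Toy guided curation: build search queries scoped to the module's own
    learning points so the suggested extras stay relevant."""
    return [
        "https://www.google.com/search?q=" + quote_plus(source_title + " " + point)
        for point in learning_points
    ]

for url in curation_links(["anticoagulant reversal", "renal dosing"], "NICE guideline"):
    print(url)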
Types of content
This is about moving from reading to retention. One clinical guideline may be intended for many audiences: clinicians, other healthcare professionals, carers, even patients. Updates can be delivered separately when they are published. In general, WildFire has been used for:
·      Peer-reviewed medical papers
·      Royal College clinical Guidelines
·      NICE Guidelines
·      Podcasts by the clinician in charge of a trial
·      Question & answer session with experts
·      Videos by the clinician in charge of a trial
·      Nurse training videos
·      Patient videos
·      Training PowerPoints
·      Process documents
·      Compliance documents
·      Sales processes
·      Lots more….
Uses
What matters most is not that this learning content is useful but how it is used. We have delivered online learning prior to workshops and seminars, so that expensive F2F training can benefit from everyone being brought up to speed on the basic knowledge and vocabulary. Just as important is the post F2F experience of reinforcement and revision for exams, new jobs and so on. The content is far more successful when you know the context for delivery.
Conclusion

A full trial looking at speed of production, ease of use and learning efficacy has been done and is available on request. So, if you have good assets that are not being used for learning, WildFire offers a way to turn them into effortful online learning with high retention and recall, in minutes not months. To find out more or ask for a demo, see here.