Sure to make most educationalists and teachers splutter with indignation, robot teachers (or at least robot teaching assistants) may not be far off. I wrote a piece on robot teachers five years ago and have been following progress ever since. Teachers will always be with us, but can robots extend and enhance learning in the home and classroom? The point, in the short term, is not to replace teachers but to take over some teaching tasks. Cheap consumer tech, with brilliant AI and UX software, points towards a future where robots will have the affordances necessary for this role.
You already use robots
Robots already play a major role in your world. At any given
time you are likely to be wearing, watching, listening to, eating, driving,
flying or using something that has been partly made by robots. They are already
your manufacturing servants. Why? In some tasks, they outperform humans in
precision, consistency, endurance, strength and speed. The lesson is clear.
When robots can perform a task more precisely, consistently and with greater
endurance and speed than humans, then, given reasonable manufacturing costs and
social acceptability, they will be used. Robots have now started to emerge in
education and learning.
One-to-one promise
In terms of learning theory that backs up the possibility of robot teaching, let's start with some research. Bloom is famous for his taxonomy but, in my view, his more substantial contribution was his famous paper, The 2 Sigma Problem, where he compared the effectiveness of the conventional lecture, the formative-feedback lecture (mastery learning) and one-to-one tuition (tutorial). Taking the straight lecture as the baseline, he found that the average mastery-learning student performed around one standard deviation above the lecture mean (the 84th percentile), and the average tutored student an astonishing two standard deviations above it (the 98th percentile). Google's Peter Norvig famously said that if you read only one paper to support online learning, this is it. In other words, the gain in efficacy from one-to-one tuition, driven by the increase in on-task learning, is immense. The paper deserves to be read by anyone looking to improve the efficacy of learning, as it shows hugely significant improvements from simply altering the way teachers interact with learners. Online learning, in the widest sense of the word, promises what Bloom called 'one-to-one learning', whether through self-paced structured learning, scenario-based learning, simulations or informal learning. This points towards the future of learning as individual, personalized, one-to-one teaching, wherever possible.
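For the statistically minded, the 'sigma' language is just percentile arithmetic on a normal curve. A quick Python check, as a minimal sketch assuming roughly normal score distributions (using scipy):

    from scipy.stats import norm

    # Convert Bloom's effect sizes (in standard deviations) to percentiles.
    print(norm.cdf(1.0))  # ~0.84: mastery learning, one sigma above the lecture mean
    print(norm.cdf(2.0))  # ~0.98: one-to-one tutoring, two sigma above the lecture mean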
Human-all-too-human
So what about research that backs up the use of robots? Nass & Reeves published a remarkable book, The Media Equation: How People Treat Computers, Television and New Media Like Real People and Places. Across 35 experiments they showed that people confuse media with real life. We have a hard-wired disposition towards suspension of disbelief when interacting with most media, seeing it as having human dimensions. The conclusion we can draw from this, and I did when designing online learning programmes, is that learning experiences can succeed if delivered in a suitably human manner, not necessarily by real humans. The key is that the learning experience must conform to certain social and physical rules. If we can deliver learning as if it were being delivered by a real person, in a realistic way, it works. What Nass & Reeves researched was the role of simple physical and social rules: we respond to politeness and flattery, expect pauses of natural length, dislike harsh judgements, and so on. These are the things we expect in dialogue. If this can be created through robots, AI and good UX, we will have come a long way towards the one-to-one tuition that Bloom shows is the most effective way to teach.
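As a crude illustration of how such social rules might be encoded (the function, names and thresholds here are entirely hypothetical, not from their research), a robot tutor could phrase feedback socially rather than bluntly:

    def polite_feedback(name, score):
        # Hypothetical sketch: personal, polite, never a harsh judgement,
        # in the spirit of Nass & Reeves' social rules.
        if score >= 0.8:
            return f"Well done, {name}! That was excellent work."
        if score >= 0.5:
            return f"Good effort, {name}. Shall we look at a couple of points together?"
        return f"Thanks for sticking with it, {name}. This one is tricky; let's go through it step by step."

    print(polite_feedback("Ada", 0.6))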
Design
There is one more dimension to robot
learning that matters – design. Donald Norman laid the ground rules for
successful technology, the touchstone being its ‘invisibility’. This is true ergonomically but also true of interfaces, which need to be
cognitively ergonomic. Norman saw our emotional responses to design in terms
of:
1. Visceral (appearance)
2. Behavioural (performance)
3. Reflective (memories and experience)
Interestingly, he thinks Americans value 2 more than 1 and 3, whereas Europeans, at least the cultured classes, value 1 and 3. He claims that different people buy things with different fuel mixtures of the three emotions, and that different companies design to different mixes. Great companies, like Apple, deliver all three. As he explains in Living
with Complexity, it is not that technology delivers too much complexity.
The fact is, we live in a world of complexity, with complex technologies that
do complex things. Live with it – that’s reality. The enemy is not complexity,
it is dreadful design. Complexity needs to be tamed, masked or made invisible
with good design. This is precisely what robots are starting to deliver – great
design that makes the mechanics of learning invisible.
Robots meet AI & UX
Most great technology is a combination of technologies. The iPhone is a cluster of existing technologies where the sum is greater than the parts. Brian Arthur, in The Nature of Technology, argues that this is an essential feature of technology: the coalescence of existing technologies to create something exciting and new. This is what makes the confluence of robots and AI so exciting.
With adaptive learning (AI in learning), where what the robot delivers (using algorithms) is always a response to who you are, what you know and what you do, we get some way towards the one-to-one, personalized and more human-like delivery necessary for robot teaching. Remember also that machine learning, the robot's ability to learn, even on the back of data aggregated from many learners across a network of robots, could drive rapid progress in effectiveness. These adaptive systems are already here. I work with them in real educational institutions with tens of thousands of learners.
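To make 'adaptive' concrete, here is a toy sketch of the core loop (the names, thresholds and update rule are my own simplifications, not any particular product's algorithm): estimate mastery per skill, always serve the weakest unmastered skill, and update the estimate from each answer.

    import random

    def update_mastery(mastery, skill, correct, rate=0.3):
        # Nudge the mastery estimate towards 1 on a correct answer, 0 on an error.
        target = 1.0 if correct else 0.0
        mastery[skill] += rate * (target - mastery[skill])

    def next_skill(mastery, threshold=0.95):
        # Serve the weakest skill that is not yet mastered.
        unmastered = {s: m for s, m in mastery.items() if m < threshold}
        return min(unmastered, key=unmastered.get) if unmastered else None

    mastery = {"addition": 0.5, "subtraction": 0.5, "multiplication": 0.5}
    for _ in range(1000):
        skill = next_skill(mastery)
        if skill is None:
            break                               # everything mastered
        correct = random.random() < 0.7         # stand-in for the learner's answer
        update_mastery(mastery, skill, correct)
    print(mastery)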
Search, speech recognition, gesture
recognition, touch screens, text to speech, automatic translation – all of
these are now available and only getting better. They are being embedded
into robots. Speech recognition is the big leap but some are also taking
advantage of touchscreen interfaces on the chest of the robot. With sensors,
movement, balance, vision and hearing, these robots are getting pretty serious.
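These building blocks are already ordinary developer tools. A minimal listen-and-respond loop in Python, as a sketch (using the SpeechRecognition and pyttsx3 libraries; a microphone and network access are assumed):

    import speech_recognition as sr   # pip install SpeechRecognition
    import pyttsx3                    # pip install pyttsx3

    recognizer = sr.Recognizer()
    tts = pyttsx3.init()

    with sr.Microphone() as source:
        audio = recognizer.listen(source)           # capture one utterance
    try:
        heard = recognizer.recognize_google(audio)  # cloud speech-to-text
    except (sr.UnknownValueError, sr.RequestError):
        heard = ""

    tts.say(f"You said {heard}" if heard else "Sorry, I did not catch that.")
    tts.runAndWait()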
Early experiments
We've had simple robots that allow teachers to teach from a distance in rural schools. The Nexus Academy of Columbia allows teachers to Skype in and control a telepresence robot from their computer, going up to students' desks and checking work.
Robosem is a Korean robot used to teach English. It takes a hybrid approach, using either a real teacher via teleconferencing or autonomous, adaptive lessons that use speech recognition and motion tracking. There are many examples like these, but we are now seeing some serious commercial activity.
NAO
Time to introduce NAO, now used in
education in 70 countries. You can teach NAO and NAO can teach you. This is
nowhere near the dream I’ve outlined above but it’s a start and it works. With
speech recognition, NAO will call you by your first name, teach your child
multiplication tables, wake you up, monitor your home if you’re out. NAO will
be able to recognize members of the family, your friends, judge moods, know
your preferences. This is just one of a slew of early robots that will
eventually find a role in education.
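To give a sense of how programmable these machines are, here is a minimal sketch against Aldebaran's NAOqi Python SDK (the robot's address and the vocabulary are placeholders of mine, not a tested lesson):

    from naoqi import ALProxy

    ROBOT_IP, PORT = "192.168.1.10", 9559           # placeholder address

    tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
    tts.say("What is seven times eight?")

    asr = ALProxy("ALSpeechRecognition", ROBOT_IP, PORT)
    asr.setVocabulary(["fifty six", "fifty four", "I don't know"], False)
    asr.subscribe("TimesTables")                    # start listening

    # A real program would wait for the recognition event; this just reads it.
    memory = ALProxy("ALMemory", ROBOT_IP, PORT)
    word, confidence = memory.getData("WordRecognized")[:2]
    asr.unsubscribe("TimesTables")

    tts.say("Well done!" if word == "fifty six" else "Not quite. Let us try again.")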
Pepper
Now let me introduce Pepper, with its four microphones, two color video cameras, 3D sensor, touch sensors, bump sensors, lasers, sonar and gyros positioned in the head, body, arms and legs, plus voice recognition, a touchscreen on its chest and wifi. This companion learns from its environment and draws on its cloud-based, continuously updated master algorithms. It has emotion recognition, reading facial expressions and speech tones. Price: $1,900.
Applications
The 'cute' factor is the current focus, as these are aimed at young children, but the long-term goal is much bolder. I see the first advances coming in early literacy, numeracy and simple programming, as well as second languages. I also see real applications for children with autism and other differences in learning, where the pressure on parents is immense and the needs often different from traditional classroom learning. Children with autism have been shown to respond positively to synchronised behaviour from a robot, which is used to move the child on to other types of social interaction. Robots have been used with autistic children, employing mimicry to establish trust.
Looking to the future, check out this amazing example of robots teaching other robots. AI researcher Luc Steels has developed robots that learn words, speak to each other and speak back.
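The mechanism behind such experiments is often a 'naming game'. Here is a toy sketch of the idea (my own minimal version, not Steels' actual code): agents invent and negotiate words for objects until a shared vocabulary emerges.

    import random

    OBJECTS = ["ball", "cube"]

    def make_word():
        return "".join(random.choice("aeioubdgkl") for _ in range(4))

    agents = [{} for _ in range(5)]             # each maps object -> set of candidate words

    for _ in range(2000):
        speaker, hearer = random.sample(agents, 2)
        obj = random.choice(OBJECTS)
        words = speaker.setdefault(obj, set())
        if not words:
            words.add(make_word())              # no word yet: speaker invents one
        word = random.choice(sorted(words))
        if word in hearer.get(obj, set()):
            speaker[obj] = {word}               # success: both prune to the winning word
            hearer[obj] = {word}
        else:
            hearer.setdefault(obj, set()).add(word)  # failure: hearer adopts the word

    print({obj: sorted(agents[0].get(obj, set())) for obj in OBJECTS})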
Beyond this we have Google investing madly in AI and robotics, with DeepMind, created by a learning expert (see here for some background on its relevance to learning). Google was the obvious buyer for this type of company, as it thinks in terms of deep, basic problems and looks for generic solutions. Google has the eyeballs, brains and behavioural traits of billions of humans; it has the data that matters. This means it can apply deep learning solutions at scale. But it's not the only game in town: IBM's Watson is set to deliver AI on tap, Microsoft is using AI across its services, and the Chinese search giant Baidu is hiring, buying and experimenting.
Conclusion
As Stanford's Nass & Reeves showed, we need teaching and learning services to at least appear to be friendly, patient, efficient, polite, relevant, personalised and… well, social and human. Parents
spend hundreds of millions on one-to-one tuition for their kids. Much, much
more is spent on teachers and teaching. Robots can give unlimited amounts of
attention and help, in ways no teacher, or even parent, can. Robots
don’t get hangovers, don’t take holidays, never discriminate on grounds of
gender, race or accent. They’re patient, scalable and consistent. Wouldn’t it be wise to
pay attention to something that takes at least some of the pain away?