I shall eschew the usual New Year prognostications and horizon scans, otherwise known as statements of the bleedin’ obvious (mobile, cloud etc.) mixed with the idiosyncratic interests of their authors (talent management, performance etc.) and bullshit (21st century skills, mindfulness etc.). Instead I proffer just one prediction, in just two letters – AI.
The term ‘Artificial Intelligence’ was coined in the year I was born, 1956, and after a series of over-optimistic claims, false starts and dead ends, we now see AI exploding and embedding itself in thousands of contexts, especially online. You are unlikely to do anything online that does not utilise AI.
My son, who is doing a degree in AI, says, “I can’t think of one thing you can’t apply AI to. Anything a human can do, AI will be able to do, and do better.” When I asked him if teaching could be replaced by AI, he laughed. “That one will be easy,” he said, “…in fact it’s already here.” He went on to explain that ‘search’ is the primary pedagogic technique for anyone wanting to know almost anything, from anywhere, at any time. “That’s AI.”
With 2,500 years of theorising behind it, AI now has some serious wind in its sails: maths, computer power, neuroscience, economics, cognitive psychology, linguistics and control theory have come together to produce a phenomenal cross-disciplinary effort. Add to that the internet, with its huge numbers of users and gargantuan amounts of data, and it is an unstoppable force. Forget that backstop of PowerPoint futurists, the Gartner curve; this thing will just grow and grow.
Brief history of AI
Much of early AI history comes from my own degree subject,
philosophy, starting with the colossal Aristotle, who kicked off logic with his
syllogistic work. Hobbes, Pascal, Leibniz and Descartes all contributed to
defining the mind as a reasoning machine. This was supplemented by a heavy dose
of philosophical empiricism from Bacon, Locke and Hume. We even have the
application of a calculus to ethics in Bentham and Mill.
In parallel, and often overlapping, significant leaps were made in mathematics. Euclid’s algorithm for calculating the greatest common divisor opened up the algorithmic approach to mathematics, with al-Khowarazmi (from whom we get the word ‘algorithm’) introducing algebra to Europe. This eventually led to conceptual leaps in logic and computational theory by Boole, Frege and Gödel. Add to this advances in probability from Cardano, Pascal, Fermat, Laplace, Bernoulli and Bayes. This potent mixture of algorithmic reasoning, computational theory and probability comes together in the modern era, with its focus on deep, universal algorithms and machine learning.
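Euclid’s procedure, over two millennia old, is still the textbook first example of an algorithm. Here is a minimal sketch in Python (the language is my choice for illustration, nothing more):

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a % b);
    when the remainder reaches zero, a is the greatest common divisor."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # prints 21
```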
Birthplace of modern AI (1956)
The official birthplace of modern AI was my other alma mater, Dartmouth College, where, in the year of my birth, 1956, McCarthy organised a two-month study of AI. This was where I got my first introduction to the power of programming and computers, which led to a lifetime in the use of technology to enhance human learning. 1956 was a turning point in other ways: Khrushchev’s Stalin speech shook global communism, the Suez crisis marked the demise of the British Empire, the first Islamic terror attack came at the Milk Bar in Algeria, Castro landed in Cuba and Elvis broke through with ‘Heartbreak Hotel’.
AI spring
After the AI winter came the AI spring, when AI came together in the late 1980s under a more pragmatic approach that adopted the scientific method, with hypotheses subjected to real experimental trials and analysed statistically for significant results.
Web uplift
But what really took the field to a higher plane was the
enormous data thermal that was the web. It had billions of users, massive
amounts of data and the acceleration of computer power, storage and capability.
The trillions of words available in English, then other languages, led to rapid
advances in search, translation and an explosion of algorithmic services used
by billions of people worldwide.
Artificial General Intelligence
The field has moved on from a set of isolated areas of activity and silos of research to a more pragmatic, scientific and problem-solving world. Artificial General Intelligence, the search for universal learning algorithms that will work across different domains and in different environments, is now a real movement.
Age of the Algorithm
I’ve read a stack of books on this subject over 2014, from academic textbooks to more populist works, and each one reinforced my view that we are now in the ‘Age of the Algorithm’. This is a formidable change in human history, where technology moves from being a physical aid, even a cognitive extension, to something that enhances and significantly improves human performance. We are on the verge of something very big and very powerful, where machines trump man. The consequences will be enormous and, as usual, there will be forces for good and bad.
Conclusion
Importantly, we are now in a position of having a truly cross-disciplinary field, fed by many disciplines, where the power of hungry algorithms is fuelled and tested by gargantuan amounts of data. The technology is now small, fast and cheap enough to have taken AI into the consumer world, where it is used by us all. Few of the services we use would work without this huge, invisible, beneath-the-surface machine intelligence guiding, shaping and often defining their delivery.
So, rather than skate across surface phenomena – MOOCs, cloud, mobile etc. – AI is my pick, not just for 2014 but for the next 50 years, as the ‘Age of the Algorithm’ is a deep, broad, paradigmatic shift that has already delivered powerful pedagogic shifts (search, recommendation and adaptive learning). If I were 20 again, this is where I’d be heading.