
Monday, December 29, 2014

My tech prediction for 2015 - two small letters…

I shall eschew the usual New Year prognostications and Horizon scans, alternatively known as statements of the bleedin’ obvious (mobile, cloud etc.) mixed in with the idiosyncratic interests of their authors (talent management, performance etc.) and bullshit (21st C skills, mindfulness etc.). Instead I proffer just one prediction, in just two letters – AI.
The term ‘Artificial Intelligence’ was coined in the year I was born, 1956, and after a series of over-optimistic claims, false starts and dead ends, we now see AI explode and embed itself in thousands of contexts, especially online. You are unlikely to do anything online that does not utilise AI.
My son, who is doing a degree in AI, says, “I can’t think of one thing you can’t apply AI to. Anything a human can do, AI will be able to do, and better”. When I asked him if teaching could be replaced by AI, he laughed. “That one will be easy,” he said, “…in fact it’s already here”. He went on to explain that ‘search’ is the primary pedagogic technique for anyone wanting to know almost anything, from anywhere, at any time. “That’s AI.”
With 2,500 years of theorising it now has some serious wind in its sails; maths, computer power, neuroscience, economics, cognitive psychology, linguistics and control theory have come together to produce a phenomenal cross-disciplinary effort. Add to that the internet, with huge numbers of users and gargantuan amounts of data, and it is an unstoppable force. Forget that backstop of PowerPoint futurists, the Gartner curve; this thing will just grow and grow.
Brief history of AI
Much of early AI history comes from my own degree subject, philosophy, starting with the colossal Aristotle, who kicked off logic with his syllogistic work. Hobbes, Pascal, Leibniz and Descartes all contributed to defining the mind as a reasoning machine. This was supplemented by a heavy dose of philosophical empiricism from Bacon, Locke and Hume. We even have the application of a calculus to ethics in Bentham and Mill.
In parallel, and often overlapping, significant leaps were made in mathematics. Euclid’s algorithm for calculating the greatest common divisor opened up the algorithmic approach to mathematics, with al-Khwarizmi (from whose name we get the word ‘algorithm’) introducing algebra to Europe. This eventually led to conceptual leaps in logic and computational theory by Boole, Frege and Gödel. Add to this advances in probability from Cardano, Pascal, Fermat, Laplace, Bernoulli and Bayes. This potent mixture of algorithmic reasoning, computational theory and probability comes together in the modern era, with a focus on the power of deep, universal algorithms and machine learning.
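To make the point concrete: the procedure Euclid described over two millennia ago still runs, essentially unchanged, on any modern machine. Here is a minimal Python sketch of it (my own illustration, not part of any system mentioned here):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace the pair (a, b)
    with (b, a mod b) until the remainder is zero; the last
    non-zero value is the greatest common divisor."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # prints 21
```

An algorithm written around 300 BC, expressible in four lines today — a neat emblem of how long the algorithmic tradition behind AI really is.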
Birthplace of modern AI (1956)
The official birthplace of modern AI was my other alma mater, Dartmouth College, where, in the year of my birth, 1956, McCarthy organised a two-month study of AI. This was where I got my first introduction to the power of programming and computers, which led to a lifetime in the use of technology to enhance human learning. 1956 was a turning point in other ways: Khrushchev’s speech denouncing Stalin was a turning point for global communism, the Suez crisis marked the demise of the British Empire, the first Islamist terror attack came at the Milk Bar in Algiers, Castro landed in Cuba and Elvis broke through with ‘Heartbreak Hotel’.
AI spring
After the AI winter came the AI spring, when the field came together in the late 1980s under a more pragmatic approach that adopted the scientific method, where hypotheses were subjected to real experimental trials and analysed statistically for significant results.
Web uplift
But what really took the field to a higher plane was the enormous data thermal that was the web. It had billions of users, massive amounts of data and the acceleration of computer power, storage and capability. The trillions of words available in English, then other languages, led to rapid advances in search, translation and an explosion of algorithmic services used by billions of people worldwide.
Artificial General Intelligence
The field has moved on from being a set of isolated areas of activity and silos of research to a more pragmatic, scientific and problem solving world. Artificial General Intelligence, the search for universal learning algorithms that will work across different domains and in different environments is now a real movement.
Age of the Algorithm
I’ve read a stack of books on this subject over 2014, from academic textbooks to more populist works, and each one just confirmed and reinforced my view that we are now in the ‘Age of the Algorithm’. This is a formidable change in human history, where technology moves from being a physical aid, even a cognitive extension, to enhancing and significantly improving human performance. We are on the verge of something very big and very powerful, where machines trump man. The consequences will be enormous and, as usual, there will be forces for good and bad.
Conclusion
Importantly, we are now in the position of having a truly cross-disciplinary field, fed by many disciplines, where the power of hungry algorithms is fuelled and tested by gargantuan amounts of data. The technology is now small, fast and cheap enough to have taken AI into the consumer world, where it is used by us all. Few of the services we use would work without this huge, invisible, beneath-the-surface machine intelligence guiding, shaping and often defining their delivery.

So, rather than skate across surface phenomena (MOOCs, cloud, mobile etc.), AI is my pick, not just for 2015 but for the next 50 years, as the ‘Age of the Algorithm’ is a deep, broad, paradigmatic shift that has already delivered powerful pedagogic shifts (search, recommendation and adaptive learning). If I were 20 again, this is where I’d be heading.

Tuesday, February 05, 2019

2019 predictions in L&D... some surprising disappearances...

Great survey from my friend and namesake Donald Taylor. We are sometimes confused (in both senses of the word), but when it comes to what’s hot in workplace L&D in 2019, he’s the go-to man. This is the 6th year of his survey, with nearly 2,000 professionals casting 5,332 votes from 92 countries.

Top three stars

  Personalisation/adaptive learning (1)
  Artificial Intelligence (2)
  Learning Analytics (3)
One could argue that all three of these top spots have been taken by AI. Sure, there are aspects of personalised learning and analytics that are not AI, but it’s there, underlying all three top spots. I have spent the last four years saying that Artificial Intelligence is the major shift in learning technologies, starting with a 2014 post, ‘My tech prediction for 2015 - two small letters… AI’. AI is changing the very nature of work, so it is ridiculous to imagine that it will not also change why, what and how we learn. Having started this journey in AI many years ago, four years ago I invested in an adaptive learning company, started my own AI company, WildFire, and began talking about this at conferences all over the world. To ignore this is to ignore reality, and arguably the most important technology shift we've seen since the invention of print.

Three newbies
  Microlearning (5)
  Learning Experience Platforms (6)
  Performance support (11)
I’ve grouped these together as they show an interesting shift in thinking towards the more dynamic delivery of learning. I'd link them to the top three, as chatbots and other forms of smart AI delivery are helping get them to learners in the workflow. My fear is that we'll get a fair bit of puff, as people replace the M with an X and deliver the same old stuff.

Three media

  Virtual/augmented reality (7)  
  Mobile delivery (8)
  Video (13)
Characterised by being hardware- and media-defined, these three are here to stay. VR/AR is gaining ground, as I thought it would, and we have, at last, a way to deliver learning by doing. Mobile is, of course, everywhere, and video is coming of age, as we’re seeing it better integrated into learning.

Three business topics

  Consulting more deeply with the business (9)
  Showing value (10)
  Developing the L&D function (15)
Although all three dropped 5, 4 and 3 places respectively, they’re still in respectable positions, and it's good to see the profession trying to keep business relevance and professionalism on the table. I'd like to see more attention to research and evidence, but we're getting there.

Three bags full

  Collaborative learning (4)
  Neuroscience/cognitive science (12)
  Curation (14)
Collaborative learning is pretty solid, and it’s good to see that the science of learning is still in there. I still find it shocking that many practitioners have no idea what science says about learning and online learning. Lastly, curation – a bit of an oddball, this one, but it’s here.

Three goners

  Gamification
  MOOCs
  Badges
Gamification seems to have shot its bolt and disappeared. I think we got fed up with the weak side of gamification, playing Pavlov with learners, so it seems to have run its course. MOOCs have drifted away, more education than L&D – the numbers taking vocational MOOCs are phenomenal, but this is not the world of L&D, it is the world of learners (oh, the irony). Badges have also gone. That’s a shame, but I too changed my mind on these and they seem to have had their day.
Conclusion
Once again, a great insight into how people are thinking. Over the years this has been a pretty good guide to what’s rising, staying around and falling. Well done to Donald Taylor and his team.