Stages of 21st C grief
Predictably, the professional class, especially learning professionals, go through their five stages of grief, when faced with new technology:
Denial – bewildered, they ridicule it through clickbait examples of how bad it is, then call for it to be blocked and banned
Anger – the sabre-rattling stage, where mostly third-rate academics write open letters warning people of dangers which only ‘they’ perceive
Bargaining - form ethics committees, write frameworks and generally do their report writing thing
Depression – oh dear we do have to actually learn to work with this stuff
Acceptance – alright, I give in
We’re still in the ‘denial’ stage, taking the usual refuge in the rhetoric of vague 21st C skills. But the mistake is to:
1) think these skills are new
2) think we have a clear definition
3) think of them as skills in themselves, as they mostly need domain knowledge
4) assume that educators have these skills, even if they did exist in a clean form
5) assume that they know how to teach these skills
6) assume that they are wholly unique to us humans
The whole thing goes back to the late 1990s, when we were faced with a new millennium, but a key document was “Partnership for 21st Century Skills”, published in 2002 by a coalition of education organizations in the US. This report put great emphasis on the ‘C’ words and saw them as essential to prepare students for the challenges and opportunities of the 21st century.
Then the bandwagon, report-writing brigade – organizations such as the Organisation for Economic Co-operation and Development (OECD) and the World Economic Forum (WEF) – began pumping out rhetoric highlighting the significance of 21st-century skills in their discussions on education and workforce readiness.
Mirjam Neelen & Paul A. Kirschner rightly pointed out, in their excellent blog ‘21st Century Skills don’t exist. So why do we need them?’ (2016), that there is nothing new about these skills, nothing 21st century about them. This was all laid out in the first major counterattack, Rotherham & Willingham’s 2010 piece “21st-Century skills. Not new, but a worthy challenge.” I have been making the same point for the last 20 years.
Problem of definition
As they pointed out, the first problem, as expected, is definition. They often appear as a list of five or so things on a PowerPoint slide, most starting with the letter ‘C’: communication, collaboration, critical thinking, creativity… then it tails off into the vagueness of problem solving and innovation. There are literally dozens of lists and variations.
Skills are notoriously difficult to pin down as they are not, as Bloom suggested, represented in some simplistic pyramid. They are complex sets of personality traits, motivations, learned knowledge and behaviour – it is the sum of lots of integrated parts that results in an ability, competence or performance. The fact that skills are complicated, mostly involving domain knowledge, makes these very vague 21st C skills difficult to extract as separate ‘skills’. It is doubtful whether they can be abstracted from domain knowledge at all. Knowledge is marbled like fat into the meat of skills.
This 21st C agenda can also be quite dangerous. It can harm learners, especially the disadvantaged, by giving them the illusion of progress. Knowledge and skills are not separable like oil and water; they are deeply entwined, so most critical thinking and problem solving requires a depth of knowledge. The 21st C skills agenda led to a hideous focus on ‘Leadership’ in L&D. We assumed that skills were intellectual in some way, so introduced elitist, hierarchical training at the expense of real skills and competences for all. We became exclusive, not inclusive. That did not go well, as productivity did not increase, and although we have lots of rhetoric and training on Leadership, we seem to have precious little of it.
AI hits the fan
We’re 23% into the 21st C and this is the last bastion of human exceptionalism (see last blog). The 21st C skills mantra has now turned into an empty trope in the Age of AI, especially among educators and learning professionals who needed a serious-sounding phrase to rattle around in conferences and reports. It is usually to be found in some overlong, text-heavy PowerPoint presentation at a conference, accompanied by a cocktail of ‘C’ words – communication, collaboration, creativity, critical thinking. Can the real world really be that alliterative?
Young people communicate every few minutes – it’s an obsession. They text, message, chat, post and comment on WhatsApp, Instagram, Facebook, TikTok and tools you may never have heard of, in various media, including images and video. Note the absence of email and Twitter, the only places you’re likely to hear of 21st C skills. This generation grew up in the 21st C. Never have so many communicated so often with so many. We readily forget that within a few generations we have moved from letters to the telegraph, then the telephone, and now free communications, in a bewildering array of types and formats, to almost anyone on the planet. Technology not only increases the possibilities in communications but also, through use, the skills.
In fact, one of the features of Generative AI, such as ChatGPT, is that it shows us how poor our communication skills can be. It invariably improves almost anything we want to write and send.
There’s an abundance of collaboration online, where we share, tag, upload and download experiences, comments, photographs, video, media and now generative AI tools. We collaborate closely in teams, often international, when playing computer games and in the workplace. Never have we shared so much, so often, in so many different ways.
Then along comes someone who wants to teach this collaboration as a 21st C skill, usually in a classroom, where much of this technology is banned, ignoring how collaboration works in the real world. I’m hugely amused at this conceit, that we adults, especially in education, think we always have these skills. There is no area of human endeavour that is less collaborative than education. Teaching and lecturing are largely lone-wolf activities in classrooms. Schools, colleges and universities share little. Educational professionals are deeply suspicious of anything produced outside of their classroom or their institution. The culture of NIH (Not Invented Here) is endemic. Many educators, far from being consistently collaborative, are doggedly fixated on delivery by the individual.
With AI, especially generative AI, we have collaboration at a global, human level: a huge, sophisticated and elegant collective model, trained on the sum of human knowledge. Generative AI is collaboration writ large; the whole of our collaborative cultural capital is being put to good use and made available to the individual.
Critical thinking is the Little Bighorn of human exceptionalism, General Custer’s last stand. Academics, in particular, are fond of these two words, as they see themselves as having these skills in abundance. To be fair, soft skills and empathy are not their strong points, so they need something to fall back on.
One species of critical thinking is often well taught in good schools and universities, but only at the level of research, text and essays. As a practical skill it has all but disappeared in theory-dominated courses. It needs high-quality teaching, and the whole curriculum and system of assessment needs to adjust to this need. There is evidence that in our universities this is not happening. Arum and Roksa (2011), in a study that tracked a nationally representative sample of more than 2,000 students who entered 24 four-year colleges, showed that universities were failing badly on the three skills they studied: critical thinking, complex reasoning and communications. This research, along with similar evidence, is laid out in their book Academically Adrift. Even research is now under threat, as papers appear where the hypothesis, the data work, even the writing is being done by AI.
The idea that critical thinking is a uniquely human activity is being challenged by tools that do it better, whether in mathematics or other fields. It is now clear that the inference capabilities of AI will outclass humans and that we are no longer the sole producers of critical analysis. We saw how AI became highly competent at all human games. AI has already defined the 3D structure of the 200 million known proteins, a task that, according to Hassabis, the CEO of DeepMind, would have taken us a billion years using traditional human methods. Its data analysis abilities have become superb and the growth of the inference side of AI is accelerating fast. We can expect current models to become good at this very quickly.
The modern understanding of the artist as a unique individual with a distinct creative vision began to emerge during the Renaissance in Europe, and was strengthened when the Romantic conception of the artist arose in the late 18th and early 19th centuries as a reaction to rationalism and industrialization. Romanticism placed a strong emphasis on individualism, emotion, imagination, and the awe-inspiring power of nature. Within this context, the creative artist was seen as a visionary figure, capable of tapping into the depths of human experience and expressing profound emotions and ideas. Technology, of course, has always been part of artistic production and expression, from the tools of painting to photography, synthesizers and other generative tools. The idea that art is some sort of divine spark has long gone. Suddenly, generative AI was winning photography competitions, producing music and being generally ‘creative’ – another word that is difficult to define.
Can they be independently taught?
Of course, those who are most vociferous on the subject of 21st C skills are often those who tend to ‘write’ most about them but are the least skilled in them. Education, whether through lectures in universities or one teacher to 30-plus kids in a classroom, is not the way to teach any of these skills, even if they did exist. Of course, education continued to scrap vocational learning, starve it of funding and push for ever more abstract schooling and higher education. So-called 21st C skills were fine for others, but not for them.
Isn’t all this talk of 21st C skills just a rather old, top-down, command and control idea – that we know what’s best for others? Isn’t it just the old master-pupil model dressed up in new clothes? Don’t the pupils know a tad more about digital skills than the masters?
It is an illusion that these skills were ever, or even can be, taught at school. Teachers have enough on their plate without being given this burden. I’ve seen no evidence that teachers have the disposition, or training, to teach these skills. In fact, in universities, I’d argue that smart, highly analytic, research-driven academics often have, in my experience, low skills in these areas. Formal educational environments are not the answer. Pushing rounded, sophisticated, informal skills into a square, subject-defined environment is not the answer. It is our schools and universities, not young people, who need to be dragged into the 21st century. The change will come through mass adoption and practice, not formal education.
There’s a brazen conceit here, that educators know, with certainty, that these are the chosen skills for the next 100 years. Are we simply fetishising the skills of the current management class? Was there a sudden break between these skills in the last century compared to this century? No. What’s changed is the need to understand the wider range of future possibilities and stop relying on human exceptionalism.
Neelen, M. & Kirschner, P.A. (2016). ‘21st century skills don’t exist. So why do we need them?’ https://3starlearningexperiences.wordpress.com/2016/11/01/21st-century-skills-dont-exist-so-why-do-we-need-them/
Rotherham, A.J. & Willingham, D.T. (2010). ‘“21st-Century” skills. Not new, but a worthy challenge.’ American Educator, 17-20. https://www.aft.org/sites/default/files/periodicals/RotherhamWillingham.pdf
Arum, R. & Roksa, J. (2011). Academically Adrift: Limited Learning on College Campuses. University of Chicago Press.