Saturday, October 29, 2022

Ryan and Rigby - gamification and Self-Determination Theory


Ryan and Rigby

Richard M. Ryan, professor at the Institute for Positive Psychology and Education at the Australian Catholic University and a research professor at the University of Rochester, and Scott Rigby, founder of Immersyve Inc., provide a theory of gaming built upon ‘Self-Determination Theory’, developed by Ryan and Deci – the idea that specific, deep needs explain what we require to live fulfilling lives.

They applied Self-Determination Theory to games, arguing that intrinsic motivation is the key to understanding the power of games and gamification. Gaming has become wildly popular globally because it provides autonomous action, optimised learning and connection with peer groups. Having researched gaming in detail, they provide a compelling account of why games are so popular.

Self-determination theory

Edward Deci and Richard Ryan put forward Self-Determination Theory in their book Self-Determination and Intrinsic Motivation in Human Behavior (1985). It sees the active self, being in control, as the primary driver behind growth and fulfillment. It is essentially a theory of motivation, placing importance on intrinsic, not extrinsic, motivation. It is your own need for growth that drives other personal needs: growing in competence and feeling related or connected to other people.

Self-determination theory has three components:

Autonomy - being in control, able to take action that results in actual change

Competence - learning knowledge and skills to achieve more autonomy

Connection or relatedness - feeling attached to other people

Gaming and SDT

It was this theory that Ryan and Rigby applied to games and gamification. Self-Determination Theory (SDT) claims that autonomy, competence and relatedness are the three routes to true fulfillment and growth, but are they also the reason why games are so popular?

They argue in The motivational pull of video games: A self-determination theory approach (2006) that gaming is fundamentally about meeting basic needs. Optimal challenges, matched to stretch abilities, allow the gamer to progressively acquire competences – mastery through action, experienced virtually. In Glued to Games (2011) they further suggest that superficial narratives matter less than these deeper psychological needs and experiences.

Ryan and Rigby ran four studies confirming that computer games offer autonomy and that this is a key component in the enjoyment of a game. Competence was also studied: competence, in terms of knowledge and skills, manifested in actual performance up through the levels of a game, is a primary feature of gaming. Connection was also a strong feature in multiplayer games – a sense of being part of a team and a wider community, within one game, across all players of that game and in the gaming community in general.

Rigby and Ryan studied actual motivations, consequences and game interventions. They claim that it is not the ‘content’ of games that matters but the feelings and achievements that come through playing. Unlike books, TV programmes and movies, it is not the narrative or story but the player’s interactions that matter. Players get their satisfaction from the actions they take, not ‘fun’ as some assume. Hardcore gaming can be far from fun: players will endure intense periods of effort, frustration, even disappointment, but satisfaction comes through the gameplay – the sense of control, achievement and relatedness to others (especially in multiplayer games). After playing, engagement in fan chat, videos and streaming shows that the social side is very important. Players go beyond the game to discuss, create and mod games.

All of this confirms their view that SDT explains the huge popularity and success of computer games, going beyond mere entertainment and differentiating them from other media.

This has even more explanatory power in newer genres of games. Hundreds of millions play Fortnite, Minecraft or Roblox because these games give you the opportunity to create, confirming a strong sense of autonomy. You then succeed in killing, surviving, getting somewhere or gaining something, confirming your learned competences, to yourself but also in the eyes of others. The trajectory of a game lies in deep game design, keeping you going with achievable challenges that satisfy these primal needs.

Gaming, learning and SDT

Note that SDT contains an important word – the acquisition of ‘competences’. This is central, as one must feel good about gaining and exercising abilities in a range of different contexts. Ryan and Rigby provide a well-researched and sound basis for the power of games, including the need to learn. Games gain their power, in a sense, by being learning experiences: becoming more competent is, in effect, learning.

So, to learn best one must feel in control and set one’s own objectives, whether playing a game or learning, sometimes both. How do you help people learn? You situate them in a context in which they will be autonomously motivated to learn and become more competent by overcoming difficulties and learning from failure, then connecting with their peer group: within a multiplayer game, among players of that game, or among gamers in general.


Although this provides a general theory of learning, as well as an explanation of why gaming and gamification may be useful in the delivery of learning, one must be careful not to assume it means games are always good for learning. Their focus on intrinsic motivation, through Self-Determination Theory, looks for deeper aspects of gamer motivation and game design, not superficial or simplistic fun.

Games may distract from actual learning by providing opportunities to learn how to play the game, rather than pick up required knowledge and skills. The rules and execution of the game mechanics may take up valuable cognitive load, thereby inhibiting useful learning.


SDT has influenced learning theory and those who see gaming as a useful way to deliver learning. Narula sees it as important in framing a vision of the Metaverse, providing a pull towards meaningful activity in virtual worlds that satisfies the deep, identified needs of autonomy, competence and relatedness.


Ryan, R.M., Rigby, C.S. and Przybylski, A., 2006. The motivational pull of video games: A self-determination theory approach. Motivation and Emotion, 30(4), pp.344-360.

Przybylski, A.K., Rigby, C.S. and Ryan, R.M., 2010. A motivational model of video game engagement. Review of General Psychology, 14(2), pp.154-166.

Rigby, S. and Ryan, R.M., 2011. Glued to Games: How video games draw us in and hold us spellbound. ABC-CLIO.

Narula, H., 2022. Virtual Society.

Friday, August 12, 2022

Reality+.... minds will be blown

Reality+ is a unique book from a unique thinker. David Chalmers made his name in philosophy, with Andy Clark, on the idea that the mind or consciousness extends further than we think – the ‘extended mind’ thesis. I included them both in my series of 200+ Learning Theorists.

Here, Chalmers tilts his lance at what most regard as fake knights, but is quixotic enough to see both windmills and knights as ‘real’. A modern Descartes, he pushes us towards the view that every virtual world is a new reality – hence the title Reality+.

More than this, he thinks that such virtual worlds can be as good as the world we think we know. He takes his considerable conceptual sword to that other pub philosophy topic, whether what we are living in is a simulation. The results may surprise you.

This is not a casual read, philosophy never is, nor is it a short read at 462 pages. Nevertheless, it is a key text for anyone interested in the big ideas behind VR, AR, AI and the Metaverse. Another word of warning: a lot of it is counterintuitive, philosophy usually is, but that is what makes it so exciting, so stick with it! The good news is that he is a good writer, far better (and a better philosopher) than Bostrom. He also lightens the saddle by bringing in science fiction, movies and games.

In presenting the idea that virtual minds are genuine minds, he takes us on a philosophical journey from Socrates, Plato and Aristotle, through Descartes, Berkeley, Hume, Kant, Bentham, Mill and Frege, to the more modern philosophy of Moore, Rawls, Nozick, Putnam, Dennett and Baudrillard, cutting deeply into problems such as ‘What is real?’ He doesn’t shy away from ethics – not the superficial twaddle one sees most of the time but actual moral philosophy.

This is a book that matters, as the time is right for a deeper philosophical text around these issues. It is not a book for the beach, it is a book that will make you think deeply about technology and what is about to become a reality.

Thursday, August 11, 2022

Coffee and learning

What is the most widely consumed psychoactive drug in the world? Coffee. 

Hardly surprising, but what effect does it have on your performance and memory, or in preventing dementia and Alzheimer’s? Beyond this, does it have any other benefits? What was, and is, its role in learning?


Coffee and memory

We are still not entirely clear about how it works on the brain, but a paper this month suggests that regular coffee drinking improves the signal-to-noise ratio during information encoding; in other words, it improves memory and therefore retention.


In fact, there is now plenty of evidence that coffee improves short-term memory and reaction times by acting on the prefrontal cortex. Researchers from the University of Innsbruck in Austria scanned and tested a group of 15 volunteers given 100 mg of caffeine, and found distinct improvements in memory in the caffeine-fuelled group: "those who received caffeine had significantly greater activation in parts of the prefrontal lobe… These areas are involved in 'executive memory', attention, concentration, planning and monitoring."


A study from the University of Arizona, published in Psychological Science, showed that of 40 participants given caffeinated coffee (250 mg of caffeine) or decaffeinated coffee, the caffeinated group showed no decline in memory across the day, in contrast to the decaffeinated group, who showed significant decline.


In another study, French researchers compared women aged 65 and older who drank more than three cups of coffee per day with those who drank one cup or less. Those who drank more caffeine showed less decline in memory tests over a four-year period. The study, published in the journal Neurology, raises the possibility that caffeine may also protect against the development of dementia.


Indeed, a trial from the University of Florida showed that coffee, more accurately caffeine, both prevents and reverses symptoms of Alzheimer’s in mice. Sure, mice trials don’t always transfer to humans, but these mice had the relevant human genes transferred. It suggests that caffeine both blocks and attacks the plaque that causes Alzheimer’s and memory loss. The researchers took 55 mice and gave one half doses of caffeine, similar to around five cups of coffee a day for humans, and the other half water. What was astonishing is that after two months the dementia mice had recovered their memories and performed the same as the mice who showed no signs of dementia. What’s more, these mice had a 50% reduction in the beta-amyloid protein, which forms the plaque that causes dementia.


Coffee shops and learning

Since that legendary Ethiopian goatherd first observed his goats frolicking after eating coffee beans, coffee grew in popularity in the Middle East. It allowed Islamic Sufis to get through long nights of prayer, and Islamic students found they could keep awake to recite and learn the Koran. The Yemeni port of ‘Mocha’ eventually opened trade in the 16th century, through Constantinople, into Europe. As a habit of hospitality, it encouraged social meeting and conversation, especially in an Ottoman Empire that did not drink alcohol.


Trade across the British Empire into the Levant led to the first London coffee shop in 1652, right by the Royal Exchange. It was an immediate success and within ten years there were 83 coffee houses in London, fuelled by Puritan attitudes in which alcohol was frowned upon. By the late 17th century coffee shops charged a penny a cup and were called ‘penny universities’, as they were such powerful places of cross-disciplinary debate. Drinkers sat around one long table, next to and opposite other people, which encouraged conversation. Pamphlets and newspapers were also available. By 1739, London had 551 coffee shops, many of them hives of intellectual and business activity.


They became hothouses of political debate, which is why Charles II wanted to shut them down. Edward Lloyd’s coffee shop became Lloyd’s of London. Jonathan’s Coffee House, which listed stock prices from 1698, eventually became the London Stock Exchange. Similarly, in New York, a coffee house became the New York Stock Exchange. Coffee houses were also seen as sharpening the wit. The Tatler (1709) and The Spectator (1711) linked the periodical with the coffee house, drawing content from that source. So we have coffee stimulating thought and cultural output. Pepys went to coffee houses three or four times a day. Voltaire drank 40 cups a day!


Urban corporate coffee chains, and now more artisanal small coffee shops, constitute a global revival. Starbucks and its imitators picked up on the digital revolution, popular with mobile laptop workers and students, offering free wifi. They have become focal points for meetings and working. Many are full of people deep in thought, writing, coding, emailing and doing their jobs, stimulated by coffee and the general social environment of a warm and inviting place. WiFi has given coffee shops a real new lease of life.


Quick and easy to make, with psychoactive qualities, coffee was also incorporated into the workplace, as a break from work but also a method of keeping awake and getting through the working day. At conferences we stop for coffee breaks, a social ritual which encourages social networking.


A coffee in the morning as a wake-up experience, then coffee after meals to combat post-prandial lethargy, are now established social rituals. The espresso, an Italian habit, came with the machine compression of water through coffee, making it quick. The in-out coffee culture developed because there was a tax on table service, hence service at the bar, standing up.



Coffee has long fuelled learning, whether it be through the direct stimulation of the brain, increasing attention, improving memory, preventing dementia or providing a social context for debate and work. Coffee has more recently re-colonised the world, through a global coffee shop culture, in the workplace and at home. Fascinating drink, fascinating culture.

Friday, July 29, 2022

Futurelearn may not be a going concern

Futurelearn’s accounts read like the script of a disaster movie. Nearly 10 years ago I wrote a piece questioning their choice of CEO (he came from BBC Radio, knew nothing about this business and was saying some silly things)…

“In three years’ time we hope to be offering a level of online learning that we can’t dream about at the moment,” says Simon. “It may sound ridiculous in ambition, but one of my team said to me that in five or 10 years, rather than hanging out on Facebook of an evening, people will feel they can hang around in the Futurelearn product.”


Its lack of business expertise led it to hire too many BBC types at inflated salaries. This led to shallow technology, a failure to read actual demand (the drift towards vocational learning) and an obsession with ‘social’ learning. It took them a long time to hire a new CEO after the company nearly collapsed and, to be fair, although he knows diddly squat about this industry, and is still bleating on about ‘social’ learning, he does have a solid business background. He’s clearly been brought in to turn it around.


But here’s the rub. It is a financial disaster. After ten years, virtually alone in the UK market, it has a whopping £16.1 million loss on £11.3 million revenues. With £15.7 million going on salaries, it’s drowning, nose and lips barely above water. After an investment to save it from disaster as recently as 2019, it needs £15 million to keep going. The auditors had to fire this cannonball across its bows…


“a material uncertainty which may cast significant doubt about the group’s ability to continue as a going concern”.


All of this after the gift that was Covid. I'd be concerned if I were a partner. There are other fish in this ocean who have managed to explode in revenues and pivot when necessary. Futurelearn was never the 'future' and failed to 'learn'. They had an open goal back in 2012 but were hidebound, stuck in some British BBC time warp, with no real business acumen and no vision. That’s a shame.

Sunday, July 17, 2022

How Emotions are Made by Lisa Feldman Barrett

This book came highly recommended, and I have read Damasio, Panksepp and several other academics who have looked at emotion in relation to learning, so I was looking forward to the read. That hope was misplaced.

It was puzzlingly obtuse. She straight away attacks the idea of emotions as some sort of ‘essence’ produced in specific areas of the brain, unconnected to other cognitive processes, claiming instead that they are ‘constructed’. Always a dangerously vague word, ‘constructed’. Damasio, Panksepp and others have spent a long time giving us a taxonomy of such emotions, along with physiological evidence for their existence. She claims that there is no evidence for specific areas of the brain being tightly defined as the seat of particular emotions, but again this is a straw man. There is evidence from human lesions that emotions have identifiable brain structures, her averaging-out across meta-studies is odd, and we know that many cognitive processes are widely distributed across the brain. That specific areas are difficult to find is not the ‘nail in the coffin’ argument she claims. Panksepp accuses her of relying on loose correlation, while the field has solid causal evidence for emotions being identifiable as innate, physiological phenomena.

Language and emotions

Then up pops the weird idea that emotions need language. This is rather wild, reviving the Sapir-Whorf idea that language affects cognition and that our perceptions and feelings are all relative to our spoken language. You know the story – Eskimos and snow. Pinker demolished this idea in The Language Instinct, and think about it: do you need the word ‘jealousy’ to feel jealous, ‘anger’ to feel angry? Of course not. You can see how this linguistic necessity opens up a space for a theory that puts emotions in the cultural, not physiological, domain, but that would be a caricature of human nature, a return to the sort of blank-slate ideas that Pinker and others have been fighting for years. The idea that emotions are partly cues from language and not physiological systems is wrong-headed, and common sense tells us so. There’s a reason men are more prone to violence than women, evidenced by the vast gender difference in violent crime and the ratio of men to women in our prison populations. Different hormonal and physiological systems play a huge part as causes.

The book therefore plays to the postmodern idea that language is all, that we don’t have emotions until we label them and can use words for them… but that is a category mistake. We have such words because we communally have the same or similar emotions; we don’t have emotions because we have the words. This is similar to the common confusion that the meaning of words comes from the dictionary. Well no, the meanings of words are only in the dictionary because we all agree that this is what they mean in common usage. It is the other way round. We have common words for emotions because the emotions exist, not because the words for them exist. One can feel jealous when one sees one’s girlfriend responding well to being chatted up by someone else. This is clearly an emotion separate from the use of the adjective ‘jealous’ or the noun ‘jealousy’. We would feel it even if a cognitive impairment made us forget both words. It is to confuse the signified with the signifier.

There is another semantic trick here: hijacking the word ‘affect’, separating it from ‘emotion’, then using it as a bucket to fend off criticism. This is problematic. Such semantic splits were evident in Hume, who wrote a lot on this topic, describing passions, sentiments and taste; Panksepp has primary, secondary and tertiary emotions; there are moods, feelings and attitudes – it’s complex, layered and subtle. You don’t have to be a Paul Ekman fanatic to see that his Basic Emotion Theory needed refinement (which is exactly what happened), but this constructionist theory of emotion swings the pendulum so far in the opposite direction that it disappears from sight. For Barrett this language argument is the essence of her case, and if it falls, it all falls.

Mammals and emotions

Then there’s the curiously Cartesian idea that mammals don’t have emotions. My dog gets jealous when I pat other dogs, and when I last checked he was not a fluent English speaker. There’s a reason we tend to have mammalian pets – they have emotions, something Barrett denies. By now I’m thinking this is a return to the hideous Cartesian idea that animals have no emotions because they have no language; read what Descartes did to animals as a consequence of such ideas – it will make you retch. She relegates animals to the level of simple ‘affect’, producers of pleasure and pain. I wonder if she has a dog… she’d soon change her mind. Emotions clearly have an evolutionary past in higher-order mammals; they didn’t just spring into action some tens of thousands of years ago when we acquired language. They are there in tool production, the marks we made half a million years ago, the sculpted objects. Tellingly, the book doesn’t explain how emotions came to be, how they arose in our brains, a point made by Westland (2021). The answer is clearly evolution, a set of advantageous, selected traits. The idea that emotions and language came into being at the same time has no aetiological, evolutionary explanation, and she makes no effort to give one.

Throughout all this I felt as if Barrett was jumping around, confusing different dimensions of emotion. First and foremost there is their physiological manifestation as something we feel and experience (this is very real); layered on top of this, we use language to talk about them (something separate); then there are our external behaviours; and beyond this, how we read emotions in others (something separate again). Barrett wants to demote the former and claims that the latter are dominant. I don’t buy this.

Emotional expressions

Just as I thought things couldn’t get weirder, she claims that we learn to smile from TV, that it is a learnt behaviour, not a signifier of emotion. Really? Blind people don’t smile? And yes, she claims that smiling is a recent cultural phenomenon. I don’t think I’ve ever read a more depressing scientific idea. Nothing could be further from the truth. Barrett was criticised in Nature for ‘caricaturing’ other researchers’ findings, and in the case of facial expressions she does precisely that. Sure, it is a complex area, being explored in detail using face-recognition software, but the Barrett straw man surfaces again. Emotions are very real, depression is very real, and their expressions, such as laughing and crying at a movie, very real. That emotions and their facial expressions vary does not entail social construction. Just because I can weep when sad and grieving, but also when laughing until I cry, does not mean any of this is socially constructed. The rebuttal by Adolphs (2017) explains this well. To reduce emotions to social agreement, dependent on language, is a form of the essentialism she attacks, merely contemporary essentialism. To be fair, she is not saying emotion doesn’t exist, only redefining what ‘real’ means. Her paper Emotions are real (2012) explains this and is worth reading, as the book How Emotions are Made (2017) lacks some of the core arguments. Emotions are biologically real and socially constructed at the same time. But she takes social construction to such an extreme that it becomes laughable. She distorts Searle and plucks out ‘category knowledge’ to build her theory, based in the unconscious and requiring social agreement, conveniently tucking it out of sight and mind. This is where credulity is stretched to breaking point.


The problem here is that the book promotes a narrative that emotions are merely social and learned, opening up a space for people to come in with glib solutions, like emotional intelligence, as if all of this lies in the social realm.

Intriguing ideas, I admit, but so many babies are thrown out with the bathwater, and the bath, that it becomes quite fantastical. Then there is the superfluous stuff in the book about emotion and the law, and the sort of self-help stuff on sleep and diet.

It is a book of the age and will of course be eagerly consumed by those who want to believe that all is nurture, not nature – part of the lazy drift towards cultural, social and linguistic relativism, the idea that everything is socially and linguistically constructed. This is why it’s loved; it is also precisely why it is largely wrong.


Barrett, L.F., 2017. How Emotions Are Made: The Secret Life of the Brain. Pan Macmillan.

Ekman, P., 1992. An argument for basic emotions. Cognition & Emotion, 6(3-4), pp.169-200.

Panksepp, J., 2007. Neurologizing the psychology of affects: How appraisal-based constructivism and basic emotion theory can coexist. Perspectives on Psychological Science, 2(3), pp.281-296.

Westland, K., 2021.

Adolphs, R., 2017. How should neuroscience study emotions? By distinguishing emotion states, concepts, and experiences. Social Cognitive and Affective Neuroscience, 12(1), pp.24-31.

Tyng, C.M., Amin, H.U., Saad, M.N. and Malik, A.S., 2017. The influences of emotion on learning and memory. Frontiers in Psychology, 8, p.1454.

Wednesday, July 13, 2022

Pfeffer on Leadership BS

Jeffrey Pfeffer is the Thomas D. Dee II Professor of Organizational Behavior at the Stanford University Graduate School of Business. His interest in human resources, organisational theory and behaviour has led him to reflect on the nature of leadership and leadership training. He wrote about evidence-based management in Hard Facts, Dangerous Half-Truths, and Total Nonsense: Profiting from Evidence-Based Management (2006), where he dismisses popular business wisdom on leadership, strategy, change, talent, financial incentives and work-life balance, often touted by consultants and training companies, in favour of hard decision making based on data and facts.


In Leadership BS (2015), Pfeffer eschews what he sees as the usual platitudes of leadership theory and training for a more realistic view of the world as messy and complex. He exposes the nostrums, stories, fictions, anecdotes, promises, glib simplicities, bromides, romanticism and myth-making feel-good nonsense that passes for leadership training, his solution being realism. The aim is to replace normative wishes with evidence and the realities of the workplace.

Unequivocally, he claims that the leadership industry has not only empirically failed, with study after study showing workplace discontent, but also that it contributes to that failure. As the cult of leadership has risen, its perceived effectiveness has fallen. Bullying, stress and discontent are the norm, and he presents a huge amount of evidence to show repeated failures of so-called ‘leadership’. What he uncovers is an almost wilful avoidance of evidence, measurement and data. Despite the $20-$40 billion spend, the results are depressingly disappointing. He goes as far as suggesting that the very construct of leadership, as presented in much leadership consultancy and training, was invented as a simplification to deliberately obfuscate the real complexity of the workplace.

Leadership training

His arguments against ‘leadership training’ are pretty damning. Many who offer leadership consultancy and courses have never led anything, and if they have, their track record is rarely one of substantial success. In fact, he sees too many compensation consultants linked to the leadership industry, many with a woeful lack of actual expertise and knowledge. This leads to glib advice and recommendations that peddle inspiration, not the realities of management. They rely largely on storytelling and anecdote, and rarely include evaluation as part of the process (apart from primitive happy-sheet course data and self-evaluation). The leadership industry is therefore wholly unaccountable.

In the content he finds stories and anecdotes (as opposed to evidence) that are exaggerated, even fabricated. They also conveniently ignore actual successful leaders who don’t fit the neat model. These myths are counter-productive as they produce cynicism in employees; the rhetoric is not matched by actual action and behaviour. Worse, those who don’t conform to the outdated leadership model don’t get promoted and may even get fired. Others, such as women and certain cultural minorities who value modesty and collaboration, can also suffer.

Leadership traits

A further critique centres on prescribed leadership qualities or traits. They are, he thinks, wrong-headed, as they focus on attributes, not action and decision making. Given that the book was published in 2015, he was prescient in identifying Trump as a typical product of the charismatic leader cult: he played the leadership game and won. Pfeffer punctures the idea that ‘modesty’ is an admired and effective leadership trait. He draws on Maccoby’s book The Productive Narcissist (2003), and his own evidence, to show that modesty, far from being a virtue, stops managers from thinking for themselves and being resilient in the face of adversity. It is energy, confidence and dominance that get them where they are, not modesty. The leadership industry may be holding back women and other potential managers by promoting false promises such as modesty. He also accuses HR and talent management companies of dishonesty here, training for these qualities then recruiting the very opposite.

He also questions that staple of leadership courses, authenticity, as a quality for leadership. He flips this to show that good managers need to do what people need them to do, not simply what they as managers want to do, and not pander to their own views of themselves. Flight attendants, shop assistants, sales people and many others don’t operate by being totally ‘authentic’; neither do managers and leaders. He describes the “delicious irony” of leadership trainers who ‘train’ people to be ‘authentic’, as if it were a trait that can be acquired in a classroom. Being authentic is, for Pfeffer, pretty much the opposite of what leaders need to be.

Much as trust would seem desirable in leadership, it may not be that simple. Bernie Madoff inspired ‘trust’. Trust, like faith, can lead one into real trouble. It may be wise not to trust lawyers, competitors and politicking managers. True objectivity and realism may only come from not trusting everyone within an organisation to tell the truth, as you will be misled, even duped. You need to be on the mark, alert to deception and manoeuvres, protecting the organisation, and that means distrusting some people.


Rich in real examples of leaders who were less than ideal, he shows how leadership training misses the mark most of the time – especially with the titans of tech: Steve Jobs, Bill Gates, Jeff Bezos and Larry Ellison. Political, sports and other leaders get a similar treatment. Most of the positive examples turn out to have serious flaws. So, when we look at what are called successful leaders, they turn out to be very different from what the leadership industry tells us. His recommendation is to get serious on the research, mainly what is effective, then hold so-called 'leaders' to account - not with happy-sheet nostrums but real accountability. It is not that he promotes immodesty, inauthenticity and lying, only that he recognises that leaders and employees are people and that human nature always wins out. The remedy is to identify what you need from proposed leaders and then to make sure that they perform to those measures. This is where HR and remuneration committees fail. They pretend to be doing this when what they actually do is pander to an outdated cult of leadership, based on outdated concepts of the nature and value of leadership.


Pfeffer’s challenge is to recognise reality and accept that the workplace and people are much more complex than the feel-good training courses suggest. In reality, leaders’ behaviours are often at odds with those of the organisation. Their interests in terms of rewards, promotion and progress are often at odds with those they manage and even the organisations they lead. There is a lack of definition, theory and practice around the concept and it often distracts from the real needs in workplace learning.

He recommends that you:

·      Build your power base relentlessly (and sometimes shamelessly)

·      Embrace ambiguity

·      When the situation demands change, adapt

·      Master the science of influence

It is not that leadership training is wrong, just that getting things done requires trade-offs and tough decisions. The danger is that organisations handicap themselves by training leaders to embrace utopian behaviours and avoid bold decisions, innovation and the realities of organisational growth.


The fundamental problem outlined in Getting beyond the BS of leadership literature (2016) is to confuse ‘ought’ with ‘is’. Just because you think something ought to be the case doesn’t mean it is. In fact, confirmation bias tends to produce the wrong solutions in this area, driven by moral rather than organisational imperatives. The division of leadership into good and bad traits is a mistake, as it rests on a problematic view of human nature and ignores context. Quoting Machiavelli’s The Prince (1532), he says it is sometimes necessary to do bad things to achieve good results. Leaders need to be pragmatists.


Machiavelli, N., 2008. Machiavelli's The Prince: Bold-Faced Principles on Tactics, Power, and Politics. Sterling Publishing Company, Inc.

Maccoby, M., 2003. The productive narcissist: The promise and peril of visionary leadership. Broadway.

Pfeffer, J. and Sutton, R.I., 2006. Hard facts, dangerous half-truths, and total nonsense: Profiting from evidence-based management. Harvard Business Press.

Pfeffer, J., 2016. Getting beyond the BS of leadership literature. McKinsey Quarterly, 1, pp. 90-95.

Pfeffer, J., 2015. Leadership BS. HarperCollins.

Tuesday, July 12, 2022

Comenius (1592-1670 )

John Amos Comenius was a Czech educator who promoted ‘universal education’ to solve the problem of human and religious conflict. After the Lutheran Reformation, the idea of education for all took root. As a member of the Unity of Brethren, a group of Protestant reformers, he was educated within this system, as well as receiving a Calvinist education in Herborn and Heidelberg. He lived in turbulent times, which saw the Defenestration of Prague in 1618 and the start of the Thirty Years War, and had to keep on the move, gathering an international reputation as he went. He was a bridge between the Reformation and Enlightenment, combining the reforming zeal of the post-print Reformation with the universal values of the Enlightenment. In addition, he was a proponent of the Scientific Revolution, represented by his book The Way of Light (1642), written in England and dedicated, on its eventual publication, to the Royal Society, positing a universal college and network of schools working towards universal knowledge.


His Reformation spirit led him to imagine ‘pansophism’, a universal wisdom, which teaches a unified knowledge through a unified system of education. It is what we would call a universal curriculum, covering a wide range of knowledge, used to understand God’s world. Switching away from the classics, an obsession with grammar and rote learning, to content that was sensitive to the motivation and interests of the learner, he saw print as the medium through which this could be achieved, providing universal access to universal content and learning. His universal and encyclopedic approach to pedagogy encouraged parents and teachers not only to constantly observe and explain the world to children but also to continue to learn themselves, making him an early proponent of lifelong learning.

An important dimension of his pansophism was his desire to see universal access to learning, a truly universal, borderless education for the whole human race; rich, poor, male, female, rural, urban and importantly, the disabled – literally everyone.


The Door of Tongues Unlocked (Janua Linguarum Reserata, 1631) was the first of a series of teaching texts. These textbooks were revolutionary: short encyclopedias for children, an alternative to the traditional learning of Latin through grammar, rote learning and memorisation. It was one of the first ‘textbooks’ to teach language through knowledge of the world and became a bestseller, an international publishing sensation. The idea was to lift education out of divisive religious texts into a more universal orbit by publishing textbooks sensitive to the needs of learners.

His Orbis Pictus (1658) was the first textbook to use pictures to illustrate the content, connecting words to things. First published in German and Latin, it was subsequently published in many languages. He explains its pedagogic approach in the Preface, and its 150 chapters start with the phonetics of language (surprisingly modern) to aid reading, then move through inanimate objects, botany, zoology, religion, humans and human activity. He recognised that pictures mattered, in this case woodcuts, meaningful and illustrative images, to keep the attention of the child. The images contain many objects and concepts, each numbered and related to the written text. It had two or more columns, in the vernacular language(s) and Latin, and its pedagogic force came through the presentation of ideas in a new language, using objects, starting with familiar ones and gradually increasing in complexity, providing real-world knowledge in a way that was motivating for all. It is a truly remarkable and forward-looking textbook.

These textbooks were designed for both teaching, by parents and teachers, as well as independent study. They were highly structured, with related images and text, and have been seen as precursors for later learning technologies in their design, pedagogy and aims.


Comenius had constructed a whole theory of education, published in Didactica Magna (1633-38), along with content that appealed to the way in which people naturally learn. In that sense he was the precursor to Rousseau and Pestalozzi. But his influence was also as a practical teacher, organising schools in several countries, even imagining the structure of modern-day schooling from kindergarten to university. He was rediscovered in the 19th C as an important figure in the history of education and pedagogy; many of his papers were discovered only well after his death, and research into these documents in the 1960s enlarged his reputation.


Comenius, Johann Amos. 1673. The Gate of Languages Unlocked, or, A Seed-Plot of All Arts and Tongues: Containing a Ready Way to Learn the Latine and English Tongue. London: Printed by T.R. and N.T. for the Company of Stationers.

Comenius, Johann Amos. 1967. The Great Didactic of John Amos Comenius: Now for the First Time, tr. and ed. Maurice W. Keatinge. New York: Russell and Russell.

Comenius, Johann Amos. 1968. The Orbis Pictus of John Amos Comenius. Detroit, MI: Singing Tree.

Small, Mary Luins. 1990. "The Pansophism of John Amos Comenius (1592–1670) as the Foundation of Educational Technology and the Source of Constructive Standards for the Evaluation of Computerized Instruction and Tests." International Conference on Technology and Education, March 1990. ERIC. ED325079, microfiche, 1–11.

Sunday, July 10, 2022

Kellerman - The End of Leadership?

Barbara Kellerman is a Professor at Harvard’s Kennedy School of Government. Long a critic of most approaches to leadership theory and training, she concentrates instead on what leadership is not, and on what she calls ‘followership’. She has written many books on the topic, from Bad Leadership (2004) to The End of Leadership (2012), has won many awards and is an accomplished and prolific international speaker.

History of leadership

Critical of the recent 40-year fixation on leadership, she faults those who ignore the history of leadership, from Plato onwards. She points to his view that someone should only become a leader at 50 or above, as lived experience is often critical. Calling on Machiavelli, Locke, Hobbes, Montesquieu, Marx and others, she is surprised that this rich set of reflective views from history is ignored in modern leadership scholarship. In particular, she is critical of the idea that leaders in themselves are what matter, the ‘Great Man (usually)’ theories of leadership that focus on qualities and traits, as if there were an essence of leadership that could be distilled and used as a potion or remedy. Hitler, she claims, did not kill 6 million Jews in the Holocaust. In fact, he killed not a single Jew, so reading history simply through biographies is a mistake.

For Kellerman, the Pollyanna world of leadership literature and training is unreal. Power, authority and influence are real. Leaders pay themselves more and more, are glad to reduce costs and numbers of employees, and shift to cheaper manufacturing and services abroad. She often refers to Putin, Xi Jinping, Erdogan and Trump as examples of leadership as the exercise of power. Never have we had so much attention, books, money and training devoted to leadership – and so little of it. We must therefore look, like good historians, at context.

Leadership industry

She sees the ‘Leadership industry’ as a 40-year aberration, an industry that has dominated management training, to its detriment. In general, she is dissatisfied with the fixation on leadership and the focus on the traits of leaders. Look around, she asks: has leadership improved after 40 years of this focus on leadership and leadership training? The scholarly evidence for success is scant. Leadership, she thinks, needs to be seen as a system, not a person, which is why she is sceptical about the leadership industry and its Pollyanna recommendations. It has become a money-making proposition, a leadership-industrial complex. But it is complex in another sense, in that it cannot be taught easily and quickly.

Leadership ‘attribution error’

Critical of the leadership industry, with its countless courses, centres, workshops, books and rhetoric devoted to leadership, she puts most of it down to an ‘attribution error’: the tendency to attribute all successes and failures to leaders and leadership. This she regards as a basic and naïve mistake, but one that drives untold amounts of unnecessary spend in organisations on consultancy and training.

In fact, most sophisticated political developments have been about the devolution of power away from leaders to others, a trend that continued from the end of the 20th C into the 21st C. Cultural change, and therefore context, has rendered leaders less important.

Most of the books and research focus on good traits for leaders and fail to examine bad leadership. She proposes seven types of bad leader in Bad Leadership (2004):

1.     Ineffective

2.     Rigid

3.     Intemperate

4.     Insular

5.     Corrupt

6.     Callous

7.     Evil

What we have to recognise is that circumstances are often volatile, uncertain, complex, ambiguous, globalised and driven by technological change. These contexts are complex, so rather than focus on traits, we should look at context in its plural, proximate, distal and temporal dimensions. This is not about you; it is about change.


We have become fixated on and obsessed by leaders at the expense of followers and the common good. As we overestimate the power of leaders, assuming that power, authority and influence lie with them, we correspondingly underestimate the power and influence of followers. Yet there is no leader without a follower; leaders do not exist without followers. The focus is therefore on the wrong end of the problem. We have too many leaders, a cult of individualism rather than community, when what really matters is to be realistic about followers, again a complex issue, which she classifies, by level of engagement, into:

Isolates - detached, neither knowing nor caring about their leaders

Bystanders - observers who deliberately stand aside and disengage

Participants - engaged in some way, whether for or against their leaders

Activists - feel strongly about their leaders and act accordingly

Diehards - deeply devoted, prepared to go down for their leader or cause
Leadership and followership are intertwined. In understanding the idea and varieties of followership, one therefore understands leadership. The idea that everyone is a leader is, for her, ridiculous, as they are in practice, mostly various types of follower. The Western trajectory, since the Enlightenment, has seen power and influence wane and become increasingly devolved, so we have a very different context, one where followership is dominant.


Interest in Kellerman’s ideas has grown in the light of recent political events, especially in the US. She saw a turning point with Clinton and Monica Lewinsky in 1998, which degraded attitudes towards leadership. Will we continue with these more recent political divides? Avoid the regression to autocracy? We are seeing serious cleavages in countries, as followers have had enough of their leaders, now seen as elite and out of touch. We have seen this in the US, France, UK and many other countries. A big mistake in business schools is that leadership courses focus on the organisation, not the common good. We need to move away from this fixation on leadership.


It is unfortunate that serious scholars, such as Kellerman and Pfeffer, are often ignored by consultants and trainers in leadership, as they offer a more sophisticated interpretation of the complexity of the issues, rather than the delivery of simple platitudes.

Along with Pfeffer at Stanford, Kellerman provides a refreshing and more sophisticated theory of leadership and followership that escapes the normal focus on traits. Leaders are much more weakly positioned than they used to be. Influence matters more. Hierarchies are more horizontal.


Kellerman, B., 2012. The End of Leadership. New York, NY: Harper Business.

Kellerman, B., 2008. Followership: How followers are creating change and changing leaders. Boston, MA: Harvard Business Press.

Kellerman, B., 2004. Bad leadership: What it is, how it happens, why it matters. Harvard Business Press.

Kellerman, B., 1999. Reinventing leadership: Making the connection between politics and business. SUNY Press.

Kellerman, B., 1984. The political presidency: Practice of leadership. Oxford University Press, USA.