Sunday, June 25, 2023

Can machines have empathy and other emotions?

Can machines have empathy and other emotions? Yann LeCun thinks they can and I agree, but it is a qualified agreement. This will matter if AI is to become a Universal Teacher and have the qualities of an expert teacher.

 

One must start with what emotions are. There has been a good deal of research on this by Krathwohl, Damasio and Immordino-Yang, Lakoff and Panksepp. Good work has also been done on uncovering the role of emotion in learning by Nick Shackleton-Jones. I also covered them all in this podcast.


We must also make the distinction between:


Emotional recognition

Display of emotion

Feeling emotions

 

Emotional recognition

The face is a primary indicator of emotions and we look for changes in facial muscles, such as raised eyebrows, narrowed or widened eyes, smiles, frowns or a clenched jaw. Facial scanning can certainly identify emotions using this route. Eye contact is another: a steady gaze can show interest, even anger, while avoiding eye contact can indicate disinterest, shyness, unease or guilt. Microexpressions are also recognisable as expressions of emotion. Note that recognising all of this is often a weakness in humans, with significant differences between men and women, and in those with autism. Machine emotion recognition is well on its way to matching most humans and will most likely surpass us.
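
To make this concrete, here is a minimal sketch of face-based emotion recognition using the open-source fer library with OpenCV. The filename and settings are illustrative assumptions; this is a sketch of the technique, not any particular vendor's implementation.

```python
# A minimal sketch of facial emotion recognition, assuming the open-source
# `fer` library and OpenCV are installed (pip install fer opencv-python).
import cv2
from fer import FER

detector = FER(mtcnn=True)               # MTCNN gives more robust face detection
img = cv2.imread("student_webcam.jpg")   # hypothetical input frame

for face in detector.detect_emotions(img):
    box = face["box"]             # (x, y, width, height) of the detected face
    emotions = face["emotions"]   # e.g. {'angry': 0.02, 'happy': 0.91, ...}
    top = max(emotions, key=emotions.get)
    print(box, top, round(emotions[top], 2))
```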

 

Vocal tone and volume are also significant: tone of voice, intonation and pitch; raised volume when aroused or angry; a quiet or softer tone when sad or reflective; an upbeat tone when happy. Body language is another channel, clearly readable by scanning for folded arms and movements that show unease, disinterest or anger.
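
As a crude sketch of how such vocal cues can be extracted, the librosa audio library can pull out mean pitch and loudness as rough proxies for arousal. The filename and thresholds here are purely illustrative; real systems use far richer models.

```python
# Prosodic feature extraction with librosa: pitch and energy as arousal proxies.
import librosa
import numpy as np

y, sr = librosa.load("utterance.wav")    # hypothetical recording

# Fundamental frequency (pitch) via probabilistic YIN
f0, voiced_flag, _ = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr)
mean_pitch = np.nanmean(f0)              # NaNs mark unvoiced frames

# Root-mean-square energy as a loudness proxy
rms = librosa.feature.rms(y=y).mean()

# Toy heuristic: high pitch plus high energy suggests arousal (anger, excitement)
aroused = mean_pitch > 200 and rms > 0.05   # thresholds are illustrative only
print(f"pitch={mean_pitch:.0f} Hz, rms={rms:.3f}, aroused={aroused}")
```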

 

Even at the level of text, one can use sentiment analysis to spot a range of emotions, as emotions are encoded in language. LLMs show this quite dramatically. This can be used to semantically interpret text that reveals a whole range of emotions. It can be used over time, for example, to spot failing students who show negativity in a course. It can be used at an individual level or provide insights into social media monitoring, public opinion, customer feedback, brand perception and other areas where understanding sentiment is valuable. LLMs are improving fast here, although sentiment analysis may still struggle with sarcasm, irony and complex language use.
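
A minimal sketch of the idea, using the Hugging Face transformers sentiment pipeline (its default English model) over a student's forum posts to flag sustained negativity. The posts and the flagging threshold are illustrative assumptions.

```python
# Text sentiment over time: flag a student whose recent posts turn negative.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")

posts = [  # hypothetical posts from one student, in chronological order
    "Really enjoying this module so far.",
    "I don't understand the assignment at all.",
    "I'm completely lost and thinking of dropping out.",
]

# Map each post to a signed score: positive -> +score, negative -> -score
scores = [
    r["score"] if r["label"] == "POSITIVE" else -r["score"]
    for r in classifier(posts)
]

# Flag the student if the recent trend is strongly negative
if sum(scores[-2:]) / 2 < -0.5:
    print("Flag for tutor follow-up:", [round(s, 2) for s in scores])
```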

 

AI can already understand music in some sense, even its emotional intent and effect. Spotify already classifies tracks using these criteria with AI. This is not to say it feels emotion.
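
Spotify's internal mood models are not public, but a rough sketch of the idea can use its audio-features endpoint, via the spotipy client, where valence and energy place a track in a simple mood quadrant. Credentials are assumed to be set in the SPOTIPY_CLIENT_ID and SPOTIPY_CLIENT_SECRET environment variables, and the track URI is just an example.

```python
# Mapping a track's valence and energy onto a crude mood quadrant.
import spotipy
from spotipy.oauth2 import SpotifyClientCredentials

sp = spotipy.Spotify(auth_manager=SpotifyClientCredentials())

features = sp.audio_features(["spotify:track:4uLU6hMCjMI75M1A2tKUQC"])[0]
valence, energy = features["valence"], features["energy"]  # both in [0, 1]

# Toy mapping onto Russell's valence-arousal circumplex
if valence >= 0.5:
    mood = "happy/excited" if energy >= 0.5 else "calm/content"
else:
    mood = "angry/tense" if energy >= 0.5 else "sad/melancholy"
print(f"valence={valence:.2f}, energy={energy:.2f} -> {mood}")
```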

 

Even at the level of ‘recognition, it could very well be that machine help humans control and modulate bad emotions. I’m sure that feedback loops can calm people down and encourage emotional intelligence. The fact that machines could read stimuli quicker than us and respond quicker, may mean it is better at empathy than we could ever be. Recognising emotion will allow AI to respond appropriately to our needs and should not be dismissed. It can be used as a means to many ends, from education to mental healthcare. Chatbots are already being used to deliver CBT therapy. 
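
A toy sketch of such a feedback loop: monitor a rolling window of sentiment and, when it turns negative, respond with a de-escalating prompt. The score_sentiment function is a hypothetical stand-in for any classifier, such as the pipeline sketched above, and the window size and threshold are arbitrary.

```python
# A feedback loop that de-escalates when recent sentiment turns negative.
from collections import deque

def score_sentiment(text: str) -> float:
    """Hypothetical scorer returning -1.0 (negative) to +1.0 (positive)."""
    return -0.8 if "hate" in text.lower() else 0.3

window = deque(maxlen=3)  # rolling window of recent sentiment scores

def respond(message: str) -> str:
    window.append(score_sentiment(message))
    if sum(window) / len(window) < -0.4:
        return "This seems frustrating. Shall we slow down and recap?"
    return "Great, let's continue to the next step."

print(respond("I hate this exercise"))   # -> calming, de-escalating reply
```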

 

Display of emotion

Emotions can be displayed without being felt. Actors can do this, written words in a novel can do this, and both can elicit strong human emotions. Coaches do this frequently. Machines can also do this. From the earliest chatbots, such as ELIZA, that has been clear. Nass and Reeves showed, across 35 studies in The Media Equation, that this reading of human qualities and emotions into machines is common.


As Panksepp repeatedly says, we have a tendency to think of emotions as human and therefore ‘good’. Their evolutionary development means they are there for different reasons than we think, which is why they often overwhelm us and have dangerous as well as beneficial consequences. Most crime is driven by emotional impulses such as unpredictable anger, especially violent and sexual crime. This would lead us to conclude that the display of positive emotions should be encouraged and bad ones designed out of the system. There are already efforts to build fairness, kindness, altruism and mercy into systems. It is not just a matter of having a full set of emotions, more a matter of what emotions we want these systems to display or have.

 

Feeling emotions

This would require AI to be fully embedded in a physical nervous system that can feel, in the sense that we feel emotions in the brain. It also seems to require consciousness of the feelings themselves. We could dismiss this as impossible, but there are halfway houses here and there is another possibility. Geoffrey Hinton has posited the ‘mortal computer’, and hybrid computer-brain interfaces could very well blur this distinction, integrating thought with human emotions in ways not yet experienced, even subconsciously. But we may not need to go this far.

 

Are emotions necessary in teaching?

I have always been struck by Donald Norman’s argument that this call for empathy in design is wrong-headed, that “the concept is impossible, and even if possible, wrong”. As Norman says, “It sounds wonderful but the search for empathy is simply misled.” There is no way you can put yourself into the heads of hundreds, thousands, even hundreds of thousands of learners. Not only is it impossible to understand individuals in this way, it is just not that useful. It is not empathy but data you need. Who are these people, what do they need to actually do and how can we help them? As people they will be hugely variable, but what they need to know and do, in order to achieve a goal, is relatively stable. This has little to do with empathy and a lot to do with understanding and reason.

 

Sure, the emotional side of learning is important, and people like Norman have written and researched the subject extensively. Positive emotions help people learn (Um et al., 2012). Even negative emotions (D’Mello et al., 2014) can help people learn, stimulating attention and motivation, as can mild stress (Vogel and Schwabe, 2016). We also know that emotions induce attention (Vuilleumier, 2005) and the kind of motivation that can be described as curiosity, where the novel or surprising stimulates active interest (Oudeyer et al., 2016). In short, emotional events are remembered longer, more clearly and more accurately than neutral events.

 

All too often in the learning world we latch on to a noun without thinking much about what it actually means or what experts in the field say about it, and bandy it about as though it were a certain truth. But trying to induce emotion in the teaching and design process may not be that relevant, or only relevant to the degree that mimicking emotion may be enough. AI can be designed to induce and steer the learner towards positive emotions and away from the emotions, identified by Panksepp and others, that harm learning, such as fear, anxiety and anger. We are in such a rush to include ‘emotion’ in design that we confuse emotion in the learning process with emotion in the teacher and designer. It also seems like lazy signalling, a default to the loose language of concern and sympathy instead of doing the hard analysis up front.

 

Conclusion

In discussing emotions, we tend to think of them as a uniquely human phenomenon. They are not. Animals clearly have emotions. This is not a case for human exceptionalism. In other words, beings with less complexity than us can feel. At what point, therefore, can a bottom-up process create machines that can feel? We seem to be getting there, and have come quite far, having reached ‘recognition’ and ‘display’.

 

If developments in AI have taught us one thing, it is to never say never. Exponential advances are being made and this will continue, with some of the largest companies making huge investments, along with a significant shift in research and government intentions. We already have the recognition and display of emotions. The feeling of emotions may be far off, but it is also unnecessary for many tasks, even teaching and learning.

 


In medicine, empathy is already being helped by GPT-4; patients can benefit from a machine that is both knowledgeable and empathetic. We see this already in healthcare in the Ayers (2023) research, where patients rated the chatbot significantly higher than physicians for both quality and empathy 79% of the time. That is before the obvious benefits of being available 24/7, getting quicker results, increased availability of healthcare in rural areas, access for the poor and a decreased workload for healthcare systems. It empowers the patient. For more on this area of AI helping patients with empathy, listen to Peter Lee’s excellent podcast here. He shows that even pseudo-empathy can run deep and be used in many interactions with teachers, doctors, in retail and so on.

This is why I think the Universal Teacher and Universal Doctor are now on the horizon.

 

Bibliography

Ayers, J.W. et al., 2023. Comparing Physician and Artificial Intelligence Chatbot Responses to Patient Questions Posted to a Public Social Media Forum. JAMA Internal Medicine, 183(6).

Norman, D.A., 2004. Emotional design: Why we love (or hate) everyday things. Basic Civitas Books.

Norman, D., 2019. Why I Don't Believe in Empathic Design.

Um, E., Plass, J.L., Hayward, E.O. and Homer, B.D., 2012. Emotional design in multimedia learning. Journal of Educational Psychology, 104(2), p.485.

D’Mello, S., Lehman, B., Pekrun, R. and Graesser, A., 2014. Confusion can be beneficial for learning. Learning and Instruction, 29, pp.153-170.

Vogel, S. and Schwabe, L., 2016. Learning and memory under stress: implications for the classroom. npj Science of Learning, 1(1), pp.1-10.

Vuilleumier, P., 2005. How brains beware: neural mechanisms of emotional attention. Trends in Cognitive Sciences, 9(12), pp.585-594.

Oudeyer, P.Y., Gottlieb, J. and Lopes, M., 2016. Intrinsic motivation, curiosity, and learning: Theory and applications in educational technologies. Progress in Brain Research, 229, pp.257-284.

https://greatmindsonlearning.libsyn.com/affective-learning-with-donald-clark 
