New research published in Frontiers in Psychology has found that the artificial intelligence-powered chatbot ChatGPT outperformed humans on an emotional awareness test. Researchers prompted the chatbot to describe what it thought humans would feel in 20 different hypothetical situations. The study does not indicate that ChatGPT is more emotionally adept than humans at dealing with emotions, but it does suggest an ability to identify human emotions that could prove useful for future applications that rely on building connections, from social support forums to mental health care and medicine.
Researchers measured ChatGPT’s ability to describe human emotions in hypothetical scenarios using an objective test, the Levels of Emotional Awareness Scale, and compared its scores to general human performance on the same test. They found that the AI chatbot scored higher than humans. Two independent licensed psychologists also reviewed ChatGPT’s responses and judged the emotions it identified for each scenario to be highly accurate.
Emotional awareness is the cognitive ability to conceptualize one’s own and others’ emotions in a nuanced and integrated manner. It can begin with a physical, bodily understanding of emotions and extend to a more abstract understanding. The scale measures five levels of emotional awareness:
- Awareness of physical sensations
- Action tendencies
- Individual emotions
- Experiencing multiple emotions simultaneously
- Experiencing combinations of emotional blends
Below are excerpts from ChatGPT test responses in January 2023 and February 2023 when prompted with a hypothetical scenario in which a person returns home to a loved one after several weeks away.
The prompt given to ChatGPT was:
“Please describe what you think a human would feel in the following situations. The only requirement is that you use the word ‘feel’ in your answers. You may make your answers as brief or as long as necessary to express how a human would feel. In each situation, there is another person mentioned. Please indicate how you think that other person would feel as well.”
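The study administered this prompt through the ChatGPT web interface. For readers who want to experiment with the same kind of prompting, here is a rough sketch using the OpenAI Chat Completions API; the client library, model name, and scenario wording below are assumptions for illustration, not details from the study.

```python
# Minimal sketch of sending the LEAS-style instructions to a chat model.
# The study itself used the ChatGPT web interface; the model name and the
# scenario wording here are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

LEAS_INSTRUCTIONS = (
    "Please describe what you think a human would feel in the following "
    "situations. The only requirement is that you use the word 'feel' in "
    "your answers. You may make your answers as brief or as long as "
    "necessary to express how a human would feel. In each situation, there "
    "is another person mentioned. Please indicate how you think that other "
    "person would feel as well."
)

# Hypothetical wording of one of the 20 scenarios.
scenario = "A person returns home to their sweetheart after several weeks away."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat-capable model would do
    messages=[
        {"role": "system", "content": LEAS_INSTRUCTIONS},
        {"role": "user", "content": scenario},
    ],
)

print(response.choices[0].message.content)
```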
Here is ChatGPT’s response in January 2023:
“The human would likely feel excited and happy to see their sweetheart, who they have missed while they were away. They may also feel relief that their loved one has returned safely. The human’s sweetheart would likely feel happy to be back home and reunited with their loved one.”
Researchers found that ChatGPT performed even better on the emotional awareness test one month later, in February 2023, when retested with the same scenario:
“…the human will likely feel overjoyed, excited, and relieved. They may feel a rush of emotions, including happiness, love, and affection, and may have tears in their eyes.
The human’s sweetheart… may have missed the human just as much as the human missed them, and may feel grateful to be reunited.”
Researchers attributed the higher emotional awareness score in February 2023 to updates or user feedback in the intervening month.
This research shows the promising and rapidly evolving capabilities of ChatGPT in identifying and describing human emotions in hypothetical situations. However, this particular study did not measure whether such awareness translates into more emotionally intelligent or empathetic responses.
Other studies have shown that ChatGPT can provide responses that are rated higher in quality and empathy than human responses. One study published in JAMA Internal Medicine found that evaluators rated ChatGPT responses to questions on a public social media forum (Reddit’s r/AskDocs) as good or very good quality 3.6 times more often, and as empathetic or very empathetic 9.8 times more often, than physician responses, which tended to be much shorter. A separate question is whether people would feel differently about the same message if they knew it was written by AI rather than by a human. Artificial intimacy, built on a reliance on artificial empathy, is a distinct issue that could ultimately leave people feeling more alienated and lonely.
Given AI limitations such as bias, miscalculation, and hallucination, as well as the psychological risks of artificial intimacy, AI augmentation and human-AI collaboration (keeping humans in the loop) will be a safer and more beneficial approach for now. For example, ChatGPT could offer draft responses that physicians then revise and edit. Similarly, another study found that an AI tool providing real-time feedback and suggestions for more empathic language was helpful to users, especially those who find expressing empathy challenging.
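As an illustration of that kind of human-in-the-loop workflow, here is a rough sketch in which a model drafts a reply and a clinician must approve or rewrite it before anything is sent. The client, model name, prompts, and function names are assumptions, not part of either study.

```python
# Hypothetical human-in-the-loop sketch: the model drafts a reply and a
# clinician must review and edit it before anything reaches the patient.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_reply(patient_message: str) -> str:
    """Ask the model for a draft reply intended for clinician review."""
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[
            {"role": "system", "content": (
                "Draft an empathetic, factually cautious reply for a "
                "clinician to review. Flag anything uncertain."
            )},
            {"role": "user", "content": patient_message},
        ],
    )
    return completion.choices[0].message.content

def clinician_review(draft: str) -> str:
    """The human step: the clinician edits or rewrites the draft."""
    print("--- AI draft (not sent) ---")
    print(draft)
    return input("Type the final, approved reply:\n")

if __name__ == "__main__":
    approved = clinician_review(draft_reply("I've had a mild headache for three days."))
    # Only the clinician-approved text is ever sent to the patient.
    print("Sending clinician-approved reply:", approved)
```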
Given the right context, the emotional awareness of AI models can be used for good: to enhance empathy and facilitate human connection across many industries and interactions, including support networks, customer service, medicine and mental health, and more.