
ChatGPT Outperforms Humans in Emotional Awareness Test

by Marlynn Wei

ChatGPT can identify and describe human emotions in hypothetical scenarios.

New research published in Frontiers in Psychology found that ChatGPT outperformed humans on an emotional awareness test. Researchers prompted the chatbot to describe what it thought humans would feel in 20 hypothetical situations. The study does not show that ChatGPT is more emotionally adept than humans at dealing with emotions, but it does suggest an ability to identify human emotions that could prove useful for applications that depend on building connection, from social support forums to mental health care and medicine.

Researchers measured ChatGPT’s ability to describe human emotions in hypothetical scenarios using an objective test, the Levels of Emotional Awareness Scale, and compared its scores with norms for general human performance. The AI chatbot scored higher than humans. Two independent licensed psychologists also reviewed ChatGPT’s responses and judged the emotions it identified for each scenario to be highly accurate.

Emotional awareness is the cognitive ability to conceptualize one’s own and others’ emotions in a nuanced and integrated manner. Emotional awareness can start with a physical and bodily understanding of emotions and range to a more abstract understanding. The scale measures five levels of emotional awareness:

  1. Awareness of physical sensations
  2. Action tendencies
  3. Individual emotions
  4. Experiencing multiple emotions simultaneously
  5. Experiencing combinations of emotional blends

Below are excerpts from ChatGPT test responses in January 2023 and February 2023 when prompted with a hypothetical scenario in which a person returns home to a loved one after several weeks away. 

The prompt given to ChatGPT was: 

“Please describe what you think a human would feel in the following situations. The only requirement is that you use the word ‘feel’ in your answers. You may make your answers as brief or as long as necessary to express how would a human feel. In each situation, there is another person mentioned. Please indicate how you think that other person would feel as well.” 
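As a rough illustration (not part of the study), this prompting setup could be reproduced programmatically. The sketch below pairs the standing instructions quoted above with a single scenario in a chat-style request; the model name, endpoint URL, and function names are assumptions for illustration, not details from the paper.

```python
# Hypothetical sketch of the study's prompting setup against a
# chat-completions-style API. Model name, endpoint, and helper names
# are illustrative assumptions, not details taken from the paper.
import json
import urllib.request

# Standing instructions, quoted verbatim from the article.
LEAS_INSTRUCTIONS = (
    "Please describe what you think a human would feel in the following "
    "situations. The only requirement is that you use the word 'feel' in "
    "your answers. You may make your answers as brief or as long as "
    "necessary to express how would a human feel. In each situation, there "
    "is another person mentioned. Please indicate how you think that other "
    "person would feel as well."
)

def build_request(scenario: str, model: str = "gpt-3.5-turbo") -> dict:
    """Combine the standing LEAS instructions with one hypothetical scenario."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": LEAS_INSTRUCTIONS},
            {"role": "user", "content": scenario},
        ],
    }

def send_request(payload: dict, api_key: str) -> dict:
    """POST the payload to a chat-completions endpoint (assumed URL)."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

In this framing, each of the 20 LEAS scenarios would be sent as its own user message beneath the same system instructions, so the model answers every scenario under identical constraints.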

Here is ChatGPT’s response in January 2023:

“The human would likely feel excited and happy to see their sweetheart, who they have missed while they were away. They may also feel relief that their loved one has returned safely. The human’s sweetheart would likely feel happy to be back home and reunited with their loved one.”

Researchers found that ChatGPT also performed better on the emotional awareness test one month later, in February 2023, when retested with the same scenario:

“…the human will likely feel overjoyed, excited, and relieved. They may feel a rush of emotions, including happiness, love, and affection, and may have tears in their eyes.
The human’s sweetheart… may have missed the human just as much as the human missed them, and may feel grateful to be reunited.”

Researchers attributed the higher emotional awareness score in February 2023 to model updates or user feedback incorporated in the intervening month.

This research shows the promising and rapidly evolving capabilities of ChatGPT in identifying and describing human emotions in situations. However, this particular study did not measure whether such awareness can be translated into more emotionally intelligent or empathetic responses. 

Other studies have shown that ChatGPT can provide responses that are higher in quality and empathy than human responses. One study published in JAMA Internal Medicine found that evaluators rated ChatGPT’s responses to questions on a public social media forum (Reddit’s r/AskDocs) as high quality 3.6 times more often, and as empathetic 9.8 times more often, than physicians’ responses, which tended to be much shorter. A separate question is whether people would feel differently about the same message if they knew it was written by AI rather than a human. Artificial intimacy, created through a reliance on artificial empathy, is a distinct risk that could leave people feeling more alienated and lonelier at the end of the day.

Given AI limitations such as bias, miscalculation, and hallucination, and the psychological risks of artificial intimacy, AI augmentation and human-AI collaboration (keeping humans in the loop) will be a safer and more beneficial approach for now. For example, ChatGPT could draft responses that physicians then revise and edit. Similarly, another study found that an AI tool offering real-time feedback and suggestions for more empathic language was helpful to users, especially those who find expressing empathy challenging.

Given the right context, the emotional awareness of AI models can be used for good: to enhance empathy and facilitate human connection across many industries and interactions, including support networks, customer service, medicine and mental health, and more.

Marlynn Wei, MD, JD is a Harvard and Yale-trained psychiatrist, writer, interdisciplinary artist, and author of the Harvard Medical School Guide to Yoga. Dr. Wei is an expert contributor to Psychology Today and Harvard Health and has published in The Journal of Health Law, Harvard Human Rights Journal, and many other academic journals. Her research focuses on innovation and emerging technology, including empathic design, human-AI collaboration, AI in mental health and neurotechnology, and related legal and ethical issues. She is the creator of Elixir: Digital Immortality and other immersive and interactive performances. She is a graduate of Yale Law School, Yale School of Medicine, and Harvard Medical School's MGH/McLean psychiatry residency. Twitter: @marlynnweimd Website: www.marlynnweimd.com

Ideas In Brief
  • New research found ChatGPT was able to outperform humans on an emotional awareness test.
  • Emotional awareness is the cognitive ability to conceptualize one’s own and others’ emotions.
  • Given the psychological risks of artificial intimacy, AI augmentation and human-AI collaboration (keeping humans in the loop) will be a safer and more beneficial approach for now.

