

6 Ways to Improve Psychological AI Apps and Chatbots

by Marlynn Wei
3 min read


Repetitiveness, complicated setups, and lack of personalization deter users.

A new study of an AI chatbot and smartphone app designed to reduce drinking shows that users dislike repetitiveness, a lack of individualized guidance, and complicated setups. The study interviewed users to find out which barriers caused people to abandon the app or the chatbot.

Apps and chatbots can deliver effective interventions to improve sleep, decrease alcohol use, and reduce anxiety and depression, but the challenge is keeping users on the app. Patterns of user engagement vary widely in terms of frequency, intensity, timing, and accessed features. 

Sustained user engagement is a key factor in the success of psychological apps and chatbots. The number of app installs can be high, but only a small percentage of users use mental health apps consistently over time. One study found that after one week, only 5 to 19% of users continued to use mental health apps. Even when content is helpful, dropout rates are high. 

Features that increase engagement are appealing visual design, easy navigation, goal setting, reminders, and feedback. New content and a supportive positive tone keep users coming back.

One way to measure user experience is the Mobile App Rating Scale (MARS), which examines the dimensions of engagement, functionality, aesthetics, and information quality. Other methods include user experience interviews and analysis of consumer reviews.

Researchers in a recent study conducted semi-structured user interviews and found that the top reasons users stopped engaging were technology glitches, notification issues, repetitive material, and a long or glitchy setup. With the AI chatbot, users were frustrated by repetitive conversation, lack of control over navigation, and the delivery platform.

Here are six features to enhance user engagement of psychological AI apps and chatbots:

1. Make setup easy. Complicated and glitchy setups deter users. One participant in the study described how their data disappeared after re-registration was required. Informed consent is ethically necessary for apps and chatbots handling personal mental health data, but a streamlined setup is equally important.

2. Offer tracking. Tracking is an important way to get people to interact with the app or chatbot regularly. More importantly, tracking raises awareness and can change behavior. In mindfulness practice, this is called developing an “observer mind,” a powerful stress-management skill and catalyst for change. For example, tracking the number of alcoholic drinks consumed each day helps people recognize automatic habits.
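The tracking idea is simple enough to sketch. The snippet below is a minimal illustration (not from the study — the `DrinkTracker` class and its methods are hypothetical) of a daily drink log that supports the "observer mind" pattern: record each drink, then surface the running daily total back to the user.

```python
from dataclasses import dataclass, field
from datetime import date
from collections import defaultdict

@dataclass
class DrinkTracker:
    """Hypothetical daily drink log: each entry raises awareness
    of a habit by making the running total visible."""
    log: dict = field(default_factory=lambda: defaultdict(int))

    def record(self, day: date, drinks: int = 1) -> None:
        self.log[day] += drinks

    def daily_total(self, day: date) -> int:
        # Unlogged days read as zero, so feedback never errors out.
        return self.log[day]

tracker = DrinkTracker()
tracker.record(date(2024, 3, 1), 2)
tracker.record(date(2024, 3, 1))
print(tracker.daily_total(date(2024, 3, 1)))  # 3
```

The point is less the data structure than the feedback loop: every `record` call gives the user something concrete to observe.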

3. Provide personalized feedback and accurate insights. Individualized guidance based on one’s own data gives people feedback and insight into their patterns. Tracking anxiety levels and their timing can help predict anxiety episodes and narrow down potential triggers. Accuracy is critical: one participant reported that the app claimed they had met their daily goal when they had not. Such inaccuracy reduces user confidence in the app.
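A small sketch of what "insight from one's own data" can mean in practice — the function and data shape here are illustrative assumptions, not the study's implementation. Given timestamped anxiety ratings, it finds the hour of day with the highest average level, a data-grounded observation an app could surface instead of a generic tip.

```python
from collections import Counter

def peak_anxiety_hour(entries):
    """entries: list of (hour_of_day, anxiety_level) tuples logged
    by the user. Returns the hour with the highest mean anxiety --
    a simple personalized insight derived from the user's own data."""
    totals, counts = Counter(), Counter()
    for hour, level in entries:
        totals[hour] += level
        counts[hour] += 1
    return max(counts, key=lambda h: totals[h] / counts[h])

entries = [(9, 2), (9, 3), (21, 7), (21, 8), (14, 4)]
print(peak_anxiety_hour(entries))  # 21 -- evenings look hardest
```

Even a heuristic this basic helps narrow down triggers, and because it is computed directly from logged data, it avoids the accuracy failures that erode user trust.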

4. Make interactions less repetitive. Overly scripted and repetitive bots drive users away. As in therapy, the therapeutic alliance between the user and the conversational agent determines whether people return. Novelty and a positive tone help make the interaction therapeutic.

5. Ensure notifications are customizable, accurate, and timely. Faulty or absent notifications can deter users. If the app is based on changing daily habits, the timing of daily reminders is essential.
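Timeliness is easy to get wrong in practice. Below is a minimal sketch (the function name and scheduling logic are my own illustration, not the study's) of a customizable daily reminder: the user picks the hour, and the app schedules the next occurrence rather than firing at an arbitrary moment.

```python
from datetime import datetime, time, timedelta

def next_reminder(now: datetime, preferred: time) -> datetime:
    """Compute the next daily reminder at the user's chosen time.
    Letting users pick the hour keeps habit reminders timely
    instead of interrupting them at arbitrary moments."""
    candidate = now.replace(hour=preferred.hour, minute=preferred.minute,
                            second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(days=1)  # today's slot has passed; use tomorrow
    return candidate

now = datetime(2024, 3, 1, 20, 30)
print(next_reminder(now, time(19, 0)))  # 2024-03-02 19:00:00
```

For a habit-change app, this user-set time is the difference between a reminder that lands before the risky hour and one that arrives after the habit has already played out.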

6. Prioritize user agency and avoid bottlenecking navigation with an unwelcome bot. Users should be able to navigate to resources on their own rather than be forced to interact with a bot. Users described being frustrated at having to go through a bot to reach basic features. One participant in the study described how it felt “strange” to have the bot constantly interrupting them while they were working on a task. This is reminiscent of Microsoft’s Clippy, which became notorious for frustrating users.

These features will make psychological AI apps and chatbots more effective. Integrating personalized feedback, high-quality dynamic conversations, and a smooth, glitch-free setup will improve both user engagement and enjoyment.


Marlynn Wei, MD, JD is a Harvard and Yale-trained psychiatrist, writer, interdisciplinary artist, and author of the Harvard Medical School Guide to Yoga. Dr. Wei is an expert contributor to Psychology Today and Harvard Health and has published in The Journal of Health Law, Harvard Human Rights Journal, and many other academic journals. Her research focuses on innovation and emerging technology, including empathic design, human-AI collaboration, AI in mental health and neurotechnology, and related legal and ethical issues. She is the creator of Elixir: Digital Immortality and other immersive and interactive performances. She is a graduate of Yale Law School, Yale School of Medicine, and Harvard Medical School's MGH/McLean psychiatry residency. Twitter: @marlynnweimd Website: www.marlynnweimd.com

Ideas In Brief
  • Personalized feedback, high-quality dynamic conversations, and a streamlined setup improve user engagement.
  • People dislike an overly scripted and repetitive AI chatbot that bottlenecks access to other features.
  • Tracking is a feature that engages users and develops an “observer mind,” enhancing awareness and change.
  • New research shows that users disengage from AI apps and chatbots that are repetitive, lack personalized advice, and have long or glitchy setup processes.

