​6 Ways to Improve Psychological AI Apps and Chatbots

by Marlynn Wei
3 min read

Repetitiveness, complicated setups, and lack of personalization deter users.

A new study of an AI chatbot and a smartphone app designed to reduce drinking shows that users dislike repetitiveness, a lack of individualized guidance, and complicated setups. The researchers interviewed users to find out which barriers caused people to abandon either tool.

Apps and chatbots can deliver effective interventions to improve sleep, decrease alcohol use, and reduce anxiety and depression, but the challenge is keeping users on the app. Patterns of user engagement vary widely in terms of frequency, intensity, timing, and accessed features. 

Sustained user engagement is a key factor in the success of psychological apps and chatbots. The number of app installs can be high, but only a small percentage of users use mental health apps consistently over time. One study found that after one week, only 5 to 19% of users continued to use mental health apps. Even when content is helpful, dropout rates are high. 

Features that increase engagement include appealing visual design, easy navigation, goal setting, reminders, and feedback. Fresh content and a supportive, positive tone keep users coming back.

One way to measure user experience is the Mobile App Rating Scale (MARS), which examines engagement, functionality, aesthetics, and information quality. Another is conducting user experience interviews or analyzing consumer reviews.

Researchers in a recent study conducted semi-structured user interviews and found that the top reasons users stopped engaging were technology glitches, notification issues, repetitive material, and a long or glitchy setup. With the AI chatbot, users were frustrated by repetitive conversation, lack of control over navigation, and the delivery platform.

Here are six features that enhance user engagement with psychological AI apps and chatbots:

1. Make setup easy. A complicated and glitchy setup deters users. One participant in the study described how their data disappeared after they were required to re-register. Informed consent is ethically necessary for apps and chatbots that handle personal mental health data, but a streamlined setup is equally important.

2. Offer tracking. Tracking is an important way to get people to interact with the app or chatbot regularly. More importantly, tracking raises awareness and can change behavior. Mindfulness practice calls this developing an “observer mind,” a powerful stress management skill and catalyst for change. For example, tracking the number of alcoholic drinks one has each day helps people recognize automatic habits.

3. Provide personalized feedback and accurate insights. Individualized guidance based on one’s own data gives people feedback and insight into their patterns. Tracking anxiety levels and when they occur can help predict anxiety episodes and narrow down potential triggers. Accuracy is critical: one participant described how the app told them they had met their daily goal when they had not, and such inaccuracy erodes confidence in the app.
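
As a minimal illustration of that accuracy problem, the sketch below computes daily goal status only from what the user has actually logged. It is a hypothetical example, not the app from the study; the types, function names, and the idea of a daily drink limit are assumptions made for illustration.

```typescript
// Hypothetical sketch only: the interface, names, and daily limit are
// illustrative assumptions, not the app studied.
interface DrinkEntry {
  timestamp: Date;        // when the drink was logged
  standardDrinks: number; // e.g., 1 for one standard drink
}

// Sum only the entries logged on the given calendar day.
function drinksOnDay(entries: DrinkEntry[], day: Date): number {
  return entries
    .filter((e) => e.timestamp.toDateString() === day.toDateString())
    .reduce((total, e) => total + e.standardDrinks, 0);
}

// Report goal status strictly from what was logged, so the app never
// congratulates a user for a goal they have not actually met.
function dailyGoalMet(entries: DrinkEntry[], day: Date, dailyLimit: number): boolean {
  return drinksOnDay(entries, day) <= dailyLimit;
}
```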

4. Make interactions less repetitive. Overly scripted and repetitive bots are not welcome. As in therapy, the alliance between the user and the conversational agent determines whether people return. Novelty and a positive tone make the interaction therapeutic.
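
One simple way to reduce repetition, sketched below with assumed message text and function names, is to rotate among several phrasings of the same check-in and avoid repeating whichever variant was used last. A production chatbot would rely on richer dialogue management, but the principle is the same.

```typescript
// Hypothetical sketch only: variant texts and names are illustrative assumptions.
const checkInVariants = [
  "How did today go with your goal?",
  "Quick check-in: how are you feeling about today?",
  "Did anything make today easier or harder than usual?",
];

let lastVariantIndex = -1;

// Pick a check-in message at random, excluding whichever variant was used last,
// so the bot never opens two consecutive conversations with identical wording.
function nextCheckIn(): string {
  let index: number;
  do {
    index = Math.floor(Math.random() * checkInVariants.length);
  } while (checkInVariants.length > 1 && index === lastVariantIndex);
  lastVariantIndex = index;
  return checkInVariants[index];
}
```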

5. Ensure notifications are customizable, accurate, and timely. Faulty or absent notifications can deter users. If the app is based on changing daily habits, the timing of daily reminders is essential.
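A minimal sketch of that idea follows, assuming a simple user-settings object and local scheduling rather than any particular platform notification API: the reminder fires at the hour and minute the user chooses, not at a time hard-coded by the app.

```typescript
// Hypothetical sketch only: the settings shape and scheduling approach are
// assumptions; a real app would use its platform's notification API.
interface ReminderSettings {
  enabled: boolean;
  hour: number;   // 0-23, in the user's local time
  minute: number; // 0-59
}

// Milliseconds until the next occurrence of the user's chosen reminder time.
function msUntilNextReminder(settings: ReminderSettings, now: Date = new Date()): number {
  const next = new Date(now);
  next.setHours(settings.hour, settings.minute, 0, 0);
  if (next.getTime() <= now.getTime()) {
    next.setDate(next.getDate() + 1); // time already passed today, so schedule tomorrow
  }
  return next.getTime() - now.getTime();
}

// Schedule the reminder only if the user has turned it on.
function scheduleDailyReminder(settings: ReminderSettings, remind: () => void): void {
  if (!settings.enabled) return;
  setTimeout(remind, msUntilNextReminder(settings));
}
```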

6. Prioritize user agency and avoid bottlenecking navigation with an unwelcome bot. Users should be able to navigate to resources on their own rather than be forced to interact with a bot. Participants were frustrated at having to go through the bot to reach basic features, and one described how it felt “strange” to have the bot constantly bothering them while they were working on a task. This echoes Microsoft’s Clippy, which frustrated users with its constant, unsolicited interruptions.

These features will make psychological AI apps and chatbots more effective. Integrating personalized feedback, high-quality dynamic conversations, and a smooth, glitch-free setup will improve both user engagement and enjoyment.

Marlynn Wei

Marlynn Wei, MD, JD is a Harvard and Yale-trained psychiatrist, writer, interdisciplinary artist, and author of the Harvard Medical School Guide to Yoga. Dr. Wei is an expert contributor to Psychology Today and Harvard Health and has published in The Journal of Health Law, Harvard Human Rights Journal, and many other academic journals. Her research focuses on innovation and emerging technology, including empathic design, human-AI collaboration, AI in mental health and neurotechnology, and related legal and ethical issues. She is the creator of Elixir: Digital Immortality and other immersive and interactive performances. She is a graduate of Yale Law School, Yale School of Medicine, and Harvard Medical School's MGH/McLean psychiatry residency. Twitter: @marlynnweimd Website: www.marlynnweimd.com

Ideas In Brief
  • Personalized feedback, high-quality dynamic conversations, and a streamlined setup improve user engagement.
  • People dislike an overly scripted and repetitive AI chatbot that bottlenecks access to other features.
  • Tracking is a feature that engages users and develops an “observer mind,” enhancing awareness and change.
  • New research shows that users are less engaged in AI apps and chatbots that are repetitive, lack personalized advice, and have long or glitchy setup processes.
