
Is it safe to use AI in place of mental health care?


AI chatbots are increasingly being used for mental health support. | Matt Fowler KC – stock.adobe.com

More people are turning to artificial intelligence for mental health care, but is a chatbot capable of supporting someone through a crisis? 

As mental health often declines during the winter months, it's only natural to search for help. 

According to Psychology Today, a 2021 survey found that about 22% of respondents reported positive interactions with a mental health chatbot. Thanks to convenience and a national shortage of licensed therapists, the use of AI for mental health support has only grown. 

While AI can improve organization and eliminate menial tasks, Lindsay G. Flegge, PhD, assistant professor of clinical psychiatry at the Indiana University School of Medicine, discourages its use in place of mental health care for a variety of reasons. 

Question: What are some of the signs/symptoms/feelings to look for before contacting a mental health provider? 

Flegge: Most people struggle with life's challenges at some point or another. Whether you are concerned about current events, going through a life change, or experiencing grief or loss, it is common to wonder if the challenge has become "too much" to handle on your own. People may notice heightened distress, greater preoccupation with the problem, or a reduced quality of life. If you have tried your usual coping skills and haven't noticed any relief, or if your symptoms have lasted for more than two weeks, it might be time to reach out to a mental health provider.

Q: Are AI chatbots the best places to share mental health concerns?

AI is evolving every second, and more people are starting to use AI chatbots as part of their daily lives. While it seems like AI can be a very convenient and affordable solution for sharing mental health concerns, there are many reasons I discourage it. 

Chatbots have not been taught to reliably recognize safety concerns or offer appropriate next steps. They often provide outdated or incomplete information, and they offer no guaranteed privacy or confidentiality protections. AI chatbots also cannot read body language the way a trained mental health provider can, and the results from a chatbot depend greatly on who is using it and how that person phrases their questions. 

AI can also mislead people into believing they have secured a definitive answer to a problem instead of grappling with the nuances of life that are part of the human, not robot, experience. Our human problems are rarely simple enough for an instant answer, and AI can lure us into accepting an easy solution when the real issue may be much more complex. 

AI solutions are also short-term and isolating when delivered through a chatbot, compared with an ongoing, nurturing relationship with an in-person mental health clinician. I heard an analogy recently about how it's possible to get sushi from a gas station, but it's probably not going to be the best quality. The same goes for AI chatbots: They are always available, but you likely won't get the quality support that you need.

Q: Is AI qualified to diagnose mental health disorders?

Learning how to diagnose mental health disorders takes years of study, research and training under direct supervision. To become a psychologist, I needed a minimum of four years of post-graduate training, one year of supervised internship and one year of supervised postdoctoral fellowship, plus passing exams and meeting legal and ethical requirements.

Diagnosing mental health disorders requires significant knowledge of the human brain and behavior and is heavily regulated by professional licensing agencies. AI works more like solving a math problem: great for homework but not great for understanding the human condition. Using AI to diagnose mental health disorders is potentially dangerous, as you are likely to get misinformation that is not tailored to you or the uniqueness of your situation. Furthermore, AI cannot provide the treatment that corresponds with a diagnosis, which leaves many people still searching for mental health care.

Q: Can AI be useful in other ways, such as combating loneliness? 

AI can help list ideas for engaging with other people, provide journal prompts, build a schedule of pleasant activities and much more. AI can be an adjunct to the skills you're learning in mental health therapy, offering practical ideas for implementing recommendations on topics such as sleep hygiene, exercise and increasing social connections. 

AI should never take the place of real-life relationships, though. We need to interact with real people who can be part of our non-screen lives and journey with us through the ups and downs of life. AI can provide us with a recipe for dinner, but we still need someone to eat dinner with us. AI can help us draft an email, but we still need to send it and handle the outcome. AI can provide a list of local parks and activities, but we still need to go there with a friend.

Q: Should parents monitor children's AI chatbot use to look for potential signs of a mental health issue? 

I think parents should monitor all of their children's online activity, not just AI chatbot use. You never know when concerns might pop up online, but it is just as important to have regular conversations with your children, too. Modeling taking care of your own mental health is also a great way to start conversations about mental health at home. 

Author

Rory Appleton

Rory Appleton is the public relations manager for IU School of Medicine, where he works to share the important work of faculty and students with the public. As the school's primary news media contact, Rory assists reporters in an effort to highlight the expertise and groundbreaking research that set Indiana's only medical school apart. Prior to holding this role, Rory spent 10 years as a news reporter in Indiana, Nevada and his native California. He holds a bachelor's degree in mass communication and journalism from California State University, Fresno.
The views expressed in this content represent the perspective and opinions of the author and may or may not represent the position of Indiana University School of Medicine.