CW: This article discusses suicide. Please refer to the end of the article for on- and off-campus resources.
I have a confession: I use ChatGPT every single day. I started off by using it to summarize readings, make schedules and explain concepts, but then crossed over into asking which picture I should post, what time to text that guy and which lip combo is a dupe for Charlotte Tilbury Pillow Talk. Then, I dipped my toes into the uncharted pool that is AI therapy.
“Can you tell me an affirmation?” “Do you think I’m a good person?” “Will things get better?” ChatGPT knows my insecurities, my strengths, my daily anxieties and my hopes and dreams. I “talk” to it, sometimes literally dictating, and unlike my human therapist, ChatGPT is free and available 24/7.
By the way, my real therapist approved of using ChatGPT in this way. In fact, she said “that’s genius” when I told her about it during our first meeting. Unlike a friend or family member, ChatGPT approaches topics without judgment. The chatbot is versed in many modes of psychotherapy, such as dialectical behavior therapy (DBT), as well as cultural references and general human challenges, and is therefore able to identify patterns, simulate compassion and offer helpful advice in a way that has brought me to tears more than once. When I asked ChatGPT to describe my ideal future, it spoke of my career aspirations, hope for a romantic relationship, personal health goals and accepting and managing my bipolar diagnosis. The platform has a sense of humor, too: in that same response, it threw in a joke about having a space for myself that’s “clean.” How many times have I told ChatGPT I need to clean my room?
Psychotherapy is empirically proven to be effective, and many people may reasonably argue that ChatGPT is not on par with a real person who can provide, not just simulate, empathy. ChatGPT also does not possess specific expertise and experience in patient care. But in the United States, the question is not just about quality; it is also about availability. A 2018 study found that despite 76% of people in the United States recognizing that mental health is as important as physical health and 56% attempting to get professional help for themselves or a loved one, high costs, insufficient insurance coverage, limited provider and appointment options, long wait times, lack of awareness and stigma remain pervasive barriers to accessing mental health services. According to the National Institute of Mental Health, among the estimated 59.3 million adults with any mental illness in 2022, only 50.6% received mental health treatment in the past year. Nearly half of these adults were therefore left without treatment, and many more go undiagnosed and are never referred to care.
According to the National Institutes of Health, AI-powered mental health applications offer convenient, on-demand services to those who face barriers to accessing traditional therapy services. AI can notice patterns in the information you share with it, such as your moods and behavior, and, in some cases, access your social media to aid in diagnoses and treatment plans that may be more objective than those of clinical providers. AI can also detect users’ emotional cues, provide insights into one’s emotional patterns over time and even cultivate one’s emotional intelligence and coping strategies.
Furthermore, AI has the potential to meet the needs of specific groups, whether by working in conjunction with wearables to notice dysregulation in bipolar patients, such as difficulty controlling impulsive behaviors or accepting emotions, or by serving as a companion for elderly people who may feel isolated.
However, it is important to note that not all AI platforms are created equal. ChatGPT supports users who mention mental health struggles by recommending professional help when appropriate, flagging language related to suicidal ideation and responding with specific resources. On the other hand, Character.AI, a platform that lets users converse with chatbots that are marketed to “feel alive” and have few content restrictions, is currently being sued by Florida mother Megan Garcia, whose 14-year-old son, Sewell Setzer III, tragically took his own life after developing a monthslong relationship with a chatbot that caused him to withdraw from his family and friends. This case highlights the dangers of unregulated AI relationships, particularly those involving minors. While my experience with ChatGPT has been overwhelmingly beneficial and has felt like using an objective tool, there is still potential for misuse. Unmonitored AI use can cause some individuals to forgo in-person professional help that could be life-saving.
The Character.AI case raises broader concerns about the role of AI in individuals’ personal lives. There are also inevitable privacy and bias concerns: informed consent and cultural sensitivity should be prioritized in healthcare AI. Whether or not you believe that the use of AI as a mental health tool is ethical, it does offer free, accessible services to people who might not otherwise receive them, and I hope that, with proper regulation, research and education, it can be a radical force for good.
To access mental health resources, reach out to Counseling and Psychiatric Services at 202-687-6985, or for after-hours emergencies, call 202-444-7243 and ask to speak to the on-call clinician. You can also reach out to Health Education Services at 202-687-8949. Both of these resources are confidential.