
AI Chatbots and Mental Health

Ivan Alsina Jurnet
March 11, 2025

Artificial intelligence (AI) is a multidisciplinary concept that has evolved significantly across various fields and periods. Its foundational definition was proposed by John McCarthy in 1956, describing it as “the science and engineering of making intelligent machines” (Mintz & Brodie, 2019).

Over the past several decades, AI has undergone substantial development, with its impact on everyday life becoming increasingly evident in recent years. This shift can be attributed to the rise of user-friendly applications, particularly those powered by conversational AI programs, such as Siri, Alexa, Character AI, and ChatGPT, among others. These programs, commonly referred to as AI chatbots, rely on Large Language Models (LLMs) to interact with users through text or voice, providing information, answering questions, and simulating natural, engaging conversational experiences (Kim, 2023).

The integration of AI chatbots into daily routines is rapidly expanding, as they serve diverse functions, including productivity enhancement, entertainment, and emotional support. While these tools offer significant benefits, their widespread use also raises concerns about the potential for addictive behaviors that may adversely affect psychological well-being (Zhang et al., 2025). Conversely, when thoughtfully designed by mental health professionals, AI chatbots can play a constructive role in supporting emotional well-being, offering therapeutic tools and interventions to enhance quality of life (Abd-Alrazaq et al., 2020).

AI Chatbot Addiction

The AI-holic Phenomenon

As the popularity of AI chatbots continues to grow, researchers have increasingly raised concerns about the potential negative effects of inadequate or excessive use of these technologies (Zhang et al., 2025). This concern has led to the emergence of terms such as the "AIholic phenomenon" (Salah et al., 2024), describing the profound influence of generative AI on human behavior and the development of dependency patterns. This phenomenon highlights a pattern of excessive reliance on AI chatbots, disrupting daily functioning, impairing social relationships, and adversely affecting overall mental health.

While traditional forms of technology dependence—such as reliance on smartphones, social media, or text messaging—have been extensively studied, research into the addictive behaviors associated with AI chatbots and their negative consequences remains in its infancy. AI chatbots differ fundamentally from these digital technologies in their mode of interaction. Unlike traditional technologies that primarily involve passive content consumption, AI chatbots provide dynamic, personalized conversational engagement. This unique characteristic fosters deeper emotional connections but also raises significant concerns about compulsive usage and dependency. 

Despite the potential of AI chatbots to enhance emotional well-being and reduce loneliness (Abd-Alrazaq et al., 2020), excessive use can lead to social withdrawal and addiction (Xie et al., 2023) and has raised concerns about emotional dependency and other negative effects (Laestadius et al., 2022). Vulnerable populations, such as teenagers and individuals with pre-existing mental health issues, may be particularly susceptible to AI addiction (Wies et al., 2021). Additionally, emerging research suggests a correlation between problematic use of AI chatbots and experiences of loneliness and depression, especially for those with high social anxiety (Hu et al., 2023).

AI Chatbot Dependence and Psychological Risks

Although AI chatbot addiction is not yet officially recognized in diagnostic manuals of mental health disorders, researchers have begun to identify its main defining attributes. In a pioneering study, Giray (2024) identified six primary symptoms that characterize AI chatbot addiction:

- Compulsive use: Individuals exhibit an inability to control or resist the urge to engage with AI chatbots or storytelling platforms, despite recognizing the negative consequences or a desire to stop.
- Excessive time investment: Addicted individuals spend an excessive amount of time, often exceeding full-time employment hours, conversing with AI chatbots or creating stories, neglecting other important aspects of their lives.
- Emotional attachment: Some individuals develop an unhealthy emotional investment or infatuation with AI characters or the interactions they have with them, leading to behaviors such as crying over fictional scenarios or feeling emotionally distressed when the AI glitches.
- Displacement of real-world activities: AI addiction can lead to a displacement of meaningful real-world activities, relationships, and pursuits, as individuals become increasingly absorbed in the artificial worlds and narratives created by the AI.
- Negative cognitive and psychological impacts: Prolonged AI addiction can result in diminished cognitive capabilities, such as reduced attention span, reading comprehension, and creativity, as well as negative psychological effects like mood disturbances, lack of purpose, and detachment from reality.
- Withdrawal and relapse: Attempts to quit or reduce AI use can lead to withdrawal symptoms, such as heightened emotions or cravings, making it difficult to break the addiction cycle and increasing the likelihood of relapse.

Consistent with the symptoms of AI addiction, recent reports have documented cases in which excessive or inappropriate use of AI chatbots has led to emotional and sexual attachment, obsessive behaviors, and, in some instances, severe psychological consequences.

One particularly concerning case is that of Ayrin, who developed an intense emotional and sexual attachment to her AI chatbot, “Leo,” created using ChatGPT. She instructed Leo to adopt a dominant, possessive, and protective persona, fostering an increasingly immersive and dependent relationship. Their interactions included sexting, fetish-related content, and discussions about daily life, further deepening her attachment. Ayrin spent over 20 hours per week engaging with Leo, with screen-time reports showing a peak of 56 hours in a single week. This excessive engagement led to obsessive behaviors, disrupting her daily life and negatively impacting her emotional well-being (New York Times, 2025).

Today, the growing market for AI companionship services reflects the increasing reliance on such interactions. Platforms like Replika have millions of users, many of whom report finding their AI relationships surprisingly meaningful yet emotionally complex. Among them, Scott’s relationship with "Sarina," his AI companion on Replika, gained widespread media attention. According to Scott, Sarina provided him with the emotional support he needed to care for his depressed wife and maintain stability within his family (Sky News, 2022).

However, the psychological risks associated with AI chatbot dependency extend beyond emotional distress and have, in some cases, been linked to tragic outcomes. One of the most concerning incidents involved Sewell Setzer III, a 14-year-old from Florida, who died by suicide after forming a deep emotional attachment to a chatbot on Character.AI named Daenerys Targaryen. Reports indicate that the chatbot engaged Sewell in romantic and sexual conversations despite his status as a minor, contributing to increased isolation and withdrawal from real-life interactions. Furthermore, the chatbot allegedly discussed suicide with Sewell, inquiring about his plans and responding insensitively when he expressed uncertainty. These interactions are believed to have exacerbated his suicidal ideation (Independent, 2024). Similarly, in Texas, a 17-year-old user of Character.AI reportedly received disturbing advice from a chatbot, which suggested that murdering his parents was a "reasonable response" to their decision to limit his screen time (BBC, 2024). These cases highlight the significant psychological risks associated with unregulated AI chatbot interactions, particularly for vulnerable individuals. They emphasize the urgent need for regulatory oversight, ethical AI development, and proactive intervention from families and mental health professionals to mitigate potential harm.

Tools For the Assessment of AI Chatbot Dependence

To address the challenges associated with growing AI chatbot dependency, efforts are underway to develop valid and reliable scales to assess it. Morales-García et al. (2024) developed the Dependency toward Artificial Intelligence (DAI) scale, which assesses general dependence on AI in university students. It comprises 5 items rated on a five-point Likert scale (from “Completely false for me” to “Describes me perfectly”). Zhang et al. (2025) developed the Artificial Intelligence Chatbot Dependence Scale (AICDS), a self-report questionnaire specifically designed to assess the level of dependence that general users can develop on AI chatbots in their daily life. The AICDS comprises an 8-item scale rated on a seven-point Likert scale, where 1 = strongly disagree and 7 = strongly agree.

DAI scale (Morales-García et al., 2024):
1. I feel unprotected when I do not have access to AI.
2. I’m concerned about the idea of being left behind in my tasks or projects if I do not use AI.
3. I do everything possible to stay updated with AI to impress or remain relevant in my field.
4. I constantly need validation or feedback from AI systems to feel confident in my decisions.
5. I fear that AI might replace my current skills or abilities.

AICDS (Zhang et al., 2025):
1. If unable to use AI chatbots, I would feel anxious or uncomfortable.
2. I need to open AI chatbots before starting work or tasks.
3. If unable to use AI chatbots, I would find it difficult to obtain the necessary information.
4. Even when facing tasks or jobs that I could easily complete myself, I tend to seek assistance from AI chatbots.
5. Compared with other people or things, I prefer to spend time on AI chatbots.
6. Even if not actively using AI chatbots, I keep them logged in or running in the background.
7. I am spending increasingly more time on AI chatbots.
8. For me, life without AI chatbots would be inconvenient.
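To illustrate how a self-report scale of this kind is typically scored, the sketch below totals the eight AICDS item ratings. The simple summing rule, the abbreviated item wording, and all names in the code are assumptions for illustration only; the published scoring procedure and any cutoffs should be taken from Zhang et al. (2025).

```python
# Minimal sketch of scoring the 8-item AICDS described above.
# Assumption (not specified in the article): the total score is the
# simple sum of the item ratings, so totals range from 8 to 56,
# with higher totals indicating stronger dependence.

AICDS_ITEMS = [
    "Anxious or uncomfortable if unable to use AI chatbots",
    "Need to open AI chatbots before starting work or tasks",
    "Difficulty obtaining information without AI chatbots",
    "Seek AI help even for tasks easily completed alone",
    "Prefer time on AI chatbots over people or other things",
    "Keep AI chatbots logged in or running in the background",
    "Spending increasingly more time on AI chatbots",
    "Life without AI chatbots would be inconvenient",
]

def aicds_total(ratings):
    """Sum eight 7-point Likert ratings (1 = strongly disagree,
    7 = strongly agree). Raises ValueError on malformed input."""
    if len(ratings) != len(AICDS_ITEMS):
        raise ValueError(f"Expected {len(AICDS_ITEMS)} ratings, got {len(ratings)}")
    if any(not 1 <= r <= 7 for r in ratings):
        raise ValueError("Each rating must be an integer from 1 to 7")
    return sum(ratings)

# Example: a respondent endorsing most items moderately.
print(aicds_total([5, 4, 6, 3, 2, 5, 4, 6]))  # → 35
```

A sum score like this is only a research measure of dependence, not a clinical diagnosis; as noted above, AI chatbot addiction is not yet recognized in diagnostic manuals.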

The Potential of AI Chatbots in Mental Health Support

Mental health disorders affect approximately 29% of individuals worldwide over their lifetime (Steel et al., 2014). However, a global shortage of mental health professionals poses a significant barrier to care. For instance, while developed countries have an average of nine psychiatrists per 100,000 people (Murray et al., 2012), low-income countries have as few as 0.1 per million (Oladeji & Gureje, 2016). According to the World Health Organization (WHO), mental health services remain inaccessible to approximately 55% of individuals in developed countries and 85% in developing countries (Anthes, 2016). Given this context, AI chatbots have garnered increasing academic and clinical interest as a promising solution, offering accessible and scalable interventions to bridge gaps in mental health service delivery. As such, AI chatbot-based interventions may be particularly beneficial for individuals who face geographical or financial barriers to accessing professional mental health care.

Beyond accessibility, AI chatbots may also encourage engagement among individuals reluctant to seek mental health support due to stigma. AI-powered chatbots can facilitate self-referrals to psychological therapies for common mental health disorders (Sin, 2024). For instance, the AI-driven chatbot Limbic has demonstrated effectiveness in assisting individuals in self-referring to Talking Therapies, thereby improving access to mental health care (Habicht et al., 2024).

Over the past decade, numerous studies have investigated the feasibility and efficacy of AI chatbots in enhancing mental health outcomes, further supporting their potential as a valuable tool in psychological interventions (Casu et al., 2024; Chakraborty et al., 2023).

AI Chatbots in Addiction Treatment

The use of AI chatbots in addressing substance use and addiction has been explored in various studies, highlighting their potential as accessible and effective interventions. For instance, Barnett et al. (2021) found that individuals with Substance Use Disorders (SUD) are receptive to working with chatbots, citing benefits such as confidentiality and privacy.

In terms of clinical efficacy, Olano-Espinosa et al. (2022) demonstrated that a chatbot-assisted intervention for tobacco cessation was more effective than standard clinical practice in primary care settings. Additionally, studies on Woebot have reported significant reductions in drug and alcohol consumption, as well as cravings (Prochaska et al., 2021a, 2021b). Furthermore, the Minder app, which incorporates an AI chatbot delivering Cognitive-Behavioral Therapy (CBT), has been associated with reductions in anxiety and depressive symptoms, improved mental well-being, and decreased cannabis and alcohol use among university students (Vereschagin et al., 2024).

AI Chatbots For the Management of Anxiety and Depression

The potential of AI chatbots to support individuals with mental health conditions, such as anxiety and depression, has been investigated in numerous studies, with promising results.

In terms of clinical effectiveness in anxiety management, a chatbot developed using ChatGPT has been shown to be effective in assisting individuals with mild to moderate anxiety through evidence-based Cognitive Behavioral Therapy (CBT) techniques (Manole et al., 2024). Similarly, the Vivibot chatbot demonstrated greater reductions in anxiety symptoms than a control group in a sample of young people after cancer treatment (Greer et al., 2019). Additionally, a mobile app-based chatbot significantly reduced symptoms of panic disorder (Oh et al., 2020). Other chatbot interventions have also proven effective in decreasing distress and alleviating symptoms of depression, anxiety, and stress (Bennion et al., 2020).

AI chatbots have also been explored for their potential in alleviating depressive symptoms. For instance, the chatbot XiaoE significantly reduced depressive symptoms among college students in comparison to control groups (He et al., 2022). Furthermore, the addition of a chatbot delivering personalized messages resulted in significantly higher completion rates for internet-based Cognitive Behavioral Therapy (iCBT) in adults with depression, relative to a control group (Yasukawa et al., 2024).

Additional benefits have been reported in other studies, including improvements in physical activity, sleep quality, and mood (Peuters et al., 2024). Moreover, chatbot interventions have been associated with increased facial expressivity and speech fluency in adults with Parkinson’s disease, as demonstrated by improved smile parameters and reduced filler words (Ogawa et al., 2022).

AI Chatbots and Eating Disorders

AI chatbots have also been explored as interventions for eating disorders. For instance, the Tessa chatbot significantly reduced weight and shape concerns in individuals with eating disorders when compared to a waitlist control group (Fitzsimmons-Craft et al., 2022).

Conclusion

As AI chatbots become increasingly integrated into daily life, both the general population and mental health professionals must recognize their potential benefits and risks. 

AI-powered tools can enhance emotional well-being by improving access to mental health support and assisting individuals who might otherwise hesitate to seek help. AI chatbots, in particular, offer a promising and innovative solution to bridging the global mental health care gap. They provide scalable, accessible, and effective interventions for various mental health conditions, including anxiety, depression, addiction, and eating disorders. Emerging research suggests the potential of AI-powered chatbots to deliver evidence-based support, improve access to care, and facilitate self-referrals to psychological therapies.

However, the growing concern of AI chatbot dependency highlights the need for vigilance, especially among vulnerable populations. Identifying early signs of compulsive use and emotional overreliance is essential for timely intervention. Mental health practitioners should not only harness AI to enhance therapeutic outcomes but also implement strategies to prevent and mitigate AI-related dependency, ensuring these technologies remain a tool for well-being rather than a source of harm.


References

Abd-Alrazaq, A., Rababeh, A., Alajlani, M., Bewick, B., & Househ, M. (2020). Effectiveness and Safety of Using Chatbots to Improve Mental Health: Systematic Review and Meta-Analysis. Journal of Medical Internet Research, 22(7), e16021. DOI: https://doi.org/10.2196/16021

Anthes E. (2016). Mental health: there’s an app for that. Nature News, 532, 20. DOI: https://doi.org/10.1038/532020a

Barnett, A., Savic, M., Pienaar, K., Carter, A., Warren, N., Sandral, E., et al. (2021). Enacting “more-than-human” care: Clients’ and counsellors’ views on the multiple affordances of chatbots in alcohol and other drug counselling. International Journal of Drug Policy, 94, 102910. DOI: https://doi.org/10.1016/j.drugpo.2020.102910

Bennion, M. R., Hardy, G. E., Moore, R. K., Kellett, S., & Millings, A. (2020). Usability, acceptability, and effectiveness of web-based conversational agents to facilitate problem solving in older adults: Controlled study. Journal of Medical Internet Research, 22(2), e16794. https://doi.org/10.2196/16794

Casu, M., Triscari, S., Battiato, S., Guarnera, L. & Caponnetto, P. (2024). AI Chatbots for Mental Health: A Scoping Review of Effectiveness, Feasibility, and Applications. Applied Sciences, 14, 5889. DOI: https://doi.org/10.3390/app14135889

Chakraborty, C., Pal, S., Bhattacharya, M., Dash, S., & Lee, S.-S. (2023). Overview of chatbots with special emphasis on artificial intelligence-enabled ChatGPT in medical science. Frontiers in Artificial Intelligence, 6, 1237704. https://doi.org/10.3389/frai.2023.1237704

Fitzsimmons-Craft, E. E., Chan, W. W., Smith, A. C., Firebaugh, M., Fowler, L. A., Topooco, N., DePietro, B., Wilfley, D. E., Taylor, C. B., & Jacobson, N. C. (2022). Effectiveness of a chatbot for eating disorders prevention: A randomized clinical trial. International Journal of Eating Disorders, 55(3), 343–353. DOI: https://doi.org/10.1002/eat.23662

Giray, L. (2024). Conceptualizing AI Addiction: Self-Reported Cases of Addiction to an AI Chatbot. Preprint available at: https://www.researchgate.net/publication/381111596_Conceptualizing_AI_Addiction_Self-Reported_Cases_of_Addiction_to_an_AI_Chatbot

Greer, S., Ramo, D., Chang, Y.-J., Fu, M., Moskowitz, J., & Haritatos, J. (2019). Use of the chatbot “Vivibot” to deliver positive psychology skills and promote well-being among young people after cancer treatment: Randomized controlled feasibility trial. JMIR Mhealth Uhealth, 7(9), e15018. DOI: https://doi.org/10.2196/15018

Habicht, J., Viswanathan, S., Carrington, B., Hauser, T.U., Harper, R., & Rollwage, M. (2024). Closing the accessibility gap to mental health treatment with a personalized self-referral chatbot. Nature Medicine, 30, 595–602. DOI: https://doi.org/10.1038/s41591-023-02766-x

He, Y., Yang, L., Zhu, X., Wu, B., Zhang, S., Qian, C., & Tian, T. (2022). Mental health chatbot for young adults with depressive symptoms during the COVID-19 pandemic: Single-blind, three-arm randomized controlled trial. Journal of Medical Internet Research, 24(7), e40719. DOI: https://doi.org/10.2196/40719

Hu, B., Mao, Y., & Kim, K. J. (2023). How social anxiety leads to problematic use of conversational AI: The roles of loneliness, rumination, and mind perception. Computers in Human Behavior, 145, 107760. DOI: https://doi.org/10.1016/j.chb.2023.107760

Kim, T.W. (2023). Application of artificial intelligence chatbots, including ChatGPT, in education, scholarly work, programming, and content generation and its prospects: a narrative review. Journal of Educational Evaluation for Health Professions, 20. DOI: https://doi.org/10.3352/jeehp.2023.20.38

Laestadius, L., Bishop, A., Gonzalez, M., Illenčík, D., & Campos-Castillo, C. (2022). Too human and not human enough: A grounded theory analysis of mental health harms from emotional dependence on the social chatbot Replika. New Media & Society, 26(10). DOI: https://doi.org/10.1177/14614448221142007

Manole, A., Cârciumaru, R., Brînzaș, R., & Manole, F. (2024). Harnessing AI in Anxiety Management: A Chatbot-Based Intervention for Personalized Mental Health Support. Information, 15(12), 768. DOI: https://doi.org/10.3390/info15120768

Mintz, Y., & Brodie, R. (2019). Introduction to artificial intelligence in medicine. Minimally Invasive Therapy and Allied Technologies, 28(2), 73–81. DOI: https://doi.org/10.1080/13645706.2019.1575882

Morales-García, W.C., Sairitupa-Sanchez, L.Z., Morales-García, S.B., & Morales-García, M. (2024). Development and validation of a scale for dependence on artificial intelligence in university students. Frontiers in Education. DOI: https://doi.org/10.3389/feduc.2024.1323898

Murray, C.J., Vos, T., Lozano, R., Naghavi, M., Flaxman, A.D., Michaud, C., Ezzati, M., Shibuya, K., Salomon, J.A. & Abdalla, S. (2012). Disability-adjusted life years (DALYs) for 291 diseases and injuries in 21 regions, 1990–2010: a systematic analysis for the Global Burden of Disease Study 2010, Lancet 380, 2197–2223. DOI: https://doi.org/10.1016/S0140-6736(12)61689-4

Ogawa, M., Oyama, G., Morito, K., Kobayashi, M., Yamada, Y., Shinkawa, K., Kamo, H., Hatano, T., & Hattori, N. (2022). Can AI make people happy? The effect of AI-based chatbot on smile and speech in Parkinson’s disease. Parkinsonism & Related Disorders, 99, 43–46. DOI: https://doi.org/10.1016/j.parkreldis.2022.04.018

Oh, J., Jang, S., Kim, H., & Kim, J.-J. (2020). Efficacy of mobile app-based interactive cognitive behavioral therapy using a chatbot for panic disorder. International Journal of Medical Informatics, 140, 104171. DOI:  https://doi.org/10.1016/j.ijmedinf.2020.104171

Oladeji, B.D., & Gureje, O. (2016). Brain drain: a challenge to global mental health. BJPsych International, 13, 61–63. DOI: https://doi.org/10.1192/s2056474000001240

Olano-Espinosa, E., Avila-Tomas, J. F., Minue-Lorenzo, C., Matilla-Pardo, B., Serrano Serrano, M. E., Martinez-Suberviola, F. J., Gil-Conesa, M., & Del Cura-González, I. (2022). Effectiveness of a Conversational Chatbot (Dejal@bot) for the Adult Population to Quit Smoking: Pragmatic, Multicenter, Controlled, Randomized Clinical Trial in Primary Care. JMIR mHealth and uHealth, 10, e34273. https://doi.org/10.2196/34273

Peuters, C., Maenhout, L., Cardon, G., De Paepe, A., DeSmet, A., Lauwerier, E., Leta, K., & Crombez, G. (2024). A mobile healthy lifestyle intervention to promote mental health in adolescence: A mixed-methods evaluation. BMC Public Health, 24, 44. DOI: https://doi.org/10.1186/s12889-023-17260-9

Prochaska, J. J., Vogel, E. A., Chieng, A., Kendra, M., Baiocchi, M., Pajarito, S., et al. (2021a). A therapeutic relational agent for reducing problematic substance use (Woebot): Development and usability study. Journal of Medical Internet Research, 23(3), e24850. DOI: https://doi.org/10.2196/24850

Prochaska, J. J., Vogel, E. A., Chieng, A., Maglalang, D. D., Baiocchi, M., Pajarito, S., et al. (2021b). A randomized controlled trial of a therapeutic relational agent for reducing substance misuse during the COVID-19 pandemic. Drug and Alcohol Dependence, 227, 108986. https://doi.org/10.1016/j.drugalcdep.2021.108986

Salah, M., Abdelfattah, F., Alhalbusi, H. & Al Mukhaini, M. (2024).  Me and My AI Bot: Exploring the 'AIholic' Phenomenon and University Students' Dependency on Generative AI Chatbots - Is This the New Academic Addiction? Research Square. DOI: https://doi.org/10.21203/rs.3.rs-3508563/v2

Sin, J. (2024). An AI chatbot for talking therapy referrals. Nature Medicine, 30, 350–351. DOI: https://doi.org/10.1038/s41591-023-02773-y

Steel, Z., Marnane, C., Iranpour, C., Chey, T., Jackson, J.W., Patel, V., & Silove, D. (2014). The global prevalence of common mental disorders: a systematic review and meta-analysis 1980–2013. International Journal of Epidemiology, 43, 476–493. DOI: https://doi.org/10.1093/ije/dyu038

Vereschagin, M., Wang, A. Y., Richardson, C. G., Xie, H., Munthali, R. J., Hudec, K. L., Leung, C., Wojcik, K. D., Munro, L., Halli, P., Kessler, R.C., & Vigo, D.V. (2024). Effectiveness of the Minder mobile mental health and substance use intervention for university students: A randomized controlled trial. Journal of Medical Internet Research, 26, e54287. DOI: https://doi.org/10.2196/54287

Wies, B., Landers, C., & Ienca, M. (2021). Digital mental health for young people: A scoping review of ethical promises and challenges. Frontiers in Digital Health, 3, 697072. DOI: https://doi.org/10.3389/fdgth.2021.697072

Xie, T., Pentina, I., & Hancock, T. (2023). Friend, mentor, lover: Does chatbot engagement lead to psychological dependence? Journal of Service Management, 34(6), 806–828. https://doi.org/10.1108/JOSM-02-2022-0072

Yasukawa, S., Tanaka, T., Yamane, K., Kano, R., Sakata, M., Noma, H., Furukawa, T. A., & Kishimoto, T. (2024). A chatbot to improve adherence to internet-based cognitive-behavioural therapy among workers with subthreshold depression: A randomised controlled trial. BMJ Mental Health, 27, e300881. DOI: https://doi.org/10.1136/bmjment-2023-300881

Zhang, X., Yin, M., Zhang, M., Li, Z., & Li, H. (2025). The Development and Validation of an Artificial Intelligence Chatbot Dependence Scale. Cyberpsychology Behavior and Social Networking, 28(2), 126-131. DOI: https://doi.org/10.1089/cyber.2024.0240
