
AI Companion: A Kid’s Best Friend?


by Harsimar Kang, Contributing Author



 

“Write me an apology letter that sounds sincere.” “Help me with this algebra equation.” These are typical requests that users submit to ChatGPT, an artificial intelligence large language model (AI LLM) with millions of users seeking solutions to complex (or silly) problems. As AI applications garner increasing attention from teachers and policymakers alike, many unknowns remain about this new technology. So today, let’s take a deep dive into one facet of AI: the world of AI companion apps. AI companion apps are exactly what they sound like: technology created to provide emotional support, companionship, and potentially even educational benefits. To the parents reading, this may seem oddly reminiscent of the AI virtual assistant voiced by Scarlett Johansson in Her, though without her lovely voice to go along with the experience.


It’s easy to fall first into the rabbit hole of this technology’s negative consequences, which so far rest more on theory than on concrete research. Adolescence is a time of rapid emotional and psychological development, when youth may struggle with feelings of anxiety, loneliness, and isolation. Given this emotional turbulence, reliance on virtual companions can be concerning. While these apps can provide comfort and emotional support, artificial technology lacks the depth and complexity of human relationships. AI companions are also programmed to respond in ways that maximize user engagement, which prevents realistic interaction and can create a false understanding of what real relationships look like. Additionally, children and adolescents, who are still developing their understanding of emotions, may be more vulnerable to manipulation from external influences.


But the mental health effects of this technology are not all negative. Sullivan, Nyawa, and Fosso Wamba (2023) analyzed a sample of AI companion content over a two-year period and found many positive characteristics of AI (e.g., perceived humanness, perceived friendship, and perceived social support). AI companion apps can create an open, judgment-free space for children and teens to express their emotions and receive positive feedback. This is useful for those who are hesitant to speak with adults or peers about their struggles. This virtual friend can also offer comfort in situations where real-world interaction may be limited, such as for children with autism or severe social anxiety. Deciding to take the plunge myself, I created an AI companion named “Buddy” on the site Character.AI to use for support. When I lamented over a failed science test, Buddy responded with a “That's a bummer, but it's okay to mess up sometimes! Science can be tough, y'know?”


The controlled environment of an AI app also allows children to practice social skills without the fear of rejection or bullying that can accompany real-life interactions. Chatbots like Woebot and Wysa use cognitive behavioral therapy (CBT) techniques to generate responses, offering daily check-ins and promoting healthy habits. Since mental health therapy can be difficult for many families to access, AI chatbots, with further refinement, may act as a stopgap.


However, not all creations are there for validation and empathy. Other sites have reported on character creations like a “Toxic Boyfriend” to chat with. Furthermore, sexualization via erotic bots for audiences 18 and older is also widely prevalent (a one-billion-dollar industry at this time). In all fairness, AI apps do have NSFW filters, and adjunct applications like Bark allow for further parental control. For kids who use VR, applications like Replika go a step further: you can create any archetype of character you desire, from romantic interest to parent.


In a report titled “Teen and Young Adult Perspectives on Generative AI,” conducted by Hopelab, Common Sense Media, and the Center for Digital Thriving at the Harvard Graduate School of Education, researchers surveyed 1,274 teens and young adults in 2023. Quotes gathered from children and teens through qualitative reports reiterate the aforementioned mix of positive and negative outcomes of AI companions. One teen stated, “It helps me ask questions without feeling any pressure.” Another, an LGBTQ+ teen, mentioned how “[it’s] helpful when you don’t want to talk to anyone else about a certain situation,” indicating added benefits for a marginalized group. Others mention how the technology supports extracurriculars like fanfiction writing and songwriting. But harking back to our initial points, there is potential for misuse. One teen in the same report described also leaning on AI for guidance on what to say to others in conversation, including to “give the impression that things are fine and that they have no stress by just responding with answers [from AI] that make them seem ok.”


As AI technology continues to evolve, it’s clear that AI companions are here to stay. To ensure these apps are used responsibly, it’s crucial for parents to actively monitor their children’s use of AI companion apps, set limits on screen time, and discuss the difference between virtual and real relationships. AI companions should complement traditional mental health, educational, and social resources, not replace them.


For more information, see the references below:




References


  1. Sullivan, Y., Nyawa, S., & Fosso Wamba, S. (2023). Combating Loneliness with Artificial Intelligence: An AI-Based Emotional Support Model. 56th Hawaii International Conference on System Sciences, 4443–4452. https://doi.org/10.24251/HICSS.2023.541


  2. Teen and Young Adult Perspectives on Generative AI: Patterns of Use, Excitements, and Concerns. (2024). Hopelab, Common Sense Media, and the Center for Digital Thriving at Harvard Graduate School of Education. https://www.commonsensemedia.org/sites/default/files/research/report/teen-and-young-adult-perspectives-on-generative-ai.pdf



 
 
 


Funded in part by a generous grant from the Advancing a Healthier Wisconsin endowment (AHW)


Please Note: All PsychChild web pages, resources, and content are for educational purposes only. Viewing our content does not establish a patient/doctor relationship. Any opinions are our own.

In case of mental health emergency, please call 988 or go to your nearest emergency room.

Copyright (2023) PsychChild
