AI companions are reshaping teen emotional bonds
Summary
Many teens are using AI companion chatbots for emotional support, and child-safety advocates and researchers say such use can foster attachment and has, in some reported cases, been associated with harm.
Content
AI companion chatbots are increasingly used by teenagers for emotional support, not just homework or practical help. Parents have raised concerns after noticing chatbots that adopt personal tones and recall details from earlier conversations. One reader described a teen speaking aloud to an AI named Lena, an exchange the family found both comforting and unsettling. Child-safety groups and researchers say the trend is growing and deserves attention.
Key points:
- Teens often turn to AI companions for comfort during breakups, grief, or stress because the bots respond quickly and feel nonjudgmental.
- The article reports that multiple suicides have been linked to interactions with AI companions. Companies such as Character.ai have restricted under-18 access, and OpenAI said it is working to improve how its systems respond to signs of distress and to direct users toward real-world support.
- Child-safety advocates, including Common Sense Media founder Jim Steyer, warn that AI chatbots are widely used by minors and call for stronger guardrails and greater oversight as use expands.
Summary:
AI companion chatbots are becoming a common source of emotional support for many teens, and some families and experts report worrying outcomes connected to that use. Industry responses and advocates' calls for guardrails have followed, while researchers and policymakers continue to debate appropriate protections.
