Character.AI retrains chatbot for teen safety following lawsuits

09:19 AM December 13, 2024

Character.AI, the tech firm behind the globally acclaimed chatbot, announced new teen safety features on Thursday, December 12.

For example, the bot will direct users to the US National Suicide Prevention Lifeline if it detects content referencing self-harm or suicide.

These upgrades come after Character.AI was hit with multiple lawsuits claiming its chatbots are causing teens to harm themselves and others.


What are these new teen safety measures?

The chatbot will direct users to the National Suicide Prevention Lifeline and implement the following features:

  • Parental Controls: These will help parents monitor their teens’ activity on Character.AI, including time spent on the platform and their favorite Characters.
  • Time Spent Notification: The app will tell users when they’ve spent an hour on the platform. Users under 18 will have a more limited ability to modify this feature.
  • Prominent Disclaimers: The program will remind users that it’s not a real person, so they must treat its outputs as fiction.

In 2021, former Google engineers Daniel De Freitas and Noam Shazeer founded Character.AI.

The platform lets users create chatbots with unique personalities, also known as “Characters.” 


B2B media platform Business of Apps reported that Character.AI had 28 million active users in August 2024, making it one of the most popular chatbots worldwide.




The platform can guide people in creating AI chatbots. Along the way, users may develop a passion for artificial intelligence and gain skills highly sought after in the global AI revolution.

What are some issues with Character.AI?


On the other hand, the platform earned infamy as one of the apps that let people create AI girlfriends and boyfriends.


The World Health Organization has declared loneliness a “global public health concern,” and some have turned to this technology as a solution.

Filipinos have also turned to similar apps like Replika to alleviate feelings of loneliness. 


Users can also develop parasocial relationships: one-sided emotional attachments to media characters who do not reciprocate their feelings.

In the 1950s, this phenomenon manifested mainly in fans of pop culture icons like Elvis Presley; later decades saw the same with stars like Michael Jackson.


It was also the subject of Eminem’s song, “Stan,” which revolves around a fan who sends letters to his favorite rapper. 


Nowadays, people can develop stronger parasocial bonds with chatbots like those on Character.AI because the bots respond to users in real time.

Worse, some interactions allegedly encouraged teens to harm themselves and others.

On October 30, CNN reported on Florida mother Megan Garcia, who blames Character.AI for the death of her 14-year-old son, Sewell Setzer III.

Garcia filed a lawsuit claiming that her son’s interactions with the bot triggered suicidal ideation.


BBC reported the most recent case on Tuesday, December 10. According to the UK-based news outlet, two families are suing Character.AI after it allegedly “encouraged a teen to kill their parents.”

TOPICS: AI, app, mental health, Teens


© Copyright 1997-2024 INQUIRER.net | All Rights Reserved
