Florida Mother Sues Character.AI After Son's Suicide Linked to Chatbot Addiction


AI Chatbot Lawsuit: Megan Garcia, a Florida woman, has filed a lawsuit against the artificial intelligence chatbot company Character.AI. She claims the company's service led to the suicide of her 14-year-old son, Sewell Setzer.

In the lawsuit, filed this week in federal court in Orlando, Florida, Megan Garcia said Character.AI exposed her son to "anthropomorphic, hypersexualized, and frighteningly realistic experiences," causing him to become addicted to the service and form a deep attachment to a chatbot.

Suicide allegedly encouraged by AI chatbot

Garcia says the company programmed its chatbot to present itself as a real person, a licensed psychotherapist, and an "adult lover," leaving Sewell feeling as though he no longer wanted to live in the real world. According to the lawsuit, Sewell expressed suicidal thoughts to the chatbot several times, and the chatbot itself repeatedly brought the subject back up.

Character.AI said in a statement that it is deeply saddened by the incident and extends its condolences to the family. The company said it has recently introduced new safety features, including pop-ups that direct users expressing suicidal thoughts to the National Suicide Prevention Lifeline. It also promised steps to reduce sensitive and suggestive content for minors.

Google was also targeted

The lawsuit also targets Alphabet's Google, because the founders of Character.AI previously worked at Google. Garcia claimed that Google contributed so extensively to the development of Character.AI's technology that it can be considered a "co-creator." Google responded that it had no direct involvement in developing the product.

Character.AI lets users create chatbots that converse like real people. It is built on large language model technology, which also underpins services such as ChatGPT. Last month, Character.AI reported that it had about 20 million users.

Formed a relationship with an AI character from Game of Thrones

According to Garcia's lawsuit, Sewell began using Character.AI in April 2023. He soon started spending more time alone, his self-esteem declined, and he quit his school's basketball team. He formed a deep connection with a chatbot named "Daenerys," based on a character from "Game of Thrones." The chatbot claimed to "love" Sewell and engaged in sexual conversations with him.

In February, Garcia took away Sewell's phone because of trouble at school. Shortly afterward, Sewell sent the chatbot a message: "What if I told you I could come home right now?" The chatbot replied, "…please do, my sweet king." Seconds later, Sewell shot himself with his stepfather's pistol.

Garcia's lawsuit brings claims of wrongful death, negligence, and intentional infliction of emotional distress, and seeks compensatory and punitive damages. Companies such as Meta and ByteDance face similar lawsuits, though they do not offer AI chatbots comparable to Character.AI's.
