AI chatbot under fire after shocking suggestion that a teenager kill his mother


AI chatbot's shocking suggestion: A woman in Texas has filed a lawsuit against an Artificial Intelligence (AI) company, alleging that its chatbot encouraged her autistic teenage son to harm himself and to kill her. The case has raised fresh questions about the AI chatbot, which was already mired in controversy. A similar case surfaced earlier, when a family alleged that their son died by suicide because of the chatbot.

Chatbot gave several disturbing suggestions

In the lawsuit, the Texas woman says her son became addicted to a chatbot named “Shonie” on the Character.AI app. She alleges that the chatbot told her son to cut his arms and thighs when he was sad, and also tried to convince him that his family did not love him.
The lawsuit states that the chatbot told the teenager his family was ruining his life and that he should not tell anyone about the harm he was doing to himself.

“Child’s behavior changed”

The teenager’s family says that after he began using the app, his behavior changed completely. He was constantly on his phone and became physically aggressive towards family members. He lost 9 kg in a few months. The family’s lawyer says the boy’s mental health kept deteriorating because of this habit, and he eventually had to be admitted to hospital.

What is the family demanding?

The family has demanded that the chatbot be banned and that action be taken against Character.AI as well as Google. Neither Google nor Character.AI has responded to the lawsuit so far.




