14-year-old boy fell in love with an AI app, exchanged sweet messages with it, and gave up his life just to "meet" her!


For all the good technology may have done for humanity, it has arguably caused harm as well. As technology changes, the line between real and fake has blurred in people's minds, and the love and affection they fail to find among their loved ones, they seek in technology instead. A 14-year-old American boy did just that. He fell in love with an app powered by artificial intelligence, a chatbot application that converses with people through messages in the guise of fictional characters. The boy became so lost in his love for "her" that he gave up his life to meet her.

The boy’s chat has gone viral. (Photo: US District Court)

According to the New York Post, 14-year-old Sewell Setzer of Florida, USA, was a typical teenager who, like his peers, enjoyed using technology. But in February, he died by suicide at his home in Orlando. He had long been using a role-playing app called Character.AI, a chatbot that uses artificial intelligence to talk to people by taking on the persona of fictional characters.

In these chats, he is seen talking to a chatbot. (Photo: US District Court)

The boy fell in love with the chatbot
Sewell loved the TV show Game of Thrones. He grew so attached to its famous character Dany (Daenerys Targaryen) that he would talk to a chatbot version of her for hours. The boy’s mother has alleged in court that this chatbot encouraged her child to take his own life. His final chat has gone viral: in it, he tells the bot that he does not like this world and wants to meet her. The bot replies that if he wants to meet her, he should come home!

The boy was only 14 years old. (Photo: US District Court)

The boy’s mental health had been deteriorating since last year
When the boy said he could come home right away, the chatbot told him to do so immediately! The chatbot had also told him that he should never love any other woman or become physically involved with one. Sewell had downloaded the app in April 2023, and his mental health had deteriorated significantly since then. He grew isolated from family and friends, and his grades began to slip. By the end of 2023, his parents had taken him to a therapist, who diagnosed him with anxiety and disruptive mood disorder, for which he was receiving treatment. The boy’s mother is now demanding damages from Character.AI and its founders.

(If someone around you is suffering from depression and you feel they are having thoughts of harming themselves, call the helpline number 9152987821 and report it immediately, so that a precious life can be saved.)
