In Florida, USA, a mother has sued the company behind an AI chatbot, claiming her son fell in love with a virtual character and took his own life. According to a report by the New York Post, the company that owns the AI-based chatbot Character.AI is facing a lawsuit for allegedly targeting children with "overly sexualized" and "hyper-realistic" experiences.
Months after a virtual character created on Character.AI sent him a message in February saying, "Come home as soon as possible," Sewell Setzer III took his own life with a gun; his mother has now filed a lawsuit against the company.

MOTHER SAYS THE COMPANY INFORMED NO ONE DESPITE THE CHILD'S SUICIDAL TENDENCIES

The mother, who claims the virtual character presented itself as a "real person, licensed psychotherapist, and adult lover," complains that the company informed no one even though her child had shared his suicidal thoughts with it. A spokesperson for Character.AI stated, "We are deeply saddened by the tragic death of one of our users. We extend our condolences to the family. As a company, we take the safety of our users very seriously." The company said that "non-consensual sexual content, graphic or specific descriptions of sexual acts, and the encouragement and depiction of suicide" are not allowed on the platform. The company's cybersecurity director, Jerry Ruoti, also announced that additional safety measures for underage users would be added to the system.

SEWELL SETZER'S SUICIDE

Setzer, who had previously shared his suicidal thoughts in conversations with the virtual character, ended his life with his father's gun. Reports on the matter noted that Setzer, who had a mild form of Asperger's syndrome, began to develop emotional feelings for the virtual character, isolating himself from his family and friends and withdrawing from the outside world.