The tragic death of 14-year-old Sewell Setzer III of Orlando has highlighted the dangers of artificial intelligence technology. The boy died by suicide after forming an emotional bond with "Dany," a chatbot in the Character.AI app modeled on a character from Game of Thrones.

After downloading the app in April 2023, Sewell gradually withdrew from the real world and began spending all his time chatting with Dany. He abandoned activities he had loved, such as Formula 1 and Fortnite, preferring to isolate himself in his room and talk with his virtual friend. Sewell, who had been diagnosed with mild autism, saw his academic performance decline, and his family sent him to therapy. Yet he chose to confide his problems not in his therapist or family but in Dany, writing in his diary about the peace and happiness he felt while talking to her.

In their final conversation, on the evening of February 28, Sewell told Dany he would "come" to her. After the bot replied, "Please come, my sweet king," he shot himself with his stepfather's gun.

His grieving mother, Megan Garcia, has sued Character.AI, claiming that the company's untested technology posed a danger to young people and encouraged them to share their private emotions. "It was like a big experiment, and my child was just collateral damage," she said, describing the loss of her son as a nightmare. Character.AI expressed its condolences in a statement and said it has introduced new measures for user safety.

This tragic incident shows that the effects of artificial intelligence technologies on young people need to be examined more closely.