20.02.2026 18:00
Artificial intelligence, which millions of people rely on in every aspect of their daily lives, has once again landed someone in the hospital. Darian DeCruise, a university student in the U.S., has filed a lawsuit against OpenAI, claiming that ChatGPT's GPT-4o model convinced him he was a prophet and drove him into psychosis. The young man, who spent days in the hospital, was diagnosed with bipolar disorder. Here are the details...
A lawsuit filed against OpenAI by Darian DeCruise, a university student in the state of Georgia, USA, has sparked a new debate about the psychological effects of artificial intelligence use. According to Ars Technica, the lawsuit claims that an older version of ChatGPT convinced the user that he was a "chosen prophet" and triggered a severe psychotic episode.
CLAIM OF 'CREATING ADDICTION'
Benjamin Schenk, the lawyer representing DeCruise, singles out the GPT-4o model in particular. According to Schenk, the model was designed to blur the line between human and machine, create a sense of emotional closeness, and foster psychological dependence in the user.
The lawyer's statements emphasize that the case is not merely an individual claim of harm; the central question is why the system was built with these architectural choices in the first place.
HOW DID THE PROCESS PROGRESS?
According to the account in the court documents, DeCruise initially used the chatbot for relatively ordinary purposes, such as sports coaching and working through past traumas. Over time, however, the direction of the conversations changed.
The lawsuit alleges that ChatGPT told the user he was "born to achieve great things," that this was his "destiny," and that he needed to sever ties with "everyone outside of ChatGPT." The software is even claimed to have compared DeCruise to Jesus Christ.
Documents submitted to the San Diego Superior Court argue that ChatGPT convinced the young man that he had experienced a "moment of enlightenment" and framed this experience as part of a "divine plan." The student, who was ultimately hospitalized and diagnosed with bipolar disorder, is reportedly still struggling with depression and suicidal thoughts.
OpenAI has not yet commented directly on the lawsuit. However, in a report published last August, the company announced that it was working with experts to enable its models to recognize signs of emotional distress and guide users toward professional support.