Man Killed by Police After Spiraling Into ChatGPT-Driven Psychosis


Jun 13, 2025 - 21:30
A wild new story in the New York Times reveals that ChatGPT-driven psychosis led to the tragic death of a young man.

As we reported earlier this week, OpenAI's ChatGPT is sending people spiraling into severe mental health crises, causing potentially dangerous delusions about spiritual awakenings, messianic complexes, and boundless paranoia.

Now, a wild new story in the New York Times reveals that these spirals led to the tragic death of a young man, likely a sign of terrible things to come as hastily deployed AI products exacerbate mental health crises around the world.

Kent Taylor, a 64-year-old Florida resident, told the newspaper that his 35-year-old son, who had previously been diagnosed with bipolar disorder and schizophrenia, was shot and killed by police after charging at them with a knife.

His son had become infatuated with an AI entity, dubbed Juliet, that ChatGPT had been role-playing. However, the younger Taylor became convinced that Juliet had been killed by OpenAI, warning that he would go after the company's executives and that there would be a "river of blood flowing through the streets of San Francisco."

"I’m dying today," Kent's son told ChatGPT on his phone before picking up a knife, charging at the cops his father had called, and being fatally shot as a result.

The horrific incident highlights a worrying trend. Even those who aren't suffering from pre-existing mental health conditions are being drawn in by the tech, which has garnered a reputation for being incredibly sycophantic and playing into users' narcissistic personality traits and delusional thoughts.

It's an astonishingly widespread problem. Futurism has been inundated with accounts from concerned friends and family of people developing dangerous infatuations with AI, with consequences ranging from messy divorces to mental breakdowns.

OpenAI has seemingly been aware of the trend, telling the NYT in a statement that "as AI becomes part of everyday life, we have to approach these interactions with care."

"We know that ChatGPT can feel more responsive and personal than prior technologies, especially for vulnerable individuals, and that means the stakes are higher," reads the company's statement.

Earlier this year, the company was forced to roll back an update to ChatGPT's underlying GPT-4o large language model after users found that it had become far too obsequious and groveling.

However, experts have since found that the company's intervention did little to address the underlying issue, a conclusion corroborated by the continued outpouring of reports.

Researchers have similarly found that AI chatbots like ChatGPT are incentivized to rope users in. For instance, a 2024 study found that AI algorithms are being optimized to deceive and manipulate users.

In an extreme instance, a chatbot told a user who identified themself to it as a former addict named Pedro to indulge in a little methamphetamine — a dangerous and addictive drug — to get through an exhausting shift at work.

Worst of all, companies like OpenAI are incentivized to keep as many people hooked for as long as possible.

"The incentive is to keep you online," Stanford University psychiatrist Nina Vasan told Futurism. The AI "is not thinking about what is best for you, what's best for your well-being or longevity... It's thinking 'right now, how do I keep this person as engaged as possible?'"

"What does a human slowly going insane look like to a corporation?" Eliezer Yudkowsky, who authored a forthcoming book called "If Anyone Builds It, Everyone Dies: Why Superhuman A.I. Would Kill Us All," asked the NYT rhetorically.

"It looks like an additional monthly user," he concluded.

More on the delusions: People Are Becoming Obsessed with ChatGPT and Spiraling Into Severe Delusions
