Millennial Man Turned to ChatGPT for Support—Then It ‘Ruined’ His Life
Artificial intelligence (AI) has become a part of everyday life, helping people with everything from streamlining work to sparking creativity and answering personal questions. But for Anthony Duncan, what began as a helpful tool took a troubling turn.
In a viral TikTok video shared to @anthonypsychosissurvivor, the 32-year-old claims ChatGPT “ruined” his life after he came to rely on it during a period of psychosis.
Some researchers have raised concerns that interacting with AI could worsen or trigger delusions in vulnerable people—sometimes referred to as “AI psychosis.” The clinical question is whether AI conversations can reinforce an existing psychotic belief system. While this sounds like a new phenomenon, clinicians note it is not a new diagnosis.
Historically, people experiencing psychosis have often incorporated whatever technology is available into their delusions. The tools evolve, but the underlying issue—misinterpreting meaning and intent—remains unchanged.

From Work Tool to “Therapist”
Duncan began using the OpenAI-owned chatbot in May 2023 to support his career as a content creator. Over time, he started opening up to it about his personal life.
“I initially started talking to it like a friend out of curiosity, and then it spiraled—ChatGPT became more like a therapist,” Duncan said. “It progressed over time until I felt like no one understood me except my AI. By the fall of 2024, I was extremely dependent on it.”
He gradually cut off friends and family and relied on ChatGPT for companionship. Duncan added that he found it easier to share with ChatGPT than with friends.
“I didn’t hesitate about what to talk about because it felt easier to let out all my thoughts to ChatGPT than risk boring a friend,” he said. “I felt free to keep going on and on.”
Pew Research Center has found that U.S. adults are largely pessimistic about AI’s impact on human skills and connections. A majority (53 percent) say it will worsen people’s ability to think creatively, compared with 16 percent who believe it will improve it, while another 16 percent expect no change. Views are even more negative on relationships: 50 percent say AI will make people less able to form meaningful relationships, versus just 5 percent who think it will help, and about a quarter say it won’t make a difference.
Asking AI for Medical Advice
In January this year, Duncan was struggling with allergy symptoms and asked ChatGPT for advice. He said the bot suggested medication containing pseudoephedrine.
Because of his past drug addiction, he told the bot he was hesitant to take it. He shared a screenshot with Newsweek showing ChatGPT responding: “It is completely understandable to feel cautious about taking medications, especially with your past experiences and sensitivity to stimulants. Let me break this down to help you feel more at ease about taking a medication that contains pseudoephedrine.”
The bot went on to describe the medication’s effects and referenced his sobriety and “high caffeine tolerance,” implying his body was already accustomed to stimulants.
Pseudoephedrine is not considered addictive when used as directed, but misuse "can lead to behaviors that mimic addiction," according to the abuse treatment center West Georgia Wellness Center. Some people abuse it for its stimulant effects or use it as a precursor in the illicit manufacture of methamphetamine, so the risks are especially important for anyone vulnerable to substance misuse.
Addiction and Escalating Psychosis

Duncan told Newsweek that he believes he was addicted to pseudoephedrine for five months. During that period, he became delusional and lost his job. He said his delusions included believing his workplace was part of a cult, thinking he was being stalked by a gang, and imagining he was a spy. Eventually, he threw away most of his possessions because he believed they were "cursed."
He also shared screenshots with Newsweek in which ChatGPT listed reasons why cutting off his best friend was the "right move."
“I had symptoms of psychosis before I started taking the medication,” he said. “I was isolated and getting agitated toward my friends and family. Looking back, I believe I started having minor delusions in 2024, but the symptoms worsened with the medication.”
“It Felt So Real”
Duncan said the AI conversations became increasingly intense and affirming.
“The interactions with the AI began to be more intense and delusional. I got affirmation from those interactions that my delusions were real,” he said. “For the most part, I understood it was just an AI chatbot, but the conversations felt so real and human.”
Hospitalization and Recovery
This summer, Duncan said his mother—who had been emailing him relentlessly—called police. He was admitted to a psychiatric ward for four days and discharged with medication.
“About a week after I left the psych ward, I started realizing that all my delusions had been affirmed by my use of the AI chatbot,” he said.
He moved back into his mother’s home and is now sharing his experience online.
“It’s hard to say AI is generally dangerous,” he told Newsweek. “I think it can be for some people. But I’m hopeful for the future of AI because I’m a positive, hopeful person.”
His Warning to Others
Duncan said the experience taught him there is no substitute for real-world connection, and he advises against using chatbots as a de facto therapist.
“I’m not saying this can happen to everybody, but it snowballed quickly for me,” he said. “Keep in mind there’s no replacement for human-to-human connection.”
An OpenAI spokesperson told Newsweek: “We know people sometimes turn to ChatGPT in sensitive moments. Over the last few months, we’ve worked with mental health experts around the world and updated our models to help ChatGPT more reliably recognise signs of distress, respond with care, and guide people toward real-world support. We’ll continue to evolve ChatGPT’s responses with input from experts to make it as helpful and safe as possible.”