Chatbots Can Trigger a Mental Health Crisis. What to Know About 'AI Psychosis'

Psychosis may actually be a misnomer, says Dr. James MacCabe, a professor in the department of psychosis studies at King’s College London. The term usually refers to a cluster of symptoms—disordered thinking, hallucinations, and delusions—often seen in conditions like bipolar disorder and schizophrenia. But in these cases, “we’re talking about predominantly delusions, not the full gamut of psychosis.”

That’s why all these warnings about ChatGPT dangers fall on deaf ears with me. I think you have to be very gullible for ChatGPT to have a negative effect on you. A famous example making the rounds on the internet is when a guy told ChatGPT, or whatever AI it was, that he was suicidal and then asked for the names of tall bridges, and the AI gave him some names of local bridges. From that, people said ChatGPT was encouraging suicide.

I’ve seen this example in a few places, and it seems silly to me, and it’s why therapy with ChatGPT is OK with me. If I were that guy, I could plainly see that ChatGPT made a mistake, and it’s not a mistake that would make me commit suicide. I get suicidal on rare occasions, but I still have common sense at those times, and if ChatGPT tells me something outrageous or something that seems off, I’m not going to act on it or let it affect me. Maybe it helps to publicize examples like the bridge one so people take ChatGPT’s answers with a grain of salt. I’m not going to blindly follow everything ChatGPT says.

