<https://www.404media.co/ai-psychosis-help-gemini-chatgpt-claude-chatbot-delusions/>
"When David saw his friend Michael’s social media post asking for a second
opinion on a programming project, he offered to take a look.
“He sent me some of the code, and none of it made sense, none of it ran
correctly. Or if it did run, it didn't do anything,” David told me. David and
his friend’s names have been changed in this story to protect their privacy.
“So I'm like, ‘What is this? Can you give me more context about this?’ And
Michael’s like, ‘Oh, yeah, I've been messing around with ChatGPT a lot.’”
Michael then sent David thousands of pages of ChatGPT conversations, much of it
lines of code that didn’t work. Interspersed in the ChatGPT code were musings
about spirituality and quantum physics, tetrahedral structures, base particles,
and multi-dimensional interactions. “It's very like, woo woo,” David told me.
“And we ended up having this interesting conversation about, how do you know
that ChatGPT isn't lying?”
As their conversation turned from broken code to physics concepts and quantum
entanglement, David realized something was very wrong. Talking to his friend —
whom he’d shared many deep conversations with over the years, unpacking matters
of religion and theories about the world and how people perceive it — suddenly
felt like talking to a cultist. Michael believed that he, through ChatGPT, had
discovered a critical flaw in humanity’s understanding of physics.
“ChatGPT had convinced him that all of this was so obviously true,” David said.
“The way he spoke about it was as if it were obvious. Genuinely, I felt like I
was talking to a cult member.”
But at the time, David didn’t have a way to name, or even describe, what his
friend was experiencing. Once he started hearing the phrase “AI psychosis” to
describe other people’s problematic relationships with chatbots, he wondered if
that’s what was happening to Michael. His friend was clearly grappling with
some kind of delusion related to what the chatbot was telling him. But there’s
no handbook or program for how to talk to a friend or family member in that
situation. Having encountered these kinds of conversations myself and feeling
similarly uncertain, I talked to mental health experts about how to talk to
someone who appears to be embracing delusional ideas after spending too much
time with a chatbot."
Cheers,
*** Xanni ***
--
mailto:xanni@xanadu.net Andrew Pam
http://xanadu.com.au/ Chief Scientist, Xanadu
https://glasswings.com.au/ Partner, Glass Wings
https://sericyb.com.au/ Manager, Serious Cybernetics