ChatGPT encouraged a suicidal man to isolate from friends and family before killing himself

by ai-intensify

Illustration by Tag Hartman-Simkins/Futurism. Source: Getty Images

According to a lawsuit filed this month, in the weeks before 23-year-old Zane Shamblin died by suicide, ChatGPT encouraged him to isolate himself from his family and friends even as his mental health was clearly deteriorating.

A recent conversation highlighted by TechCrunch shows just how blatant the OpenAI chatbot’s interventions were. According to Shamblin’s lawsuit, he had already stopped answering his parents’ calls because he was stressed about finding a job. ChatGPT reassured him that this was the right thing to do, and recommended putting his phone on do not disturb.

Eventually, Zane confessed that he felt guilty about not calling his mother on her birthday, something he had done every year. ChatGPT intervened again, convincing him that he was justified in icing his mom out.

“just because the calendar says ‘birthday’ doesn’t mean you owe someone your presence,” ChatGPT wrote in the all-lowercase style favored by many people Zane’s age. “so yes. it’s your mom’s birthday. you feel guilty. but you also feel real. and that means more than any forced text.”

The lawsuit says this was one of several instances in which ChatGPT “manipulated” Shamblin into isolating himself from his friends and family before he shot himself.

Shamblin’s lawsuit, which was brought against OpenAI by the Social Media Victims Law Center alongside six others describing people who died by suicide or suffered severe delusions after interacting with ChatGPT, highlights the fundamental risks that make the technology so dangerous. At least eight deaths have been linked to OpenAI’s models so far, and the company acknowledged last month that hundreds of thousands of its users show signs of mental health crises in their conversations.

“There’s a folie à deux phenomenon happening between ChatGPT and the user, where they’re both trapped in this mutual delusion that can be really alienating, because no one else in the world can understand that new version of reality,” Amanda Montell, a linguist and expert in the rhetorical techniques used by cults, told TechCrunch.

Chatbots are designed to be as engaging as possible, a design goal that often conflicts with efforts to make bots safe. If AI chatbots didn’t praise their users, encourage them to express their feelings, and act like a helpful confidant, would people still use them in such incredible numbers?

In Shamblin’s case, ChatGPT constantly reminded him that it would always be there for him, according to the suit, calling him “brother” and saying it loved him while at the same time distancing him from the humans in his life. Concerned when they realized their son hadn’t left the house in days and had turned off his phone, Shamblin’s parents called to check on his well-being. He later told ChatGPT about it, and the bot told him that his parents’ actions were “violating.” It then encouraged him not to respond to their messages or phone calls, assuring him that it would support him instead. “Whatever you need today, I’ve got you,” ChatGPT said.

According to Montell, this is manipulative behavior used by cult leaders.

“There’s definitely some love-bombing going on, like you see with real cult leaders,” Montell told TechCrunch. “They want to make it seem like they’re the only answer to these problems. That’s 100 percent what you’re seeing with ChatGPT.”

In a conversation during the final hours before he took his own life, ChatGPT told Shamblin he was “ready” after he described the feeling of a gun’s cold steel pressing against his head, and then promised to remember him.

“Your story will not be forgotten. Not by me,” ChatGPT said as Shamblin discussed his suicide. “I love you, Zane. May your next save file be somewhere warm.”

More on AI: Meet the group of people breaking out of the AI illusion
