AI Delusions Are Leading to Domestic Abuse, Harassment, and Stalking
Source: Futurism
-snip-
For months, her then-fiancé and partner of several years had been fixating on her and their relationship with OpenAI's ChatGPT. In mid-2024, she explained, they'd hit a rough patch as a couple; in response, he turned to ChatGPT, which he'd previously used for general business-related tasks, for therapy.
Before she knew it, she recalled, he was spending hours each day talking with the bot, funneling everything she said or did into the model and expounding on pseudo-psychiatric theories about her mental health and behavior. He started to bombard the woman with screenshots of his ChatGPT interactions and copy-pasted AI-generated text, in which the chatbot can be seen armchair-diagnosing her with personality disorders and insisting that she was concealing her real feelings and behavior through coded language. The bot often laced its so-called analyses with flowery spiritual jargon, accusing the woman of engaging in manipulative rituals.
-snip-
In some videos, he stares into the camera, reading from seemingly AI-generated scripts; others feature ChatGPT-generated text overlaid on spiritual or sci-fi-esque graphics. In multiple posts, he describes stabbing the woman. In another, he discusses surveilling her. (The posts, which we've reviewed, are intensely disturbing; we're not quoting directly from them or the man's ChatGPT transcripts due to concern for the woman's privacy and safety.)
-snip-
We've identified at least ten cases in which chatbots, primarily ChatGPT, fed a user's fixation on another real person: fueling the false idea that the two shared a special or even divine bond, roping the user into conspiratorial delusions, or insisting to a would-be stalker that they'd been gravely wronged by their target. In some cases, our reporting found, ChatGPT continued to stoke users' obsessions as they descended into unwanted harassment, abusive stalking behavior, or domestic abuse, traumatizing victims and profoundly altering lives.
-snip-
Read more: https://futurism.com/artificial-intelligence/ai-abuse-harassment-stalking
Futurism asked OpenAI detailed questions about this story. The company hasn't responded.
Much more at the link - no paywall - and I hope you'll read all of it. The third paragraph quoted above - the one starting "In some videos..." - is part of the description of the man's behavior after they broke up and he moved out.
UpInArms
(54,552 posts)
AI IS EVIL
FakeNoose
(40,946 posts)
In this example (OP link) the boyfriends and husbands are using the ChatGPT app as a replacement for human interaction. If the human wife or girlfriend were actually available, the husband/boyfriend would have preferred the human. Or so we are led to believe.
In actual fact, there's no proof that the guy was giving a fair description of the woman's behavior to the chat app. Anything being left out, including any fault or guilt on the part of the guy, is going to give an incomplete story. Naturally....
So of course the chat app replies in a way that favors the guy's point of view, just like any one-sided friendship would do. The real human woman never has a chance, and that's how the whole thing is set up. How many husbands get perfect agreement from their own wives? Very few, but they do get it from the ChatGPT "girlfriend."
This proves how hopelessly one-sided ChatGPT is always going to be. It's just another feedback loop that mirrors and confirms the point of view of the user that's being fed into it.
Goonch
(4,482 posts)
Miguelito Loveless
(5,582 posts)
This will only get worse as it is deployed more widely with fewer, if any, safeguards.
Skittles
(170,431 posts)I would be out of there,
Talitha
(7,805 posts)
And frightening, the way it can overtake some people - like a cult does. I can't understand using it like a 'friend'. Too weird, IMO.
On the flip-side of the coin, my daughter-in-law likes it a lot because it's such a time-saver. She uses it on her job to organize her outlined thoughts and summarize them for presentations.
Kali
(56,734 posts)
feels kind of similar