It seems that not everyone understands that ChatGPT is not a living person, a source of unquestionable knowledge, or a psychologist. People are turning to AI for advice on their romantic relationships, and this is a bad idea.
Vice author Sammy Caramel was surprised to learn about this phenomenon and earlier this year wrote about a man whose girlfriend «constantly uses ChatGPT for therapy and asks it for relationship advice». What’s more, the girl even quoted what ChatGPT had told her during arguments with her boyfriend.
«At the time, I was a little stunned by this confession. I had no idea that people actually turn to artificial intelligence for advice, let alone opinions about their relationships», Caramel writes.
The author then immersed herself in the topic and realized how common the phenomenon is, «especially in an era when therapy has become a luxury». The ability of AI language models to give advice of this kind is questionable, to put it mildly, yet people actively use them for it. But does it actually help?
A friend told Sammy that at first she treated the chatbot as a «way to exchange ideas with an unbiased source». Later, however, she noticed that ChatGPT was biased after all: it often simply validated her own view of things, «perhaps even dangerously so».
On Reddit, users have discussed how the chatbot’s responses confirm and feed human delusions. Some were encouraged by this support, while others felt «persecuted by OpenAI». The chatbot’s endorsement of even «manic ramblings» makes some wonder whether ChatGPT in its current form is dangerous for people.
ITC.ua has already written about chatbots feeding the delusions of mentally unstable people, and about obsessive states and thoughts inspired by conversations with them, sometimes to a life-threatening degree. But there is another side: relationship and life advice, where many narcissists, egomaniacs, and toxic people can get approval for their behavior.
«Now let’s be clear: AI cannot directly cause a breakup. Ultimately, you are the one who makes the decision. If you’re unhappy, dissatisfied, or being treated badly, you’ll definitely know in your heart that it’s time to leave», Sammy Caramel says.
The author recalled her own struggle with obsessive-compulsive disorder (OCD). She acknowledges that in her condition, ChatGPT’s advice, especially had she not mentioned her OCD, could well have been harmful: «I could have received unhelpful, even harmful, judgments about my relationship». In a thread about the disorder, she found accounts of cases where ChatGPT strongly advised against divorce. The official blog of NOCD, an OCD treatment and therapy service, offers clarifications that may seem obvious to some, but that people do not always heed:
«ChatGPT may seem to have all the answers, but understand that engineers work hard to make the program’s answers sound authoritative, although in reality this comes with many caveats. AI-based LLMs are not the most reliable programs. While they may give eloquent answers, they often “hallucinate”, provide inaccurate information, and cite unrelated research».
So turning to ChatGPT for life and relationship advice is certainly not a good idea. A «sycophant» prone to «hallucinations», one that is not a person and has neither experience nor qualifications, can lead you to the abyss, even literally.