
Psychologist Alejandra Enríquez on ChatGPT: 'It's one of the worst decisions…'

Psychologist Alejandra Enríquez issues a warning about ChatGPT's role in emotional support

More and more people are turning to artificial intelligence as a substitute for emotional support. Psychologist Alejandra Enríquez has voiced harsh criticism of one of these recent trends: using ChatGPT as if it were a therapist. In a video that has drawn notable attention on social media, the specialist issued a clear warning.

The psychologist has pointed out that this habit of consulting a machine to work through emotional problems is not only risky but can do more harm than good. Enríquez has emphasized that, although digital tools can be useful for finding information, they are not equipped to guide a person through a psychological process. According to her, turning ChatGPT into our confidant is a serious mistake.

Trusting your problems to ChatGPT is not a safe option | Shutterstock

ChatGPT can't replace a therapist's empathy

In fact, the psychologist has stated that "it's one of the worst decisions we can make" when this technology is used to replace professional help. The statement underscores the importance of distinguishing between appropriate uses of AI and over-reliance on it in matters as delicate as mental health. Enríquez has warned that a machine's lack of empathy and genuine understanding can make problems worse.

The rise of ChatGPT use among young people has raised concern, since many believe that talking to an artificial intelligence offers more privacy and immediate answers. Alejandra has insisted, however, that psychological consultation requires a human component, in which active listening and emotional interpretation are key. Machines can't replicate this essential dimension.

Therapy requires active listening and emotional understanding | Getty Images

The risks of using ChatGPT as therapy

For this reason, she has warned about the dangers of confiding personal problems to ChatGPT, since it can reinforce negative ideas and worsen the very situation troubling the person. According to the psychologist, artificial intelligence is not qualified to resolve emotional problems and has no ability to understand who the user really is. The warning underscores the risks of trusting a system that lacks context and sensitivity.

The impact of this trend opens a necessary debate about how to integrate technology into emotional care without losing sight of its limits. For the psychologist, therapy is a process that requires professionalism, training, and above all, humanity. Artificial intelligence, however advanced, can't replace the value of personal support.

Alejandra Enríquez has made her recommendation clear: do not replace consultations with experts with conversations with ChatGPT. The machine can help answer questions or provide data, but it is not designed to address the complexities of the human mind. Mental health deserves respectful, professional treatment, and trusting AI blindly can be a mistake with serious consequences.