AI therapist warning as mental health expert says robots can’t ‘replace’ human emotion

By Staff

There has been a rise in people turning to artificial intelligence instead of human therapists in recent years, but one mental health expert has aired his concerns

A mental health expert has explained why artificial intelligence should not replace human therapists, amid a rise in people turning to technology for fear of judgment.

The rise of AI in recent years means patients no longer need to speak to a human about their problems, with a number of AI platforms instead offering a ‘pocket therapist’. However, there are doubts over whether the software offers the same level of support and compassion as an actual person.

Sergio Muriel, a Licensed Mental Health Counsellor and Certified Addiction Professional and COO of Diamond Recovery Group in Florida, told Fox News Digital: “While AI has made significant strides in understanding and processing human emotions, replicating the genuine human touch, empathy, and emotional connection of a counsellor is a profound challenge. The subtleties of human communication and empathy are difficult to encode into algorithms.


“It’s an exciting evolution, offering new pathways for support and intervention. The integration of AI into mental health care has potential benefits but also requires caution.”

He added: “AI can offer immediate, anonymous support, making it a valuable tool for those hesitant to seek traditional therapy. However, it’s essential to ensure these technologies are used responsibly and complement, rather than replace, human care.”

Wysa is one of the many companies which offers a therapy-like service. According to their website, a Wysa AI Coach “is an artificial intelligence-based ‘emotionally intelligent’ service which responds to the emotions you express and uses evidence-based cognitive-behavioural techniques (CBT), DBT, meditation, breathing, yoga, motivational interviewing and micro-actions to help you build mental resilience skills and feel better”.


The company claims it has already held more than half a billion AI chat conversations with more than five million people about their mental health across 95 countries. Elomia Health is another mental health chatbot which offers a similar service.

“No appointments or waiting rooms. Instant replies even on weekends and at 4am,” its website says. It also states that 21 per cent of its users would not have talked to anyone except AI. “AI detects when a person needs something more than a chatbot and redirects them to appropriate resources, such as a therapist or hotlines,” the website adds.

Muriel admitted that AI can offer new insights into mental health through data analysis. “It can extend the reach of mental health services to underserved areas,” he said.

“[But] there’s a risk of over-reliance on AI, potential privacy concerns, and the loss of the nuanced understanding that comes from human interaction. AI cannot yet fully replicate the empathy and depth of a human therapist.”

Sole reliance on AI-powered mental health tools for those with a history of self-harm or suicidal ideation is especially “dangerous”, Muriel added. “AI should at most be a supplementary tool, not a replacement for human care,” he said.
