A LEADING mental health expert has warned that the rising use of AI chatbots to treat mental health conditions is unsustainable.
Younger people are among the rising numbers now turning to AI to discuss their mental health struggles.
According to a recent article, 78 million messages have been sent to a bot named ‘Psychologist’ on popular website Character.ai.
And millions of other users have been directed to similar mental health chatbots, such as ‘Therapist’ and ‘Are you feeling OK?’.
Commenting on this surge, Tim Ladd, managing director of Red Umbrella, an organisation that provides Mental Health First Aid training and counselling for businesses, said: “There’s no doubt there is still a lot of stigma attached to mental health, and the concept of seeking professional help. Many feel ashamed, or believe they’ll be judged by others.
“And it is certainly positive that many are beginning to ask for help – just having that conversation, even if not with another human being, is in itself meaningful.
“AI has certainly changed the way we operate, allowing for positive innovations in a range of industries such as healthcare and education, and facilitating a number of tasks.
“Nevertheless, we must consider there are areas where AI simply cannot – or should not – substitute the human experience.
“How can a piece of technology understand what we, as human beings, go through? How can it understand our makeup, our intricacies or why we feel a certain way?
“There might be a place in society where these chatbots can be of assistance, such as in the medical profession, law and other areas where there’s a need for objective information to be retrieved quickly.
“But when it comes to human support and actual feelings and thoughts, that should be a human-to-human experience. It’s only human beings who will have empathy and any type of real understanding of mental health struggles.
“The consequences of relying on AI for something as delicate and as complex as therapy should be carefully considered, particularly when it comes to young people, who may need further help and guidance to be steered in the right direction”.
On whether AI can provide a long-term answer in this area, he said: “AI-powered therapy chatbots are simply not a viable solution, and certainly not a long-term answer to a lack of support structures and resources.
“We need to ensure that schools, universities and workplaces are putting the right tools and strategies into place to combat this trend, and better safeguard individuals.
“There is a wide range of resources that all kinds of establishments and businesses can offer individuals, such as free access to therapy sessions or materials they can use to educate themselves.
“Additionally, the introduction of Mental Health First Aiders in workplaces can ensure those who are struggling are appropriately supported by individuals who have been adequately trained to assist with different mental health issues.
“It’s all about showing people who are struggling that there are ways they can get help without having to resort to harmful AI chatbots, as well as eliminating any stigma preventing people from seeking professional human-to-human help”.