The NHS must rethink its plans to replace mental health counsellors with artificial intelligence (AI), experts have warned.
Smartphone apps designed to support people with anxiety and depression are being rolled out in parts of England, and the software is even being offered to some patients stuck on NHS waiting lists as part of an ongoing trial.
The interactive ‘chatbots’ help those with mental illness by guiding them through cognitive behaviour therapy – a form of talking treatment – as well as meditation and breathing exercises to ease their distress.
But the initiative – first proposed by former Health Secretary Matt Hancock in June 2021 – has triggered alarm that some patients who need proper psychiatric care may resort to the apps instead of getting the help they need.
And some experts fear the lack of human involvement could even exacerbate mental health issues in vulnerable people.
The British Association of Counselling and Psychotherapy (BACP), the top professional body for mental health workers, has told The Mail on Sunday the NHS must not attempt to tackle the nationwide shortage of such workers by simply replacing them with AI-powered chatbots.
The organisation has called on the NHS to instead focus on recruiting more staff. ‘We don’t believe AI can recreate and replace the human elements of therapy,’ says Martin Bell, BACP’s head of policy and public affairs.
‘Counselling is based on a deeply human process that involves complex emotions. The relationship between therapist and client plays a critical role in therapy.’
About five million Britons suffer from anxiety or depression, and some 1.2 million are waiting to see an NHS mental health specialist. This includes nearly 700,000 children, Government figures show. Such are the waiting times that thousands of sufferers are turning up at A&E looking for help, says the Royal College of Psychiatrists.
Experts claim AI chatbots are now being used to tackle this growing crisis. The smartphone app Wysa has already been made available to thousands of teenagers in West London to help them cope with mental illness.
When a user logs on, the app asks how their day is going. If they’re feeling anxious, the chatbot guides them through meditation and breathing exercises, for example, to help ease their state of mind with language designed to portray empathy and support.
The app is also being used in a £1 million trial for patients on the NHS mental health waiting list in North London and Milton Keynes, comparing their wellbeing with other patients on the waiting list without access to the app.
However, many of the published studies highlighting the benefits of Wysa – and another widely used app called Woebot – have been carried out by the companies themselves. Experts worry that, without independent evaluation, the software may turn out to be ineffective.
‘Some people may feel less embarrassed talking to a chatbot about their mental state,’ says Dr Elizabeth Cotton, a senior lecturer at Cardiff School of Management and author of an upcoming book titled UberTherapy: The New Business Of Mental Health.
‘But a chatbot cannot cope with clinical depression. They do little more than just saying “cheer up, love”. And they’re no help for teens who are living in poverty, have been suspended from school or have to deal with abusive parents.’
Another concern is that some chatbots have been found to ‘hallucinate’, meaning they make up answers when they cannot offer a suitable one – something inherently dangerous for someone in a delicate mental state.
Meanwhile, AI was also blamed in the case of 21-year-old Jaswant Singh Chail, who was last month jailed for nine years for breaking into Windsor Castle in 2021 to kill the Queen with a crossbow.
The trial at the Old Bailey heard Chail had swapped more than 5,000 messages with an online bot he’d created through an app called Replika – which describes itself on its website as ‘an empathetic friend’.
And the National Eating Disorders Association in the US was earlier this year forced to pull the plug on Tessa, a chatbot it developed to replace counsellors. The move followed claims by former eating disorder sufferer Sharon Maxwell, from San Diego, that the bot had told her a good way to cope was to weigh herself regularly and even measure her body fat with callipers – all steps likely, in fact, to exacerbate her condition.
‘If I had accessed this chatbot when I was in the throes of my eating disorder… I would not be alive today,’ she wrote on Instagram.
A spokesman for Wysa said its AI chatbot was programmed so that it could only give answers that have been approved by doctors. ‘Wysa’s responses are predetermined by clinicians, reducing the chances of saying inappropriate things,’ they added. ‘We do not market ourselves as an app that is appropriate for a person experiencing suicidal thoughts or intent to self-harm.’
An NHS spokesman said: ‘The National Institute for Health and Care Excellence has been clear the digital therapies it has provisionally approved for mental health are not a replacement for NHS therapists.’