AI therapy is no fix for Britain’s mental health crisis

By Ross Elliott, CEO of Chrysalis, the UK’s largest counsellor and therapist training provider.

It is no secret that the UK economy needs more people in work to generate the economic growth the country and the government urgently require. The Mayfield Review: Keep Britain Working made one point unmistakably clear: getting people into work, and crucially helping them stay there, depends on tackling poor mental health far more effectively than we do today.

Lingering effects of the pandemic, ongoing cost of living pressures, and growing international tensions have all undoubtedly taken their toll. But rather than simply diagnosing the causes of the nation’s ills, the country needs a practical, affordable and scalable solution to the mental health crisis holding back the workforce.

Digital therapy solutions have a critical role to play. Yet in recent weeks we have seen the tempting, but deeply dangerous, rise of unregulated AI therapy. Harrowing stories have emerged of vulnerable people being advised, even encouraged, by ChatGPT and other chatbots on how to take their own life. These accounts will no doubt chill readers and should be deeply worrying to policymakers.

We are seeing more and more people turning to chatbots for support not because they trust them, but because they have nowhere else to go. With NHS waiting lists for mental health support stretching to 18 weeks or more in some areas, desperation is replacing choice. And while chatbot therapy may seem like a silver bullet to some, it cannot replace the professional judgement and human interaction that underpin effective care.

The UK rightly maintains strict regulatory frameworks for medicines and financial advice to protect the public from harm. Yet we currently allow untested AI tools to dispense mental health advice with no oversight, no safeguards, and no accountability. It is a glaring policy gap and one the government must close urgently.

So where, then, to turn? The answer is not to abandon digital innovation but to use it intelligently to widen access to safe, human-led therapy. Expanding access to accredited talking therapies is essential if we are to help people back into work and reduce the £200bn annual cost of ill-health to the economy. But without safeguards and professional standards, matters could quickly get worse, not better.

There is also an untapped workforce of qualified mental health professionals who are eager to help people back on their feet and into work, yet far too many remain underutilised. Digitally enabled therapy can unlock their ability to support far more people, far more quickly.

Make no mistake: digitally enabled talking therapies are not appropriate in every situation. For acute mental health needs, there is no replacement for face-to-face, in-person support. But for milder conditions such as anxiety or low mood, high-quality digital talking therapies can offer a more accessible, and often more convenient, option for patients.

Chrysalis’ approach, human-first and technology-enabled, shows how digital innovation can expand capacity safely and support people to stay well in work, all while aligning with the government’s ambition for a healthier, more productive workforce.

The Mayfield Review set out the scale of the challenge. Now policymakers must decide whether to respond with serious, evidence-based reform or allow unregulated AI to fill the void. The stakes for public safety, economic growth and the health of the nation could not be higher.