
Jamie Hailstone is a freelance journalist and author who has also contributed to numerous national business titles, including Utility Week, the Municipal Journal and Environment Journal, and consumer titles such as Classic Rock.
September 17, 2018



The chatbot will see you now…

Chatbots might be an occupational hazard if you are trying to do the weekly internet shop, but they are also increasingly playing an important role in helping people deal with depression, anxiety and other mental health issues.

Over the last couple of years, a plethora of Artificial Intelligence (AI) systems have been launched to offer text-based therapy and support at any time of the day from the comfort of your smartphone, tablet or computer.

With mental health among both adults and children continuing to dominate the headlines and NHS services under increasing pressure, many see chatbots as one way to help alleviate people’s anxieties.


Speaking to SHP Online, Mark Stephen Meadows, co-founder of the SEED Project, an independent, decentralised marketplace for developers of conversational user interfaces (CUIs), says chatbots can “soak up much more information than we can”.

“Emotions may be measured, tracked and even predicted,” says Mr Meadows. “We should be cautious, however, about discussing these systems in terms of ‘getting to know us’. There’s a fine line between systems that make our lives easier and ones that manipulate, and this applies to our mental health as much as anything.”

“We always recommend bots be designed so they don’t resemble humans. Avatars and videobots we’ve built commonly have a cartoon style, first to avoid something called ‘the uncanny valley’, but also so that we are not introducing duplicitous systems that can mimic people and pretend to be someone they are not, as these present a risk of identity theft,” he adds.
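Meadows’ claim that emotions can be “measured, tracked and even predicted” is easier to picture with a toy example. The sketch below is a deliberately simple illustration, not SEED’s or any vendor’s implementation: real conversational systems use trained language models, while the keyword lists and scoring here are assumptions made purely for demonstration.

```python
import re

# Toy illustration of "measuring and tracking" emotion in text.
# Real CUIs use trained language models; these keyword lists and the
# scoring scheme are illustrative assumptions, not a product's code.

EMOTION_KEYWORDS = {
    "anxiety": {"worried", "anxious", "nervous", "panic"},
    "sadness": {"sad", "hopeless", "down", "empty"},
}

def score_message(text):
    """Count keyword hits per emotion in a single message."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return {emotion: len(words & kws) for emotion, kws in EMOTION_KEYWORDS.items()}

# "Tracking" is simply scoring every message and keeping the history,
# so trends across a session become visible to the system.
session = ["I'm so worried about work", "Still anxious, and feeling sad too"]
history = [score_message(msg) for msg in session]

print([h["anxiety"] for h in history])  # -> [1, 1]
```

A production system would replace the keyword lookup with a classifier and feed the per-session history into whatever does the “predicting”, but the shape of the data, a time-stamped emotional trace per user, is the same, and it is exactly that trace which raises the manipulation and privacy concerns Meadows flags.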

Personal information

Mr Meadows says many patients also feel safer and more willing to divulge personal information to chatbots or AI systems, without feeling judged.

“The system is patient,” he adds. “Immense amounts of data are collected. There is also a deep capability to understand the individual and customise responses based on interests and emotion. As I said, CUIs have many advantages over traditional, analogue communication, as they can absorb, process, store and predict much faster and more efficiently than we, as humans, traditionally have.

“AI is still evolving and at an early stage in its development. We still have the opportunity to build fairness and transparency into these systems, which is the motivation behind the SEED Project, and something that will become particularly important if we are to allow AIs to assist in maintaining our health and well-being. If we do so, there’s no reason these systems can’t be complementary to the methods and techniques we already have,” adds Mr Meadows.

Engaging

Eve Critchley, Head of Digital at the mental health charity Mind, says the vast majority of people who experience mental health problems still do not access any formal support.

“New types of online and app-based support can increase the support options available, especially for people who – for whatever reason – are less able to engage with face-to-face services, which is welcome,” says Ms Critchley.

“There are risks, however, as none of this is regulated. If you’re thinking of using a new app or website, we’d always suggest doing some research first and perhaps getting a recommendation from a registered counsellor or another mental health professional.

“Advice and information provided by organisations that have been awarded the Information Standard certificate are also a good place to start, as the information will be clear, accurate, evidence-based, up-to-date and easy to use,” she adds.

“It’s vital that digital platforms are properly maintained, managed and, where appropriate, moderated, to ensure that they are safe. We also want to see app builders engaging with people with mental health problems and taking their experiences on board when designing or redeveloping apps, as the user community have the best handle on what will benefit them and the expertise to say what works and what doesn’t,” says Ms Critchley.

Emotional resilience

Emma Selby, UK clinical lead for the Wysa app, which responds to the emotions you express and uses evidence-based cognitive behavioural therapy (CBT) techniques to help you build emotional resilience, says it is particularly useful for teenagers who “don’t talk on the phone for anything”.

“They really like that kind of text-based system, and if you look at when people do contact helplines, it’s never between nine and five, it’s at 11 o’clock at night when their mind is running like crazy,” says Ms Selby.

She adds that the clinical content on Wysa is similar to that of many low-level NHS services, which are becoming harder and harder to access.

“AI can fill this gap we have in early intervention, and that’s where a lot of the testing in AI is now going,” she says. “If we apply this as a generic tool that we expect everyone to use in order to help them learn more about emotional resilience, we can decrease the number of people who need to be in specialist mental health services when they cannot cope.

“We have activities on Wysa about working out whether the anxious thought you are having is a realistic thought, or just one that has run away with you,” explains Ms Selby.

“Everything is underpinned by the clinical ideas of building emotional resilience and the cognitive reframing that comes with it. They are written either by myself or by the full psychology team at Wysa’s base in India.”
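The thought-checking activity Ms Selby describes follows a standard CBT pattern: flag language suggesting a thought has run away from the evidence, then prompt the user to test it. The sketch below is a toy illustration of that pattern only; the marker words and canned question are assumptions for demonstration, not Wysa’s code or clinical content.

```python
# Toy sketch of the CBT "thought-checking" pattern described above.
# Marker words and the question are illustrative assumptions only;
# this is not Wysa's code or clinical content.

# Absolutist wording is a common flag for catastrophised thoughts.
ANXIOUS_MARKERS = ("always", "never", "everyone", "ruined", "disaster")

REFRAMING_QUESTION = (
    "Is that a fact, or a thought that has run away with you? "
    "What evidence do you have that it is true?"
)

def respond(thought: str) -> str:
    """Return a reframing prompt if the thought looks catastrophised."""
    if any(marker in thought.lower() for marker in ANXIOUS_MARKERS):
        return REFRAMING_QUESTION
    return "Tell me a bit more about what happened."

print(respond("My boss yelled at me, my career is ruined"))
# -> asks the user to weigh the thought against the evidence
```

In a real app the pattern matching would be a trained model and the responses clinically authored, but the conversational loop, detect, question, reframe, is the core of what text-based CBT tools automate.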

And like most AI systems, Wysa has adapted to understand regional variations and slang.

“When I first started at Wysa two years ago, it could not understand slang,” says Ms Selby. “If you put in ‘I feel like falling asleep and never waking up’, it would offer you sleep advice, because it’s not a phrase they use in India. Now that it has had a million conversations across the world, it has learnt that in Essex it could mean something quite different, so the clinical content shifts.

“People worry that a robot will miss things, but I do think there is huge scope for AI in mental health,” she adds. “We’ve seen some fantastic campaigns about reducing the stigma around mental health, but we still don’t have enough places where people can then go to have a conversation about mental health.

“We don’t have the middle ground where you can talk to someone about mental health and they would have an element of knowledge to support you. In five years’ time it could be commonplace for people to have these apps on their phones and say ‘I had a really crap day, my boss yelled at me’ and be able to reflect. It could do huge things for emotional resilience.”

