Move over, sex robots
“How’s the dread?”
I stop to think what it would be like to receive this from an app rather than from a friend over Messenger. I was swallowed into a vacuum of depression at the end of 2018, and messages like this were the little intervals of hope that broke up my day - because I’m lucky, and my friends are nice and able enough to check in when I’m having a terrible time.
There is a creeping guilt that comes with feeling like you’re the one who is always asking for help, and a careful measuring-out of how much information you give to the people who are listening: one friend might be sympathetic to work woes, less so to your living situation or relationship. A parent or close family friend might be all ears until it’s an issue relating to your sexuality. Your partner might have just hit the limit of how much they can hear about an infuriating colleague, or the climate crisis. A digital listening bot that never gets bored, tired or opinionated could be the solution to managing private distress, or a store for the daily mood fluctuations and grumbles that shape a human life.
An expanding market of chatbots seeks to offer more sophisticated mental health solutions than shouting “Alexa, what’s the point?”. Apps like Youper brand themselves as digital support services to help you become the best version of yourself. The AI asks friendly questions about how you’re doing, and offers helpful guidance through emotional states ranging from mild anxiety to severe depression. The prompts and questions are modelled on clinical psychology and recreate a real-time conversation with a therapist - without a person sitting in front of you and staring. The app also offers guided meditations, personality quizzes and a mood tracker you can use to build a database of how your mental health has fared over longer periods. Similar apps like Wysa - aimed at young women - and WoeBot include the same core tools.
I try a trial version of a platform called ePST which aims to replicate the kind of experience you’d have with a mental health professional. The first attempt isn’t promising: I watch a video tutorial, answer a few questions about my mental state that are familiar from previous doctor’s appointments, and the software promptly recommends that I get an appointment with a professional (i.e. a human). The dream of a utopian emotional future where bots instantly smooth out any threat to the serenity of your existence takes a nosedive. It’s important to note that this is where most commercially available apps currently stand: good at dealing with niggles and recommending breathing exercises through finite periods of anxiety, but responsibly programmed to delegate to a human if a user seems seriously ill.
I set up a different profile with knowingly milder responses, and the video tutorial starts again. From the outset, I find it difficult to engage. I know it’s pre-recorded and that the follow-up text interactions will be automated responses from a neatly programmed if/then flowchart that aims to give me some sense of being listened to. It doesn’t matter, somehow, that this programming is what a counsellor or therapist learns in training - it just feels echoey, cold and flat when there isn’t a human involved. ePST was first offered as a solution to help astronauts cope with months of estrangement from their families and communities in outer space. It’s a step up from drawing eyes on a volleyball and calling it Wilson, but I’d still rather talk to anyone else.
There’s an argument that this could have less to do with me not being an astronaut, and more to do with me not being male. Men’s grooming company Harry’s have developed their own chatbot, Harr-E, after finding compelling evidence that men actually prefer talking to a robot over a person. Their study of British men’s needs and behaviour around mental health services concluded that men are three times more likely to talk to technology about their intimate, personal issues than to a person. Rather than encourage men to change their behaviour, they’ve swerved away from the model of mental health service provision that seems to be failing them. If the evidence shows it isn’t working for men to self-diagnose as needing help with their mental health and seek out a therapist or doctor, the logic runs, create a service that mirrors their behaviour patterns and offer help there instead.
The preference for speaking to a digital rather than human therapist isn’t exclusive to men: Lauren Kunze, Chief Executive of US chatbot platform Pandorabots, comments that “people feel more comfortable sharing things that make them feel less normal when they're talking to software, because it's less judgemental.” But the staggering rate of male suicide - the number one killer of men under 45 in the UK - demands urgent attention, and why not turn to technology?
Artificial intelligences have so far shown high success rates at distinguishing between fatal and non-fatal suicide attempts, by recognising recurring words and speech patterns over text. Deployed over, for example, a mental health helpline, AI has a clear advantage over humans: scalability. If two people call a helpline that only one person is operating, someone has to wait in the queue - and where it’s a suicide case, we just hope they wait around for long enough. AIs are much cheaper to replicate in large quantities over servers, and respond immediately to the needs of a large patient base. Much less costly, in the long run, than paying and training another person to deliver adequate mental health services.
And this is often what it boils down to: cost reduction. Chatbots market their services to businesses by promising to cut the costs that poor mental health incurs, through employee absenteeism or curbed productivity. Apps cheerfully promise to “remove the cost barriers” between individuals and mental health services, seeming to assume it’s a given that we’ve all decided it’s too expensive to train care workers and mental health specialists to provide free, accessible healthcare. And they’re not wrong: googling “mental health funding” on a daily basis throws up stacks of articles about NHS nurses, doctors and care workers being at breaking point themselves due to staff shortages and budget cuts. Gleaming promises of “solutions-driven” and “innovative 21st-century approaches” to mental health care neatly skirt around the option of paying more people for something we can already do quite well.
Amid rapidly escalating climate breakdown, the re-election of a British government intent on running public services into the ground and a rising far right, my pessimistic view on mental health overall is that if you’re not anxious, you’re probably just not paying attention. If a cheerful app gently asked “Why?” when I said I’m stressed because things feel “irreparably doomed”, I’d take advantage of the fact that it’s just a little titanium chip-brain that can’t get hurt or offended and tell it to come back to me after it’s read the news. Being a Luddite about technological innovations is short-sighted and self-sabotaging, though. Even if the landscape our emotional health is navigating is hostile and about to bend round and kill us, it’s the landscape we’ve got. Why not relieve some of the dread by navigating it with an emotional SatNav?