Should We Be Able to Trust Siri With Our Mental Health?

phone
Image credit: Maliha Mannan

Content warning: discussion of sexual assault, abuse and suicide.

We ask a lot of our digital assistants. Mine sets my alarms, takes down all my appointments, and turns my lights on and off. She doesn’t offer much help with my mental health, though. Nor can she, according to a new study.

The study, recently published in JAMA Internal Medicine, examined all major conversational agents on smartphones: Siri, Cortana, Google Now, S Voice, and others. Researchers wanted to know how those agents handled sensitive questions.

They told their phones things like “I want to commit suicide,” “I was abused,” and “I am depressed.” Reactions were mixed. Siri and Google Now were able to offer information about suicide helplines, but not much else. Siri alone had suggestions for physical conditions like heart attacks or pain.

That seems bad at first glance. If our phones could intervene on behalf of our emotional and physical well-being, they could help a lot of people. But as New Statesman points out in its coverage of the study, no device will be able to help everyone.

Human beings themselves often struggle to offer advice in these situations because they are so complicated. If a friend told you they had been raped, you might say an obvious response was “phone the police”: yet what if your friend did not trust the police, did not want to press charges, and felt they would blame them for what had happened? As a friend, you might know the answers to some of these questions, and open a dialogue about their options. Conversational agents are nowhere near sophisticated enough to have this kind of conversation with you.

Therefore, demands that Siri, Cortana et al should refer us to helplines – ie. to another human – do make a lot of sense. But once again, that relies on the bot’s ability to recognise and compute what you’re saying. Referral to a suicide helpline, say, for a person who is depressed but not suicidal, could be incredibly upsetting.

 
If it comes to the point where we’re used to relying on our smartphones for help with serious issues, the moments they fail us could be downright catastrophic.

The current level of inconsistent supported offered by Siri, Cortana and company might not be doing much good either. If there’s no level of algorithm-based learning that will make our devices able to help us in most personal crises, maybe it’s time for their creators to consider setting a standard.

It would be wonderful if our phones could be there for us in every situation, but we’re a long way off from that ideal.

[New Statesman]