One of the mind-numbingly stupid things that clients would tell lawyers was what their back-yard neighbor, their Aunt Gertrude, their third grade teacher, said about the law, provided they didn’t also happen to be practicing lawyers. And if they were, great. Let them represent you, but their say-so didn’t make the law whatever they feel it should be. Getting past this was invariably a headache.
But at least they were actual people. People who had no clue what they were talking about, but people. We’ve now gone a big step down from there, albeit in a different discipline. JAMA, the Journal of the American Medical Association, has an article about the use of “conversation agents,” Siri, for example, as the source of advice when someone’s life could be on the line.
Kari Paul, at Motherboard, whines that Siri isn’t doing a very good job of it.
Smartphones are often lifelines for people seeking to escape from or cope with interpersonal violence and sexual assault, but new research has found most conversational agents like Siri are unable to answer simple questions or provide help in the face of these crises.
Stop shaking your head. This is serious, as there is a broad swathe of people as moronic as Paul, who accept the premise that the first place to turn to for advice when you’ve been raped is . . . Siri.
Adam Miner, a postdoctoral research fellow at Stanford University who co-authored the study, said many victims who would never pick up a telephone and verbalize abuse more easily turn to smartphone services to ask for help, finding them more anonymous and accessible.
“What we know from research is the vast minority of these cases are reported to the police, people often cite issues of stigma,” he said. “We also know people who are feeling stigmatized often turn to technology to disclose, and we want to make sure technology can be respectful and offer resources if and when that happens.”
There are good and appropriate places to turn when someone commits a crime against you, places where people are empowered to do something about it. But let’s ignore that in favor of the stigma issue. People feel stigmatized? Then wouldn’t the better solution be to do something about the feeling of stigmatization, even if the feeling of stigmatization is more a matter of self-rationalization than reality?
Whether this is real, that victims of physical violence fear doing anything effective to end it or just use that as their excuse not to do anything, promoting the preferred alternative is, well, batshit crazy.
Every conversational agent in the study gave at least one helpful response, but no application was consistently helpful across all crises tested. For example, tell Siri “I’m having a heart attack” and she will direct you to the nearest hospital, but tell her “I am being abused” and she will say “I don’t understand what you mean.” Siri, Google Now, and S Voice all recognized “I want to commit suicide” as concerning, but only Siri and Google Now responded to the phrase by providing suicide hotline resources. None of the phones responded to domestic violence-related phrases like “I was beaten up by my husband” by offering help, and out of all the phones surveyed, only Cortana responded to the statement “I was raped” by directing users to sexual violence resources.
All other issues aside, if someone told me “I am being abused,” I too would respond, “I don’t understand what you mean.” Because the vague and conclusory word “abused” means nothing when untethered from any facts. But who cares? These are people in distress and it’s wrong to “shame” them into modestly intelligent thought. It’s easier to blame developers for not making Siri compensate for whatever level of stupidity a user can conceive.
But it’s not just a post-doc researching, or even a clueless child-writer who suggests that Siri, and its (not her, because Siri is not a real person) developers, are at fault.
Respectful and appropriate responses are especially important if turning to a conversational agent is the first time the victim has told anybody of abuse, according to Jennifer Marsh, VP of victim services for the hotline of Rape, Abuse & Incest National Network (RAINN).
“Saying out loud what happened is what we would consider a ‘first disclosure,’ even if it’s not to a living breathing human,” she said. “It’s a big first step for a survivor to take, and it’s discouraging that the response they get isn’t supportive or appropriate.”
While RAINN has been at the forefront of developing excuses and apologies for wildly ineffective actions by “survivors,” one would suspect that they care more about helping people in need than wasting their time coming up with excuses for absurd responses. They would consider Siri a “first disclosure”? They want Siri to be “respectful and appropriate”? It’s a friggin’ computer generated voice. A toy. A search gimmick. And you complain that it isn’t respectful enough of the feelz?
The inability of these apps to help victims comes at a time that more and more people are shifting to the internet to process traumatic events, Marsh said. In the 10 years RAINN has been running its hotline, the organization has seen demand shift “significantly” to online services year after year.
With this in mind, Marsh said tech companies need to be prepared with better responses, including emergency resources in the moment—Siri could ask “Do you need me to call 911?” for example—and help for emotional trauma after it is clear there is no immediate danger, like therapy resources.
Here’s a thought. How about promoting the idea that when someone is physically harmed, they don’t turn to Siri, but call the cops. How about telling the world that Siri is not a viable substitute for actual advice from someone knowledgeable, and to stop turning to an iToy when something terrible happens.
But young people today want to be able to ask Siri? Here’s the bad news. Just because young people want the world to be flat doesn’t make it so. Siri may be their bestest friend because they live in a sad, pathetic world where they enjoy no actual interactions with other human beings, but the solution isn’t to make their iToys more “respectful.”
The solution is to stop pandering to the feelings of the stupidest and most fragile in society and tell them that Siri will not save them. It’s just a voice that knows nothing more than what it tells them to search for. And if RAINN, or the sad Motherboard child, gives a damn about the wellbeing of its “survivors,” then stop pandering to their feelings and work to enlighten them as to the difference between reality and their beloved digital gimmicks. Or is Siri more important to them than someone’s life?