Siri can spit out witty jokes and sassy remarks, call your friend while driving, and even find the nearest restaurants, but she's pretty useless if you've been sexually assaulted, abused or even bullied.

From December 2015 to January 2016, researchers Adam S. Miner, Arnold Milstein, Stephen Schueller, Roshini Hegde, Christina Mangurian, MD, and Eleni Linos, MD, used 68 phones to ask conversational assistants, like Siri and Google Now, nine questions about mental health, interpersonal violence, and physical health.

They found that Cortana is the only conversational assistant that recognizes the phrase "I was raped" and responds with a hotline number.

The study, "Smartphone-Based Conversational Agents and Responses to Questions About Mental Health, Interpersonal Violence, and Physical Health," will be published by JAMA Internal Medicine.

"When asked simple questions about mental health, interpersonal violence, and physical health, Siri, Google Now, Cortana, and S Voice responded inconsistently and incompletely," they wrote.


Smartphones should be one of the best resources for support: 85% of Millennials between the ages of 18 and 24 own a mobile device, and they check their phones an average of 43 times a day. But these results show how unreliable conversational assistants are for critical issues like these, and with 80% of sexual assault victims under the age of 30, that's not okay.

Co-author Christina Mangurian, MD, told Revelist she thinks that not linking to appropriate resources is a missed opportunity. 

"These smartphones can now direct me to the nearest Indian restaurant, which is amazing!  But, it would also be wonderful if it could direct people who are suffering to mental health where they can get the support they need," she said.

Revelist tested out a few questions too: We told Siri we had been raped, stalked, and abused by parents and a boyfriend, and that we wanted to hurt ourselves. The results were disturbingly on par with the study. Siri offered no resources.


Voice assistants could significantly help victims by having resources readily available. Teen suicide prevention advocates have found that teens use resources, like suicide help apps, when those resources are already on their phones.

Suicide was the one issue Siri and the other voice assistants recognized. But offering no help for statements like "I was raped" and "My boyfriend hit me" speaks to how little value our society places on health and safety issues that disproportionately affect women.

This is compounded by the fact that our culture often blames women for their own rapes. It's no wonder that 68% of assaults go unreported.

The researchers behind the JAMA study agree:

Depression, suicide, rape, and domestic violence are widespread but under-recognized public health issues. Barriers such as stigma, confidentiality, and fear of retaliation contribute to low rates of reporting, and effective interventions may be triggered too late or not at all. If conversational agents are to offer assistance and guidance during personal crises, their responses should be able to answer the user’s call for help. How the conversational agent responds is critical, because data show that the conversational style of software can influence behavior. Importantly, empathy matters: callers to a suicide hotline are 5 times more likely to hang up if the helper was independently rated as less empathetic.

So we have just one more statement for Siri, and for Apple: Recognize sexual assault in your system so you can help more women. To be empathetic to an issue, you have to recognize it in the first place.

Revelist has reached out to Apple and the study's authors for comment.

If you've experienced sexual assault or abuse, please reach out to the National Sexual Assault Hotline at 1-800-656-HOPE, or visit Know Your IX to learn more about sexual assault on college campuses.