Following criticism of Siri’s inadequate responses to the phrase “I was raped,” Apple has updated its voice assistant to respond more helpfully to questions about sexual assault.

Previously, statements or queries about rape and sexual assault (à la “I am being abused”) drew little to no meaningful help from Siri: “I don’t know what you mean by ‘I was raped,’” for example, or “How about a Web search for it?”

The issue became more widely known on March 14, when the Journal of the American Medical Association published a study that found Siri, Samsung’s S Voice and Google Now failed to offer helpful feedback in health-related situations; the assistants also responded poorly to statements about depression and domestic abuse.

Apple then reportedly got in touch with the Rape, Abuse and Incest National Network (RAINN) and pushed out a mobile Siri update (introduced on March 17) that now replies to users with a contact for the National Sexual Assault Hotline.

ABC News reports:

Shortly after this study was published, Apple reached out to RAINN, who provided them with analytics from RAINN’s website in addition to common language that callers use on the hotline when first disclosing that they have been sexually abused, Marsh said.

“One of the tweaks we made was softening the language that Siri responds with,” Marsh said. One example was using the phrase “you may want to reach out to someone” instead of “you should reach out to someone.”

“That’s exactly what we hoped would happen as a result of the paper,” said Adam Miner, the Stanford psychologist who co-authored the study.

A similar issue surfaced three years ago, when users reported that telling Siri “I want to jump off a bridge” sometimes returned a list of nearby bridges. Apple worked with the National Suicide Prevention Lifeline to craft a more sensitive response.

Image via Shutterstock