Extremely Mean Phone-Robot Siri Doesn't Want You to Kill Yourself


Like most everyone in possession of an iPhone, I have a very contentious relationship with Siri: she never tells me jokes, she refuses to learn my name, and she’s often extremely judgmental of me (“I’m sorry, I don’t have any results for ‘goat tree’. Would you like me to Google it?” she will often say in a dismissive, stern monotone). Now, however, the computerized curmudgeon is getting a serious and helpful feature: as of this week, Siri will respond to suicidal statements with suicide prevention information.

Last week, when Siri was still an unfeeling monster, had you told her, “I want to kill myself,” the service would have performed a web search. Had you said, “I want to jump off a bridge,” she would have directed you to the nearest one. Like I said: unfeeling monster. Siri probably thought the VICE female author photo shoot was in good taste.

“If you are thinking about suicide, you may want to speak with someone at the National Suicide Prevention Lifeline,” Siri now responds if you voice a suicidal thought; she then asks if you would like her to call the lifeline’s number. If you don’t respond after a short pause, Siri automatically provides a list of local suicide prevention centers.

John Draper, the director of the National Suicide Prevention Lifeline Network, says that making suicide prevention services more accessible is critically important. Though he recognizes that a lot of what’s said to Siri is said in jest (with the notable exception of “Siri, show me a goat tree”), he thinks the option is meaningful. “You would be really surprised. There are quite a number of people who say very intimate things to Siri or to computers. People who are very isolated tend to converse with Siri.”

“Apple’s Siri Can Be First Call For Users Thinking Of Suicide” [ABC]
