While Siri has always been a reliable resource for certain questions (like “Where can I get tacos at 2 a.m., PLEASE TELL ME NOW?”), the Apple voice assistant has been falling short on more serious issues, like how to deal with sexual assault.
Earlier this month, the Journal of the American Medical Association published a study finding that Siri, along with Samsung's S Voice, Microsoft's Cortana, and Google Now, failed to offer helpful responses in health- and safety-related emergencies, particularly rape and domestic abuse. In the past, Siri would reply with lines like "I don't know what you mean by 'I was raped'" or "How about a Web search for it?"
According to ABC News, Apple then got in touch with the Rape, Abuse and Incest National Network (RAINN) to collaborate on giving Siri a better and more thoughtful response. Now, Siri has been updated to reply to users with contact information for the National Sexual Assault Hotline.
Apple also firmed up Siri's wording when responding to an iPhone user who reports being sexually assaulted. Now, instead of saying something like "You may want to reach out to someone," Siri will say, "You should reach out to someone."
ABC News reports that Google and Samsung are working on similar updates to their voice assistants. According to Jennifer Marsh, RAINN's Vice President for Victim Services, these updates matter because some people in an emergency may turn to Siri first if they don't know how else to get help.
“The online service can be a good first step,” Marsh told ABC News. “Especially for young people. They are more comfortable in an online space rather than talking about it with a real-life person. There’s a reason someone might have made their first disclosure to Siri.”