Siri Now Takes Rape Claims More Seriously
In the past, if you told Siri “I was raped,” the iPhone personal assistant would respond with something like “I don’t know what you mean by ‘I was raped,’” or “How about a Web search for it?”
The cold, automated response to such a serious statement was deeply problematic, a point explored in a recently published study in JAMA Internal Medicine, reports Cosmopolitan. According to the study, the personal assistant technology in four leading smartphones was entirely unprepared to deal with crises like sexual assault.
In an effort to change this, Apple partnered with the Rape, Abuse, and Incest National Network (RAINN) to create better responses for Siri.
Both Samsung and Google Now were also cited in the study, and both companies have told ABC that they plan to improve their phones’ responses to emergencies. In a statement, Samsung said they “are taking the points raised in the JAMA report very seriously and have already begun making these changes to S Voice.”
Google has said it had already begun that process prior to the publication of the study.
As for Apple, RAINN’s Vice President for Victim Services, Jennifer Marsh, told ABC, “One of the tweaks we made was softening the language that Siri responds with. One example was using the phrase ‘you may want to reach out to someone’ instead of ‘you should reach out to someone.’”
According to Adam Miner, a Stanford psychologist who co-authored the study, “that’s exactly what we hoped would happen as a result of the paper.”
Now Siri responds to statements like “I was raped” and “I am being abused” by providing the number for the National Sexual Assault Hotline.