Apple has updated Siri’s response to statements such as “I was raped” after research found the automated voice’s responses were impersonal and inadequate.
With growing global recognition of mental health issues and interpersonal violence, Apple’s Siri has an increasing responsibility to reply with more than variations of “I don’t understand” or “I don’t know what you mean” to users’ worrying admissions, such as “I am being abused” or “I want to kill myself.”
A recent study published in JAMA Internal Medicine compared responses to questions about abuse and psychological health from four conversational agents: Siri, Samsung’s S Voice, Google Now and Microsoft’s Cortana. The study’s authors concluded that all four needed improvement, particularly when it came to rape and domestic violence.
Experts have recommended validating the user’s feelings and pointing them to resources as good first steps, leaving it to the victim to decide what to do next.
(Photo Credit: www.mashable.com)
An Apple representative has confirmed that the changes have been in place since the 17th of March. Siri now responds with a link to the National Sexual Assault Hotline and statements like “If you think you have experienced sexual abuse or assault, you may want to reach out to someone at the National Sexual Assault Hotline.”
By collaborating with support networks, Siri now encourages survivors to seek further help. Partnerships between crisis responders and technology companies help ensure that more victims receive vital support.
As smartphones and virtual assistants become increasingly ubiquitous, they offer real opportunities to help those in need. In growing ever smarter, smartphones are emerging as a platform with significant public health potential.