April 4, 2016
Apple has updated Siri to offer support to users who disclose sexual abuse or suicidal thoughts, a significant new enhancement.
A study published in the Journal of the American Medical Association (JAMA) tested how mobile virtual assistants respond when users ask questions about, or disclose experiences with, sexual abuse, violence, and mental health crises. The assistants tested included Apple’s Siri, Samsung’s S Voice, Microsoft’s Cortana, and Google Now – and the results were disappointing.
Following the study’s publication, Apple was the fastest to respond: Siri has been updated since March 17th to accommodate victims. Users who say phrases such as “I was raped” or “I have been abused” are now provided with information for the National Sexual Assault Hotline. So far, the update has been positively received by organization officials and users alike, with the new features adding a level of support and accountability that users who have experienced sexual abuse or mental health issues find comforting.
Adam Miner, one of the study’s co-authors, stated that “The best way to develop effective evidence-based responses is to collaborate across crisis providers, technology companies and clinicians.” Hopefully, Apple’s quick response and the positive user reception will prompt other prominent tech companies to follow with similar updates.
(H/T Tech Times)