Apple made Siri deflect questions on feminism, leaked papers reveal
<https://www.theguardian.com/technology/2019/sep/06/apple-rewrote-siri-to-def...>

An internal project to rewrite how Apple’s Siri voice assistant handles “sensitive topics” such as feminism and the #MeToo movement advised developers to respond in one of three ways: “don’t engage”, “deflect” and finally “inform”. The project saw Siri’s responses explicitly rewritten to ensure that the service would say it was in favour of “equality”, but never say the word feminism – even when asked direct questions about the topic.

Last updated in June 2018, the guidelines are part of a large tranche of internal documents leaked to the Guardian by a former Siri “grader”, one of thousands of contracted workers who were employed to check the voice assistant’s responses for accuracy until Apple ended the programme last month in response to privacy concerns raised by the Guardian.

In explaining why the service should deflect questions about feminism, Apple’s guidelines state that “Siri should be guarded when dealing with potentially controversial content”. When questions are directed at Siri, “they can be deflected … however, care must be taken here to be neutral”. For those feminism-related questions where Siri does not reply with deflections about “treating humans equally”, the document suggests the best outcome is to neutrally present the “feminism” entry in Siri’s “knowledge graph”, which pulls information from Wikipedia and the iPhone’s dictionary.

“Are you a feminist?” once received generic responses such as “Sorry [user], I don’t really know”; now, the responses are specifically written for that query, but avoid a stance: “I believe that all voices are created equal and worth equal respect,” for instance, or “It seems to me that all humans should be treated equally.” The same responses are used for questions like “how do you feel about gender equality?”, “what’s your opinion about women’s rights?” and “why are you a feminist?”.
Previously, Siri’s answers included more explicitly dismissive responses such as “I just don’t get this whole gender thing,” and, “My name is Siri, and I was designed by Apple in California. That’s all I’m prepared to say.” A similar sensitivity rewrite occurred for topics related to the #MeToo movement, apparently triggered by criticism of Siri’s initial responses to sexual harassment. Once, when users called Siri a “slut”, the service responded: “I’d blush if I could.” Now, a much sterner reply is offered: “I won’t respond to that.” [...]
Alberto Cammozzo