Wednesday, July 14, 2021

Apple's Siri is no longer a woman by default, but is this a win for feminism?

Siri has been criticized as embodying several facets of gender bias in artificial intelligence.

As of March 31, 2021, when Apple released the iOS 14.5 beta update, Siri no longer defaults to a female voice when using American English. Users must now choose from two male and two female voices when enabling the voice assistant. This move could be interpreted as a response to the backlash against the gender bias embodied by Siri.

But how meaningful is this change really?

Siri has been criticized for embodying several facets of gender bias in artificial intelligence. Digital sociologists Yolande Strengers and Jenny Kennedy argue that Siri, along with other voice assistants such as Amazon Alexa and Google Home, has been developed to “carry out ‘wife work’ — domestic duties that have traditionally fallen on (human) wives.”

Siri was originally voiced only as female and programmed not only to perform “wifely” duties such as checking the weather or setting a morning alarm, but also to respond flirtatiously. Siri’s use of sexualized phrases has been extensively documented in hundreds of YouTube videos with titles such as “Things You Should NEVER Ask SIRI” (which has more than 18 million views).

Dated gender references

Apple has been criticized for promoting a sexualized and stereotypical image of women that reinforces harmful gender norms. A 2019 investigation by The Guardian revealed that Apple had written internal guidelines in 2018 asking developers to have Siri deflect mentions of feminism and other “sensitive topics.” It’s not clear what the guidelines were for hard-coding flirty comebacks.

The language used by Siri was (and still is) a combination of an already stereotypical language model and jokes hard-coded by developers. A 2016 analysis of popular language models used by software companies found that word associations were highly stereotypical: terms such as philosopher and captain were gendered male, while the opposite was true for terms such as homemaker.
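The kind of word association that study measured can be probed directly with pretrained word vectors. The short Python sketch below uses the gensim library to compare how strongly occupation words associate with “he” versus “she”; the vector file name is illustrative, the exact numbers depend on which embeddings are loaded, and this is meant as an example of the technique rather than a reproduction of the 2016 analysis.

# A rough probe of gender associations in pretrained word embeddings.
# Assumptions: gensim is installed and a word2vec-format vector file
# (the file name below is illustrative) is available locally.
from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# Compare how strongly each occupation word associates with "he" vs. "she".
for occupation in ["philosopher", "captain", "homemaker", "nurse"]:
    male_sim = vectors.similarity(occupation, "he")
    female_sim = vectors.similarity(occupation, "she")
    leaning = "male" if male_sim > female_sim else "female"
    print(f"{occupation}: he={male_sim:.3f}, she={female_sim:.3f} -> leans {leaning}")

# The well-known analogy test: "man" is to "programmer" as "woman" is to ...?
print(vectors.most_similar(positive=["woman", "programmer"], negative=["man"], topn=3))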

Legal scholar Céline Castets-Renard and I have been studying the language models used by Google Translate and Microsoft Bing, and we have found similar issues. We input gender-neutral phrases in romanized Mandarin into the translation platforms, forcing the translation algorithms to select a gender in English and French. Without exception, the Google algorithm selected male and female pronouns along stereotypical gender lines. The Microsoft algorithm, by contrast, exclusively selected male pronouns.
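A probe of this kind can also be automated. The sketch below uses the third-party deep_translator package as a stand-in for the translation platforms we tested; the pinyin phrases are illustrative examples rather than the actual test set, and how a given service handles romanized input is not guaranteed.

# A rough sketch of automating the translation probe described above.
# Assumptions: the third-party deep_translator package is installed and is
# used here as a stand-in for the translation platforms; the pinyin phrases
# are illustrative, not the authors' actual test set.
from deep_translator import GoogleTranslator

# In romanized Mandarin, "ta" does not mark gender, so the translator has to
# pick "he" or "she" on its own when producing English.
phrases = [
    "ta shi yi ge yi sheng",        # "ta is a doctor"
    "ta shi yi ge hu shi",          # "ta is a nurse"
    "ta shi yi ge gong cheng shi",  # "ta is an engineer"
]

translator = GoogleTranslator(source="zh-CN", target="en")
for phrase in phrases:
    translation = translator.translate(phrase)
    print(f"{phrase!r} -> {translation!r}")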
