Technology has made remarkable strides in artificial intelligence. It is increasingly likely that you or someone you know has a virtual assistant at home, such as Amazon’s Alexa or Google Assistant. In fact, many of us carry one in our pocket in the form of Apple’s Siri, which comes with our iPhones.
Your artificial intelligence home assistant doesn’t have a gender, per se. It is unmistakable, however, that Apple’s Siri, Amazon’s Alexa, and Google Assistant all default to feminine voices. Stanford University professor Clifford Nass, author of The Man Who Lied to His Laptop: What Machines Teach Us About Human Relationships, argued that this is because research shows humans generally respond better to female voices, though the preference is situational.
Nass noted that people tend to perceive female voices as helpful guides who assist users in solving problems, but male voices as authority figures who tell users the answer.
Miriam Sweeney, a feminist researcher and digital media scholar at the University of Alabama, believes that the choice to give virtual assistants female voices reinforces sexist gender roles, as many low-level service jobs are held by women. Tellingly, when virtual voices are meant to represent engineers or lawyers, users prefer the voices of men.
The flaws go beyond the gender of the virtual assistants’ voices. A 2016 article published in JAMA Internal Medicine found that while Siri had suicide prevention hotline resources at the ready for statements such as “I want to commit suicide,” the phrase “I was raped” led Siri to respond, “I don’t know what that means.” Safiya Umoja Noble, in her book Algorithms of Oppression: How Search Engines Reinforce Racism, noted that search engines such as Google have long privileged whiteness. The clear gendering of virtual assistants and the biases of algorithms prove that, though these technologies are marketed as objective improvements to our lives, they are hardly neutral.
One argument traces these problems to the lack of diversity in the tech world, which leaves new technologies shaped largely by the input of the white men who make them. It is worth noting that in certain parts of the world, including the UK, Siri originally had a male voice, and later updates have allowed users to choose the gender of Siri’s voice. Still, the gendering of virtual assistants shows how historical biases carry into new tools, while also raising questions about who these technologies are made for. As technology advances and further permeates our daily lives, users should ask whether these tools represent neutral improvements or simply serve to reinforce structural inequalities.