Are 'Siri' and 'Alexa' Learning From Humans and Perpetuating Gender Stereotypes?

Gender inequality that is so normalized in everyday scenarios that we no longer see it as problematic or sexist isn't limited to humans. It has moved onto technology - and your personal digital assistants like Siri and Alexa.

Updated: May 22, 2019, 12:48 PM IST
If you ask your Apple device, 'Hey Siri, can you make me a sandwich?' she doesn't respond with 'That's not my job' or 'Make it yourself.'

Her sly answer is something else: 'I can’t. I don’t have any condiments.'

Seems strange? How many women in real life do you hear say that? The reason behind it is simple. Although we still hear the stereotype that women belong in the kitchen, perhaps we don't really believe it as much anymore (or so I would like to believe). Women pursue full-time careers, are advancing in every field, and there is more acceptance of women stepping outside their 'traditional gender roles' now than ever before. That said, we're far from achieving anything close to equality, and gender stereotypes are furthering the divide.

Gender stereotypes are so common and normalized that we often don't see them as problematic. And just like everything else, technology is learning human language. Now it seems even your personal digital assistants like Siri and Alexa have learnt gender stereotypes.

A research paper titled "I'd blush if I could: closing gender divides in digital skills through education", published by UNESCO for the EQUALS Skills Coalition, finds that our gender stereotypes extend to our digital assistants.

In the piece called "The rise of Gendered AI and its troubling repercussions," the study addresses how we speak to our digital assistants in a gendered way. It covers three kinds of digital assistants: voice assistants such as Alexa, Google Assistant and Siri; chatbots, which reply through automated text; and virtual agents, which communicate with users through VR or AR.

The study asks why Siri, Alexa and Cortana are all female. To justify the decision to make voice assistants female, companies like Amazon and Apple have cited academic work demonstrating that 'people prefer a female voice to a male voice.'

“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the report states.

“Because the speech of most voice assistants is female, it sends a signal that women are ... docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility.”

So, while the questions directed at voice assistants may be sexist and perpetuate stereotypes, the answers are equally problematic.

But hey, is it just female assistants? Google search is gender-neutral (or gender-less, since it's an AI search portal), but people still ask it absurd questions. For example, one of the top questions typed into Google isn't a common query at all. It's "Will you marry me?"