The problem with Alexa: what’s the solution to sexist voice assistants?


If you have a smart speaker at home, you're probably interacting fairly regularly with an AI-enabled voice assistant – and you're probably talking to something that sounds like a woman.

Depending on the brand of your smart speaker, your voice assistant may even have been assigned a female-sounding voice or a female name, such as Alexa, Cortana, or Siri. Sure, some of these voice assistants can be configured to have a masculine voice, including Google Assistant and Siri, but most smart speaker users are interacting with virtual women.

At first glance this may not sound like a problem – but the way society equates voice assistants with women could have some worrying social consequences.

A female voice is the default setting for smart assistants, including Alexa on the Amazon Echo

(Credit: Juan Ci / Shutterstock.com)

In May 2019, a landmark report from UNESCO suggested that the default use of female-sounding voice assistants in our smart-home devices and smartphones reinforces sexist attitudes towards women.

The report, titled "I'd Blush If I Could", takes its name from Siri's former default response to being called a "bitch" by users – and criticizes the fact that Apple's Siri, Amazon's Alexa, Google Assistant, and Microsoft's Cortana are "exclusively female or female by default, both in name and in sound of voice".

Sympathetic and pleasant

Why do voice assistants sound like women? Julia Kanouse, CEO of the Illinois Technology Association, explains that the companies behind these voice assistants are guided by consumer feedback.

She explains: "Studies have shown that women's voices tend to be better received by consumers, and that from an early age we prefer to listen to women's voices."

In fact, in an interview with Business Insider, Daniel Rausch, head of Amazon's Smart Home division, said his team had "carried out research and found that a woman's voice is more sympathetic".

So far, so plausible – and Kanouse admits that the use of female-sounding voice assistants is clearly grounded in research.

Amazon Echo Dot

Studies have shown that, from a young age, we find female voices more sympathetic than male voices

(Image credits: Amazon)

However, the design decisions behind these voice assistants could have far-reaching consequences for women at home and at work.

"Using female language assistants can reinforce the stereotype that we'd rather tell a woman what to do than a man," says Kanouse.

"It's only recently that we've started to train men who have traditionally been seen as women, and vice versa, to see women fighting for these roles (such as flight attendants, nurses, legal assistants, officers) as more than just "an assistant will be viewed."

This progress, according to UNESCO, may be undermined by the proliferation of female voice assistants. The report argues that the default use of female-sounding voice assistants sends a signal to users that women are "obliging, docile and eager-to-please helpers", available at the touch of a button or with a blunt voice command such as "hey" or "OK".

It is also worrying that these voice assistants have "no power of agency beyond what the commander asks of [them]" and respond to queries "regardless of [the user's] tone or hostility". These may seem like desirable traits in an AI voice assistant, but what if the way we talk to Alexa and Siri ends up shaping the way we talk to women in our daily lives?

Researchers say that using predominantly female voices in smart speakers can lead to subconscious prejudice

(Image credits: Google)

One of UNESCO's main criticisms of companies like Amazon, Google, Apple and Microsoft is that the docility of our voice assistants has the unintended effect of reinforcing "commonly held gender biases that women are subservient and tolerant of poor treatment".

This obsequiousness is especially worrying when these female-sounding voice assistants give "deflecting, lackluster or apologetic responses to verbal sexual harassment".

While Kanouse doesn't believe this has led to overt cases of sexual discrimination, she does think it feeds "unconscious bias", adding: "The prevalence of female voice assistants can lead to unconscious bias against women at work and at home, making it harder for women to overcome these obstacles."

Should voice assistants be gender-neutral?

One solution could be to make voice assistants gender-neutral – and that's entirely possible, as the creators of Q, the world's first gender-neutral voice assistant, have shown.

Speaking to NPR, Julia Carpenter, an expert in human behavior and emerging technologies who worked on the project, said one of the team's goals is "to contribute to a global conversation about gender, and about gender, technology and ethics, and how to be inclusive of people that identify in all sorts of different ways".

To create Q's voice, the team recorded "dozens of people", including those who identify as male, female, transgender, and non-binary, though in the end they selected a single voice and altered it until it sounded neither male nor female.
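
The Q team hasn't published its exact production pipeline, but the core idea – nudging a recording's fundamental frequency into a band that listeners tend to rate as neither clearly male nor clearly female – can be sketched in a few lines of Python. The following is a minimal illustration only, assuming the librosa and soundfile libraries; the 160 Hz target and the shift_to_neutral_range helper are this article's assumptions, not details confirmed by the Q project.

```python
# Minimal sketch (not the Q team's actual pipeline): pitch-shift a
# voice recording so its median fundamental frequency (f0) lands in
# a range often perceived as gender-ambiguous.
import librosa
import numpy as np
import soundfile as sf

TARGET_F0_HZ = 160.0  # illustrative midpoint of a roughly 145-175 Hz band

def shift_to_neutral_range(in_path: str, out_path: str) -> float:
    """Shift the recording so its median f0 sits near TARGET_F0_HZ."""
    y, sr = librosa.load(in_path, sr=None)  # keep the native sample rate
    # Estimate f0 frame by frame with the YIN algorithm
    f0 = librosa.yin(y, fmin=65.0, fmax=400.0, sr=sr)
    median_f0 = float(np.median(f0))
    # Convert the required frequency ratio into semitones
    n_steps = 12.0 * float(np.log2(TARGET_F0_HZ / median_f0))
    y_shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=n_steps)
    sf.write(out_path, y_shifted, sr)
    return n_steps

# Example: shift_to_neutral_range("speaker.wav", "speaker_neutral.wav")
```

Pitch alone doesn't settle perceived gender, of course – formants, timbre and speaking style all play a part – which is reportedly why Q's creators kept adjusting the voice and re-testing it with listeners rather than relying on a single transform.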

You can hear what Q sounds like in the video below.

The result, while perhaps sounding more synthetic than Alexa or Siri, is a voice assistant that's truly inclusive of everyone – and the goal is to convince the technology giants to adopt Q as a third option for their assistants.

Unfortunately, that's unlikely. After all, brands like Apple, Google, and Amazon are notoriously protective of their product design, and it's hard to imagine them agreeing to use the same voice as their competitors.

Diversity is the key

Could the answer be to make voice assistants not homogenous, but extremely diverse?

This diversity needn't be focused on gender, either. Why can't our voice assistants speak with regional accents? Why shouldn't they sound young or old, or use slang or pidgin English?

The news that the BBC is working on a voice assistant named Beeb, which will understand all the different regional accents of the United Kingdom, has raised hopes that it will also speak with some of these accents.

Dr. Matthew Aylett, Chief Scientific Officer at speech technology company CereProc, believes this could set Beeb apart from other voice assistants on the market.

"No other organization can boast the resonance and meaning of the vote compared to the BBC," he explains, choosing a synthetic voice to represent the organization is "a big challenge."

The relatively low number of women working in technical fields means that they have less influence on the design of voice assistants

(Credit: Shutterstock.com)

Commenting on brands like Apple, Google, and Amazon, he says: "In many cases, decision-makers choose a standard, neutral, well-spoken female voice without even considering that this is an important design decision."

And the BBC could be in the perfect position to challenge that. Aylett believes that using a diverse voice for Beeb "can lead to groundbreaking new perspectives on voice interaction" as it encourages the participation of a broad audience.

Aylett believes the BBC could even invite this audience to select popular BBC presenters and create a unified voice from the results – imagine how reassuring a David Attenborough / Joanna Lumley hybrid could be.

However, Aylett doesn't believe the global voice assistant developers will support the diversity offered by third parties such as the BBC, or be "brave enough to offer much diversity themselves".

Why? Well, the teams behind our favorite voice assistants aren't all that diverse themselves.

Women to the front

According to UNESCO, Alexa's sexism problem is mainly due to the lack of women in the room when technology companies design their voice assistants.

This is a problem affecting the whole industry: only 7% of ICT (information and communication technology) patents across G20 countries are generated by women. According to UNESCO, the over-reliance on female-sounding voice assistants is "a powerful illustration of the gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education".

The solution? We need more women working in STEM fields, and according to UNESCO this requires the recruitment, retention, and promotion of women in the technology industry. After all, how can our voice assistants effectively represent their users when a large percentage of those users have no say in their development?

Whatever the answer is, it's clear that we need more choice when it comes to the voices of our smart speakers. As Kanouse says: "Whether it's a male voice, a gender-neutral voice, or an impersonation of someone like Morgan Freeman, there are creative solutions these companies could implement to ensure we don't reinforce gender stereotypes."

She adds, "Making this change could be a very strong statement from these influential companies."

"And would not it be fun to tell Morgan Freeman what he should do every day?"
