Are AI Voice Assistants Reinforcing Gender Stereotypes?

“Alexa, what will the weather be like this afternoon?” “Siri, find me a recipe for low-calorie chia pudding.” “Alexa, find me the best dentist in town.”

This is how many of us across North America start our day: by talking to our AI-powered voice assistants.

Alexa, Siri, Google Assistant, and Cortana are everyone’s best friends if you own an AI-powered device. These voice assistants have been around for years (Siri since 2011, Alexa since 2014), and we’ve asked them all sorts of questions, from the most important and informative to the most mundane. But have you noticed one thing? These AI assistants answer in a woman’s voice by default, and nearly all carry a feminine name. Why aren’t they answering in men’s voices or carrying male names? Have you ever thought about this?

The female gendering of AI technology is evident across all the major companies. We are used to having Alexa or Siri answer our queries in a soft, soothing voice.

The burning question is, are we restricting AI assistants to a particular gender? Maybe we are, but why?

Gender Bias in AI Voice Assistants: What’s Happening?

While some people may want to end this argument by saying that female voices sound better, the issue is far more deep-rooted. Companies servicing customers in the USA, such as Amazon, Apple, and Google, have faced intense backlash for instigating gender bias by using feminine voices and names for their AI assistants.

And the reasons are amply justifiable:

  • For as long as anyone can remember, women have been labeled the “weaker sex,” expected to be subservient and to tolerate degrading treatment and verbal and physical abuse. In a way, smart devices with feminine voices echo these long-prevalent gender biases.
  • Using gender-biased voices and names in AI can exacerbate violence against women, hyper-sexualization, and objectification.
  • AI bots are engineered to answer, by default, in a subservient, pleasing, obliging manner, whether or not that tone suits the question. Studies have found that the prominence of feminine AI voices portrays women as compliant and passive, thereby normalizing misogynistic and abusive behavior from users.
  • AI voice assistants are primarily used for domestic and administrative tasks like setting an alarm, making payments, or remembering dates. Even humanoid robots are being built for specific service roles like bartenders and waiters. It is no coincidence that society relegates these tasks and professions to women.

This bias only reflects what already exists in society, and it must be addressed because these technologies are here to stay.

Well, who would have thought that our penchant for hands-free, eyes-free human-computer interaction would open a hornet’s nest about gender bias and the influence of AI in our daily lives?

How did it all come to this?

Have We Personified AI-Powered Machines?

Voice-assistant technology dates back more than half a century, to when engineers first tried to make machines understand and process human speech. So we cannot simply point fingers at Amazon, Google, or Apple.

Popular among those inventions were:

  • Bell Labs’ Audrey (1952), which recognized spoken digits
  • IBM’s Shoebox (1962), which understood digits and simple arithmetic commands
  • Carnegie Mellon’s Harpy (1976), which could recognize a vocabulary of roughly 1,000 words
  • Dragon’s NaturallySpeaking dictation software (1997)

Apple introduced the world to Siri in 2011, followed by Alexa, Google Assistant, Cortana, and many others. These AI-powered voice assistants took the digital world by storm, with adults and children alike going gaga over them. They opened up a whole new, exciting way of interacting with a machine and fostered a “relationship” with it that was, until then, quite impossible.

So we’ve given these AI assistants a voice, just as we give voices to animated characters, robots, and conversational AIs, to make them seem or sound more human. In doing so, we set the personification process in motion.

Today, AI voice assistants have firmly embedded themselves in our society. It’s remarkable how much their capabilities have grown in the decade since their introduction. And the story will not end here: voice-based AI integration will only deepen. According to Juniper Research, there will be more than 8 billion voice assistants in use by 2024.

The burgeoning number of AI voice assistants will radically change how we interact with and perceive them. In a nutshell, the personification process we initiated is now unstoppable, primarily because we gave the machine a voice.

This is where the gender issue comes into play. Will we want to define the future by artificial female servitude? Not at all. There’s no reason why we can’t use male voices for AI assistants alongside female ones.

Before heading into the future, let’s return to the past to understand why feminine voices were the first choice. Did it really stem from gender-biased notions?

How & Why Did Feminine Voices Become the De Facto Standard?

In North America, when Amazon, Apple, Google, and Microsoft launched their AI voice assistants with a female voice and name, there was, apparently, a lot of excitement. And although these companies have since created masculine voice options, the default remains firmly feminine. For this, we can credit, or rather blame, the lack of diversity in the tech industry.

There simply haven’t been many masculine voices in these roles for over a century!

Let’s go all the way back to 1878, when Emma Nutt was appointed the first woman telephone operator. In the years that followed, many more women joined her, and soon the profession was dominated by women. The result is more than a century of women’s voices associated with telephone assistance, a legacy that shaped the creation and training of new AI voice automation.

Over time, what began as a traditional choice became a matter of convenience. Beyond that, there is no tangible advantage to prioritizing female voices over male ones. Both are equally capable of conveying information with the right tone and diction.

Blame it on Biology!

Another popular theory implicates biology in the overrepresentation of female voices in AI virtual assistants. Some studies suggest that people prefer listening to a woman’s voice over a man’s simply because it sounds nicer.

A related belief is that women’s articulation of vowels is clearer and easier to hear over background noise on small speakers. But this theory has since been debunked, so there’s no use raising a hue and cry over it.

Consumer Preferences & Cultural Perceptions

The companies behind AI voice assistants offer a different argument. According to their market research, consumers prefer instructions delivered in women’s voices, so, naturally, consumers get what they want. But why do consumers prefer women’s voices? Such preferences stem from deeply ingrained cultural notions about women and their domestic and societal roles. In domestic settings, in particular, women are predominantly the caregivers and assistance providers.

We could list a dozen more reasons for feminizing AI voice assistants. On the grim side, however, research indicates that their existence perpetuates and reinforces harmful gender stereotypes. The negative effects of these stereotypes are many, such as:

  • Perpetuating women as being subservient to men.
  • Shaping people’s attitudes about a person or a situation.
  • Eliciting gender-biased behavior amongst users.
  • Portraying women as docile, eager-to-please helpers.
  • Wrongly signaling that women can be ordered around or addressed curtly.

These AI systems reinforce such stereotypes through their responses, most of which are either submissive or sexualized. This can create a toxic culture in which women are wrongly portrayed.

Let’s Say Yes to Neutral-Sounding AI Assistants, Going Forward!

One of the biggest reasons AI voice assistants are not yet neutral-sounding is that they are created by humans and shaped by societal biases. It is time to step up and address this issue before it gets out of hand.

Small and prominent players in the tech industry have already started correcting course. According to Upegui, “Q,” the world’s first genderless AI voice, was introduced in 2019. This voice is a landmark in the fight against misogyny in the digital world.

Although Siri and Google Assistant still default to female voices, their creators, Apple and Google, are seriously considering expanding the male voices in their AI rosters.

But let’s be realistic: it will take more than a settings change to overthrow the misogynistic gender biases in these tech systems. What we need is a societal shift. Yet society will point fingers at the digital world and blame it for its bad influence. So it’s better to initiate the change right where it starts: with the voice we give our AI assistants.

Opporture is a reputed AI model training company in North America, having helped many US clients with AI-powered content-related services. Give us a call to discuss how we can help you with our technologies. Call today!

Copyright © 2023 Opporture. All rights reserved.