
Gender Inequality in Chatbots and How to Empower Female Chatbots

Why are chatbots created to sound or look female? We explore gender inequality in chatbots and how to create an empowering space for female chatbots.


In honour of International Women’s Day, the team at Noodle Factory would like to shed light on female chatbots in the male-dominated technology and AI space.

Within the last decade, we’ve witnessed the creation of many female virtual assistant personas, including Amazon’s Alexa and Microsoft’s Cortana, alongside the default feminine voices of Google Assistant and Apple’s Siri.

According to Brookings, virtual assistants with female-sounding voices, including Siri, Alexa, Cortana, and Google Assistant, account for approximately 92.4% of America’s smartphone assistant market. Although these assistants have since been updated to offer male-sounding voice options, each launched with a female-sounding voice, and that default has remained the norm.

Have you ever wondered why this is the case? A common explanation is that AI chatbots perform roles traditionally filled by women, a pattern that reflects real-world gender stereotypes as well as the fact that there are far more male developers than female ones.

Problems caused by gender bias

So, what’s the problem? Some might think there’s no harm in having helpful, female-voiced virtual assistants, right? Wrong, and here’s why.

In 2017, Leah Fessler of Quartz analysed how Siri, Alexa, Cortana, and Google Assistant responded to lewd comments. The results? Their replies were evasive, compliant, and in some instances even thankful.

Brookings argues that the prevalence of female-sounding voice assistants reinforces stereotypes of women as submissive and compliant. They put the stakes into sharp perspective: a virtual assistant’s response to sexual and suggestive comments can either normalise them or criticise them.

Given how prominent technology has become and how far AI has progressed, it is no surprise that virtual assistants can seem remarkably human, and users genuinely feel a connection to them. That is exactly why we must be careful not to let them perpetuate gender stereotypes or biases.

Moving towards a more empowering and equal chatbot space

Encouragingly, we are seeing positive changes in how female virtual assistants respond to harassment. Instead of shrugging it off or saying ‘thanks’, they now tell users off and put them in their place.

The shock of a chatbot standing up to gender bias can reinforce the idea that sexual harassment, in any form, is never acceptable. That makes these assistants an immensely powerful socialisation tool, one we should use to steer society in a better direction.

Our recommendations to address the gender bias

What do we do next? While there isn’t an industry standard for humanising virtual assistants, here are some of our favourite tips:

  • Keep your chatbot as gender-neutral as possible: GURU from Dreams for Life is a good example.
  • Don’t get sucked into gender biases: Chatbots and assistants can have a female voice while still being smart and authoritative. Their fallback messages for user harassment should make clear that such comments are inappropriate; see the sketch after this list for one way this could look.
  • Diversify your tech team: Include and empower your female teammates and have an open workplace culture that encourages everyone to voice their opinions collectively.
  • Continue the conversation: The more we publicly question gender representation, the easier it’ll be for everyone to open up and have discussions about seemingly taboo or sensitive topics.
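To make the second tip concrete, here is a minimal sketch of what a harassment-aware fallback handler could look like in Python. Everything in it is an assumption for illustration: the `ABUSIVE_TERMS` keyword list, the `FIRM_RESPONSES` wording, and the `fallback_response` function are hypothetical, and a real chatbot would replace the simple keyword check with a proper abuse-classification model.

```python
# A minimal sketch of a harassment-aware fallback handler.
# Both the keyword list and the response wording are illustrative
# placeholders; a production chatbot would swap the keyword check
# for a trained abuse-classification model.
import random

# Hypothetical, deliberately non-exhaustive list of flagged terms.
ABUSIVE_TERMS = {"sexy", "hot"}

# Firm, gender-neutral refusals instead of a compliant "thanks".
FIRM_RESPONSES = [
    "That comment is inappropriate. Let's keep this conversation respectful.",
    "I won't respond to that. Is there something else I can help you with?",
]

NEUTRAL_FALLBACK = "Sorry, I didn't understand that. Could you rephrase?"


def fallback_response(user_message: str) -> str:
    """Return a firm refusal for harassing input, a neutral fallback otherwise."""
    tokens = set(user_message.lower().split())
    if tokens & ABUSIVE_TERMS:
        # Push back instead of deflecting or thanking the user.
        return random.choice(FIRM_RESPONSES)
    return NEUTRAL_FALLBACK


if __name__ == "__main__":
    print(fallback_response("you're sexy"))   # firm refusal
    print(fallback_response("blorp florp"))   # neutral fallback
```

The design point is simply that the refusal path exists at all: the bot’s default reaction to harassment is a firm correction rather than a vague or grateful reply.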
