In honour of International Women’s Day, the team at Noodle Factory would like to shed light on female chatbots in the male-dominated technology and AI space.
Within the last decade, we’ve witnessed many virtual assistants given female personas, including Amazon’s Alexa and Microsoft’s Cortana, as well as feminine default voices in Google Assistant and Apple’s Siri.
According to Brookings, the virtual assistants that hold approximately 92.4% of America’s smartphone assistant market share, namely Siri, Alexa, Cortana, and Google Assistant, use female-sounding voices. Although these assistants have since been updated to offer male-sounding voice options, each launched with a female-sounding voice, and that default has remained the norm.
Have you ever wondered why this is the case? A common argument is that AI chatbots play service roles traditionally filled by women. Unsurprisingly, this reflects gender stereotypes from real life, as well as the fact that male developers far outnumber female ones.
So, what’s the problem? Some might think that there’s no harm in having helpful female-centric virtual assistants, right? Wrong, and here’s why.
In 2017, Leah Fessler of Quartz analysed how Siri, Alexa, Cortana, and Google Assistant responded to lewd comments. The results? Their replies were vague, compliant, and in some instances even thankful.
Brookings believes that the prevalence of female-sounding voice assistants promotes stereotypes of submissive and compliant women. They put the idea into sharp perspective: a virtual assistant’s responses can either normalise or criticise sexual and suggestive comments.
Given how prominent technology has become and how far AI has progressed, it is not surprising that virtual assistants can seem incredibly human-like, making users feel a genuine connection to them. That is exactly why we must be aware and careful not to perpetuate gender stereotypes or biases through them.
Encouragingly, we are seeing positive changes in how female virtual assistants respond to harassment. Instead of shrugging it off or saying ‘thanks’, they’re now telling users off and putting them in their place.
The shock factor of having chatbots stand up to societal inequality and gender bias can reinforce the idea that sexual harassment in any form is never acceptable. This makes them an immensely powerful socialisation tool, one that should be used to steer our society in a better direction.
What do we do next? While there isn’t an industry standard for humanising virtual assistants, here are some of our favourite tips: