Why are Bots Named after Females?

With the evolution of AI and technology, a growing number of AI-powered bots are being introduced to the market. These bots, which act primarily as virtual assistants, need a persona: a friendly, relatable presence as opposed to an eerie, mechanical intruder. With Amazon and Microsoft opting for the female characters Alexa and Cortana to represent their respective bots, alongside the feminine voices of Apple’s Siri and Google’s Assistant, a clear theme emerges in how the industry assigns gender to virtual assistants.[1] Why is it that, when designing this AI, so many bots are introduced to the market under female names?

The most common argument is that these bots align with roles more commonly filled by women. Ever noticed how we are met with a female voice when we reach voicemail? Or how most GPS directions are given in a feminine voice? These virtual assistants are designed to be at our beck and call, assisting us with everyday tasks such as scheduling events, making that last-minute dinner reservation, or finding the chocolate cake recipe we’ve been craving. Their role is generally to remind us of something, or to research quickly when we can’t be bothered to do the graft ourselves, performing duties that are broadly considered secretarial and administrative. These roles are frequently filled by women: in fact, 95% of secretarial and administrative assistant roles in the US are held by women.[2] Is it any surprise that we align the gender of bots performing similar duties to match? Should we expect bots to be gendered otherwise?

Quite frankly, yes. We should. Perpetuating these gender stereotypes in developing technology, and having them regurgitated back into the environment around us, reinforces archaic gender roles. Some would even argue that, at its extreme, it reinforces the notion that the role of a female is to ‘be ordered around, to be subservient and whose place is in the home.’[3] Just because assistant positions in the workplace are primarily filled by women doesn’t mean that they should be, nor that discrimination didn’t dictate these gender stereotypes in the first place. It wouldn’t be outlandish to point a finger at sexism and call out the barefaced role it is playing here. Virtual assistants are fundamentally being designed with female names and voices in line with gender constructs and stereotypes that we have repeatedly sought to dismantle. If mirroring this discrimination and reflecting it onto products, instead of challenging gender stereotypes, isn’t plain sexism, what is?

Yet can we really blame this trend on designers alone? This leads us to a more challenging perspective: that naming bots after women is not an act of sexism by their makers, but rather a response to our perceptions as a market. Companies deliver a product to us in a form that we expect, one we will find familiar. Bots are fundamentally products targeted at the market in a responsive way; companies give us what we want, even if we don’t know we want it. Even when given the choice of a female- or male-voiced virtual assistant, market research has shown a preference for female-voiced products.[4] This underpins the argument that unconscious bias is perpetuating this gender stereotype and influencing the market. We trust Alexa to take notes and Cortana to answer us because we are used to women in these positions. We are fundamentally creatures of habit, and engineers know that. To play devil’s advocate: how do you think a female bot giving sports advice or football tips would sell? How would the public respond to a female bot giving programming advice, or to a male bot giving makeup tips? Would we feel entirely comfortable, or is this constructed preference ingrained in human behaviour? How deeply are these gender roles tied into our subconscious?

Even then, we could argue that we are just listening to and living alongside a robot; it’s not a human. So why does it matter?

Robots are the first step in the AI revolution. It is undeniable that the future holds more advanced robots, more physical manifestations of artificial intelligence that will exist beside us. If we are really to fight the gender constructs that have haunted the human race for centuries, why stop at the limits of our creations? Just as we attempt to sidestep the influence of discrimination (racism, sexism, fascism) in the artificial intelligence we develop for these bots (although that didn’t go too well in the case of Microsoft’s racist Twitter AI),[5] the physical personas of robots need to follow suit. We hold in our hands the power to influence and shape a whole industry and to project our technology into the future; we need to be careful to make it progressive rather than tie it to pre-existing cultural and gender stereotypes. This may sound like a stretch from the debate over ‘just giving assistant bots female names’, but (without going too far into how human ethics should apply to AI) we should begin this venture without the constructs we have fought against for so long and are still struggling against. Assigning female names to bots with administrative and secretarial duties contradicts that aim, as it intrinsically ties them to gender roles and constructs that have been repeatedly criticised and are still being challenged.

Ultimately, it boils down to the following: how far should human ethics apply to robots? Should we treat bots as mere products that we tailor to market desires, or as the first step towards true artificial intelligence? And if we continue to market them as the former, will it be too late to turn around and redesign them to be less tied to human gender roles?

[1] https://www.wsj.com/articles/alexa-siri-cortana-the-problem-with-all-female-digital-assistants-1487709068
[2] https://www.bostonglobe.com/metro/2017/03/06/chart-the-percentage-women-and-men-each-profession/GBX22YsWl0XaeHghwXfE4H/story.html
[3] http://fortune.com/2018/05/10/ai-artificial-intelligence-sexism-amazon-alexa-google/
[4] https://www.wsj.com/articles/alexa-siri-cortana-the-problem-with-all-female-digital-assistants-1487709068
[5] https://www.independent.co.uk/life-style/gadgets-and-tech/news/tay-tweets-microsoft-ai-chatbot-posts-racist-messages-about-loving-hitler-and-hating-jews-a6949926.html

by Karina Gorasia

Posted on October 11, 2018