For Pride Month, we wanted to discuss how non-binary voices in voice assistants can help reduce gender stereotypes and foster a more inclusive community. We recognize that representation and inclusion are important considerations for brands that support people of all gender and sexual identities, and we want to extend that conversation to the entities that represent those brands: their voice assistants.
In voice assistants, gender identity is conveyed through the voice persona and the sound of the voice, which are often found to reinforce negative stereotypes of women. In fact, assistants with traditionally female-sounding voices hold 92.4% of the U.S. smartphone assistant market share, according to Business Insider.
In an age when gender fluidity and non-binary identities are gaining public awareness and creating new ways of talking about and to people, voice assistants continue to be assigned voices that associate them with traditional gender identities. Even though most voice assistants respond with statements such as, “I’m not a woman, I’m an AI,” when asked to identify themselves, users still perceive them as having female- or male-sounding voices.
Non-binary voice assistants offer one way to avoid furthering gender biases and reinforcing gender stereotypes through interactions with voice assistants. A study published on ScienceDirect notes that stereotypes associated with women include nurturing and submission, while those associated with men include dominance, autonomy, and achievement. And while a voice assistant is technically an “it” rather than a he, she, or they, humans tend to anthropomorphize, attributing personality traits to non-human entities. This is especially true of voice assistants, which 41% of users consider a friend or companion, according to Google.
While the wider conversation in the voice assistant space has centered on the prevalence of female-sounding voices and how it perpetuates gender stereotypes, non-binary voice assistants raise the question of whether gender needs to be assigned at all. Now more than ever, brands have a social responsibility to be at the forefront of change and positive action. Creating a voice assistant isn’t just about building the fastest, most accurate voice AI possible; it’s also about creating one that promotes equality and equity. Brands aiming to be more inclusive can consider offering a wider range of voice genders, or even a non-binary voice AI.
The future of non-binary voice AI
According to a study by The Fawcett Society, 51% of people who experienced gender stereotyping as children said it affected their careers, and 44% said it harmed their personal relationships. According to eMarketer, 2.2 million kids ages 11 and younger used a smart speaker at least once a month in 2020. While the negative consequences of gender stereotypes affect people regardless of age, brands should be especially aware of their young users and the impact their voice assistants may have on the sensibilities of the next generation.