Women in Voice AI Weigh-In on Gender Bias for International Women’s Day 2021
Karen Scates
This year’s International Women’s Day theme is “Choose to Challenge.” In celebration of women everywhere, we asked the amazing women in voice AI to share their thoughts about gender biases in voice technology and in engineering roles.
Last year, we asked five female voice AI influencers to weigh in on the topic of “Each for Equal” and to talk about their personal journeys and experiences with gender stereotypes. The responses we got were both thoughtful and inspiring. This year, we wanted to take a deeper dive into gender equality and how the voices and personalities that populate our voice assistants either help us overcome biases or further perpetuate them, and to explore the role diversity in engineering plays in those inherent biases.
We hope you join us and “Choose to Challenge” gender biases and the behaviors and decisions that contribute to them. Here’s what the inspiring women in the voice AI community had to say about the presence of gender bias and the actions we can take to create a more inclusive future:
Conversation Designer, Co-Author, Conversations with Things @revanhoe
“Biases in technology are proven and backed up with good science, and our industry is really awakening to this reality. Racial and gender biases surface in training data, in an AI’s personality, or in its general behavior.
Good companies are combating this with more inclusive data gathering and usability testing, which is a great start. But to truly solve this problem, organizations need to develop team cultures that allow people to safely explore their own biases; this is closer to the heart of the problem. And this cultural shift can’t be driven only by marginalized genders; men and cis-gendered people need to step up and share the responsibility to create inclusive experiences.”
“It’s hard to talk about gender bias in AI without talking about racial bias too. They go hand in hand, because the root cause is the same. AI is only as smart as the data that trains it. If the training set includes more white men than black women, technology will be worse at identifying black women… it’s not so complicated. (And today, most AI systems – whether facial or speech recognition – are much better for white men). One way to change this is to change the makeup of folks creating these inputs, so they can advocate for greater equity and diversity in the data. I’m optimistic that this talent shift is underway, but the work is just beginning.”
“The recent explosion of AI and voice technology companies has resulted in many exciting opportunities to work in this field, and impact the way future technology is built. It’s hard to accurately measure the gender skew in AI, but current statistics paint a picture where women make up less than 20% of the AI field, dropping further at senior levels. There are two important threads to addressing the gap, which are both equally important. The first is increasing the numbers of girls choosing to study STEM (Science, Technology, Engineering & Maths) subjects and follow STEM careers. The second is to better support women already in these careers so they can progress effectively to senior roles.”
Catherine takes a deeper dive into the topic in this article: “The AI Gender Gap”.
“Gender bias, an interesting topic. On one side we have the majority of voice assistants offering primarily female voices. On the other side, speech recognition falls short, performing worse for women (and non-white people). We need to build a world where everyone’s voice is heard clearly, and act with conscious intention so voice AI does not further perpetuate biased stereotypes—things that make one go, “hmmm.”
“Gender is a spectrum. But for many, the way it’s been defined has been the way it’s been socially constructed for years, and we’ve begun to see these socially constructed ideas permeate things like voice AI technology. There are sometimes assumptions that a more feminine voice should be used for things like assistance or cooking, and more masculine-sounding voices for things like law enforcement or sports, and these assumptions can create certain biases in interactions with users.
At the end of the day, it’s important to have people of all genders including transgender people be a part of the initial conversation in creating any type of voice AI technology. The more we open the conversation to a mix of people during the R&D process, the less bias and more welcoming the voice AI tech world can be.”
“More diverse teams build more innovative products and are better for inclusive product creation. According to Goldman Sachs, companies in the top quartile for gender diversity are 33% more likely to outperform companies that are not as diverse. And among executive teams, gender-diverse teams beat all-male teams by 21%, while ethnically diverse teams outperform others by 33%, according to consulting firm McKinsey and Company.
The voice and conversational AI fields are questioning how they put teams together. As people work to build their teams and make effective products that address user pain points, it’s a great time to consider who is and isn’t at the table.”
“There is so much interesting research on AI gender and how it currently reinforces dominant cultural biases. With so many cis white men still in these decision-making roles, particularly in leadership, we often wind up with tech and design decisions from a singular (privileged) perspective. That perspective leaves out considerations of possible trauma and harm because cis white men are rarely targets of things like hate speech, threats of violence, microaggressions, and so on.
What we need to combat this oversight is more non-male-gendered folk, particularly intersectional non-males, in positions of authority in tech, and for those people to be fully empowered to make decisions that will be backed up by the organization. And that last part is key: without company cultures backing up their employees with less systemic privilege, the tech actions and decisions will continue to default to that privileged perspective, and we’ll continue to include—and sometimes even promote—harm with the AI we’re putting out into the world.”
“As our interactions with technology become more human-like thanks to advancements in voice AI, designers, developers, and leaders have a greater responsibility to refrain from exploiting these connections. Voice technology gives brands a new social territory in which to exist and a new set of qualities by which to represent their brand like voice, tone, persona, and sonic branding. We can see gender bias in voice branding through the primarily female personification of virtual assistants.
I’m continually fascinated by voice AI’s possibilities and opportunities, but I also recognize its power to marginalize people at scale and magnify the harm of all bias, including gender bias. The only way to ensure we aren’t blind to the bias of corporations, institutions, and researchers, as well as their data and algorithms, is to build diverse teams backed by ethical principles to monitor and recognize bias.”
“Bias is rarely intentional. As I have worked with incredible engineers and data scientists who have created powerful machine learning models, it has not been that we didn’t care about gender bias. It’s that we didn’t ask the right questions to begin with. Increase your understanding of who will be impacted by your solution. When it comes to AI, your solutions will always impact more people than you intend. So set a bigger intention.”
When we challenge our assumptions and think about how the choices we make could impact the lives of others, we begin to emerge from thoughtless bias to a world where change is possible. For the voice AI community, the imperative is clear: challenge your data, your assumptions, and your processes. Create a gender equal world by celebrating women’s achievements and encouraging and promoting women in STEM. This year, take action for equality.
Karen Scates is a storyteller with a passion for helping others through content. Argentine tango, good books and great wine round out Karen’s interests.