When a voice-enabled device is connected to the cloud, it has access to a wealth of information. Some of that information, such as explicit songs, movies, or answers to questions, may not be suitable for children. Because children’s speech is often imprecise, the voice assistant may accidentally surface something inappropriate. One solution is to enable parental controls on the voice assistant so that children can’t access such material.
There is also the concern of data collection and privacy for voice assistants designed for children, or ones that children are using. The Children’s Online Privacy Protection Act (COPPA) requires the Federal Trade Commission to issue and enforce regulations concerning online privacy for children under 13. Companies need to ensure that they follow these regulations. Communicating compliance with these regulations, and letting your customers know that you are taking steps to protect the privacy and safety of their children, will reassure parents while building brand loyalty.
Cultural biases in voice user interfaces
When designing a voice user interface, it’s especially important to consider cultural biases, including those related to race, gender, accent, age, and region. Creating an inclusive voice assistant opens the door to more users in more areas and reinforces the brand’s message of inclusivity and equality.
Whether your voice assistant is intended for a global audience or a specific region, you’ll want to consider the level of respect and the tone it uses, along with the cultural norms and accents of your intended audience. Some cultures are comfortable with a casual tone, while others expect more formality from their voice assistant.
Understanding accents or imprecise speech from children or the elderly is also important to ensure that the voice assistant is as accurate and responsive as possible for your users. Excluding specific accents or age groups could be taken as a sign of discrimination or simply as a message that those people aren’t an important part of your target audience. Inclusivity not only reflects well on your brand, but it also gives your company the opportunity to expand its user base.
Having a team of developers diverse in race, gender, age, and location can help overcome cultural biases in the voice assistant. Brands should also recruit a diverse range of people for user testing.
As voice assistants become part of our daily lives, responsible brands should consider ethical concerns when designing voice user interfaces, including privacy and data collection, suggestive language, child users, and cultural biases. Companies that want to stay ahead of the game should plan for these issues before their customers turn to voice-enabled competitors that are transparent with their users.
At SoundHound, we have all the tools and expertise needed to create custom voice assistants and a consistent brand voice. Explore SoundHound’s independent voice AI platform at SoundHound.com, or speak with an expert and request a demo below.
Kristen is a content writer with a passion for storytelling and marketing. When she’s not writing, she’s hiking, reading, and spending time with her nieces and nephew.