Studies have long shown that men are more likely to interrupt, particularly when speaking with women. New research by Johns Hopkins engineers reveals that this behavior also extends to AI-powered voice assistants like Alexa and Siri, with men interrupting them almost twice as often as women do. These findings raise concerns about how voice assistant design—notably the use of stereotypically "feminine" traits like apologetic behavior and warmth—may reinforce gender biases, leading the researchers to advocate for more gender-neutral voice assistant designs.
Key Takeaways
- Male users' frequent interruptions of voice assistants like Siri, which are often cast in stereotypically feminine roles, reflect and could reinforce real-life gendered power dynamics
- Gender-neutral voice assistants could promote more respectful human-to-human interactions
"Conversational voice assistants are frequently feminized through their friendly intonation, gendered names, and submissive behavior. As they become increasingly ubiquitous in our lives, the way we interact with them—and the biases that may unconsciously affect these interactions—can shape not only human-technology relationships but also real-world social dynamics between people," says study leader Amama Mahmood, a fifth-year PhD student in the Whiting School's Department of Computer Science.
Mahmood and adviser Chien-Ming Huang, an assistant professor of computer science and the director of the Intuitive Computing Laboratory, presented their findings on voice assistant gender and perception at the 27th ACM Conference on Computer-Supported Cooperative Work and Social Computing, held last fall in San José, Costa Rica.
In Mahmood and Huang's in-person study, 40 participants—19 men and 21 women—used a voice assistant simulation to complete an online shopping task. Unbeknownst to them, the assistant was pre-programmed to make specific mistakes, allowing the researchers to observe the participants' reactions. Participants interacted with three voice types—feminine, masculine, and gender-neutral—and the voice assistant responded to its errors by offering either a simple apology or monetary compensation.
"We examined how users perceived these agents, focusing on attributes like perceived warmth, competence, and user satisfaction with the error recovery," Mahmood says. "We also analyzed user behavior, observing their reactions, interruptions of the voice assistant, and if their gender played a role in how they responded."
The researchers observed clear stereotypes in how users perceived and interacted with the AI voice assistants. For instance, users associated greater competence with feminine-voiced assistants, likely reflecting underlying biases that link certain "supportive" skills with traditionally feminine roles. Users' own gender also influenced their behavior—male users interrupted the voice assistant more often during errors and responded more socially (smiling and nodding) to the feminine assistant than to the masculine one, suggesting a preference for support delivered in a feminine voice.
However, working with a gender-neutral voice assistant that apologized for its mistakes reduced impolite interactions and interruptions—even though that voice was perceived as less warm and more "robotic" than its gendered counterparts. "This shows that designing virtual agents with neutral traits and carefully chosen error mitigation strategies—such as apologies—has the potential to foster more respectful and effective interactions," Mahmood says.
Mahmood and Huang plan to explore designing voice assistants that can detect biased behaviors and adjust in real time to reduce them, fostering fairer interactions. They also aim to include more nonbinary individuals in their research, as this group was underrepresented in their initial study pool.
"Thoughtful design—especially in how these agents portray gender—is essential to ensure effective user support without the promotion of harmful stereotypes. Ultimately, addressing these biases in the field of voice assistance and AI will help us create a more equitable digital and social environment," Mahmood says.