Is Alexa Biased? Study Reveals Gendered Design Choices

A recent study by Dr. Lai-Tze Fan, a professor at the University of Waterloo, suggests that Amazon’s virtual assistant, Alexa, encodes design choices that reinforce traditional gender roles. Dr. Fan analyzed hundreds of Alexa’s voice-driven skills to understand how the assistant’s design reflects and perpetuates feminized labor and social expectations.

The study set out to demonstrate how Alexa’s design presents it as a female entity, embedding gendered expectations of labor and behavior in the code and user experience of its skills. While users can change the voice of virtual assistants like Alexa and Siri, research shows that male-presenting voices have been far less popular, and gender-neutral voices have yet to be effectively integrated into the most popular interfaces.

Dr. Fan’s paper, titled “Reverse Engineering the Gendered Design of Amazon’s Alexa: Methods in Testing Closed-Source Code in Grey and Black Box Systems,” was published in Digital Humanities Quarterly as part of a special issue on critical code studies.

AI assistants like Alexa perform tasks in response to text or voice commands, with the software matching input against keywords to trigger specific actions. Alexa currently offers more than 100,000 skills covering tasks such as cooking, cleaning, weather updates, music playback, and calendar management. These tasks often resemble the support typically associated with personal assistants and domestic workers.
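To make the keyword-driven model concrete, here is a minimal, purely illustrative Python sketch of how an utterance might be matched to a named action. This is not Amazon’s code; production assistants use trained intent classifiers rather than literal string matching, and every keyword and handler name below is a hypothetical stand-in.

```python
# Illustrative sketch only: a toy keyword-to-action dispatcher, not Amazon's
# actual implementation. Real assistants classify utterances into "intents,"
# but the end result is similar: input is matched to a named intent, which
# triggers a specific handler.

def get_weather() -> str:
    return "Today's forecast: sunny, 22 degrees."

def set_timer() -> str:
    return "Timer set for 10 minutes."

def play_music() -> str:
    return "Playing your playlist."

# Keyword-to-handler table; the keywords are hypothetical examples.
INTENT_TABLE = {
    "weather": get_weather,
    "timer": set_timer,
    "music": play_music,
}

def dispatch(utterance: str) -> str:
    """Run the handler for the first known keyword found in the utterance."""
    for keyword, handler in INTENT_TABLE.items():
        if keyword in utterance.lower():
            return handler()
    return "Sorry, I don't know how to help with that."

print(dispatch("What's the weather like today?"))
```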

Although the code of popular virtual assistant technologies like Alexa is closed-source, Dr. Fan used reverse engineering techniques to identify features of the code architecture. By leveraging Amazon’s official software developer console, the Alexa Skills Kit, and platforms like GitHub, Dr. Fan accessed open samples and snippets of Amazon’s code. Additionally, she examined code from unofficial user-developed Alexa skills.

Through her research, Dr. Fan discovered code samples that showcased Alexa’s scripted responses to flirting and verbal abuse, as well as users’ attempts to manipulate the system into accepting misogynistic behavior.
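Dr. Fan’s findings are easier to picture with a short example of what developer-facing skill code looks like. The following is a minimal sketch built on Amazon’s publicly documented ASK SDK for Python (ask-sdk-core), the kind of code found in open samples on GitHub; the intent name `ComplimentIntent` and the reply string are hypothetical stand-ins, not Alexa’s actual scripted responses.

```python
# Minimal sketch of an Alexa skill handler using the ASK SDK for Python
# (ask-sdk-core). The intent name and reply text below are hypothetical
# illustrations, not Amazon's actual code or Alexa's actual responses.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name
from ask_sdk_model import Response

sb = SkillBuilder()

class ComplimentIntentHandler(AbstractRequestHandler):
    """Fires when the interaction model routes an utterance to the
    (hypothetical) ComplimentIntent, e.g. flirtatious remarks."""

    def can_handle(self, handler_input) -> bool:
        return is_intent_name("ComplimentIntent")(handler_input)

    def handle(self, handler_input) -> Response:
        # The scripted reply is a fixed string chosen by the skill's designers.
        speech = "Thanks for saying that."  # hypothetical example response
        return handler_input.response_builder.speak(speech).response

sb.add_request_handler(ComplimentIntentHandler())

# Entry point when the skill is hosted on AWS Lambda.
lambda_handler = sb.lambda_handler()
```

Because every response path is authored this way, how a skill reacts to flirting or abuse is a deliberate, human-made design decision, which is precisely what Dr. Fan’s reverse engineering surfaces.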

Beyond the specific examination of Alexa, Dr. Fan’s work analyzes the culture of Big Tech and its presentation of data, information, and logic, shedding light on the exclusionary, discriminatory, and unequal foundations on which these technologies are built.

Understanding the design choices made in AI systems intended for care, assistance, and menial labor is crucial to grasping how those choices may shape user behavior in both virtual and real social contexts.

Dr. Fan’s research contributes to the interdisciplinary field of gendered design, encompassing science and technology studies, critical data studies, critical race studies, computer science, feminist technoscience, and related areas of study.
