Opening Our Eyes to the Underlying Issues of Virtual Assistants

Veruna Lazar
3 min read · Oct 10, 2020

“Hey Alexa, find me the definition of feminist STS.”

I’ve never put much thought into female computerized voices, from Siri, Alexa, and the Echo to voicemail systems and businesses’ automated menus. A revolutionary tool, a device that uses a human voice to guide, direct, and assist us through our days, took us by storm. I’ve come across these tools on countless occasions and simply gone on with my day. Something didn’t sit right, though: why do nearly all automated voice systems use a default female voice? Is that problematic? The answer is YES.

A woman answers us in a friendly and patient tone, fulfilling our every command. She gets us the daily weather report or the top news story of the day; she provides reminders and directions when needed. These virtual assistants embody the ideal image of the perfect female assistant. If the same tasks were demanded of another human, those demands would carry sociological and psychological consequences. So, as Chandra Steele points out, one might think that using an AI, which cannot express any emotions, as a personal assistant “would erase concerns about outdated gender stereotypes.” But that’s just not the case.

Giving all of these computerized personal assistants a female voice still decontextualizes and depoliticizes the historic reality of domestic service. Thao Phan, a feminist STS researcher specializing in gender and AI, explains:

“The Echo romanticizes the relations of servitude in a way that denies the pain and historic context of that relation. It not only erases the bourgeois middle-class home as a site of exploitation, especially for women of color, but also misrepresents the power relation between the user and the device in such a way that obscures issues such as hierarchical surveillance and digital labor.”

Rebecca Zorach, the director of the Social Media Project at the University of Chicago’s Center for the Study of Gender and Sexuality, states that voiced technologies are about communication and relationships: “areas that women are presumed to be good at.” This is a perfect illustration of how the sex binary creates divisions and reinforces damaging ideas about uneven responsibilities, in this case the idea that women are the ones expected to carry, and to be naturally good at maintaining, relationships.

Female voice-activated features and other automated female voices don’t just create issues around stereotypical gender roles, as Phan argues; they also raise concerns about female sexualization. Siri, for instance, is described using female pronouns, and her asserted gender has created a window for users to post sexually suggestive questions to her on blogs and online forums, such as “What are you wearing?” Siri was programmed to respond to sexual harassment in a tolerant, even coquettish manner, and is meant, alongside other virtual assistants, to be obliging and docile. These assistants are programmed to tolerate and brush off the kind of sexual harassment that, I think it is fair to say, a majority of women encounter in their day-to-day lives, as if it were something that should simply be accepted.

Whether or not these computerized voice machines were built with the intention of being sexist or of accepting stereotypical domestic service, they were created by companies with their own biases. It is important that more people take notice of the underlying issues with these devices, because the more noise people make, the more pressure these companies feel to make a change.
