When People Think Siri Sounds Black, Biases Often Surface
Research shows that gender and racial bias can crop up even when people interact with completely artificial robot voices
For most of the past decade, digital assistants like Siri, Alexa, and Google Assistant had slightly robotic, white-sounding female voices by default. But recently, they’ve started sounding different.
Last year, Amazon’s Alexa got a male-sounding alternative voice for the first time. (That’s in addition to three Alexa voices that sound like specific celebrities.) This March, Apple gave Siri a new voice option that sounds neither traditionally masculine nor feminine. And in an earlier software update, Apple released two voices that users were more likely to describe as sounding Black compared with the original voices, according to two surveys by a linguist at the University of Pennsylvania.
Users eager for broader representation in their everyday digital tools told CR they welcomed last year’s Siri additions. Several Black Siri users said they found it validating to hear a voice that sounded like a young Black person in the role of an all-knowing virtual assistant.