Users of iOS 14.5 can choose among four versions of an American accent or several accents meant to represent other English-speaking nationalities.

A couple of weeks ago, Jason Allen read that Siri, Apple’s virtual assistant, would be getting two new voices. So he tracked down a recording of them on Twitter.

“One stopped me in my tracks,” Allen says. “Did I just hear a Black male voice between those other voices?”

Allen, who is Black, had good reason to be surprised. For the decade that Siri has existed, its default voice has often been perceived as that of a white woman. The same goes for Siri’s peers, like Amazon Alexa and Google Assistant.

Allen, a public relations manager for an insurance company in the San Francisco Bay Area, rewound the clip, taken from the beta version of an upcoming iPhone software release. “You play it back and you hear someone who sounds like you, or a friend of yours, or someone in your family,” he says. He found the prospect exciting.


“A young Black voice owning that role in a lot of people’s homes is incredibly powerful,” Allen says. “It says that Black identity and African American identity have value, have legitimacy, and can be trusted here as a partner in searching for information.”

For many people, digital assistants like Siri or Alexa are daily guides to modern life. As they become increasingly powerful and appear in more devices—not just phones and tablets, but also smart speakers and cars—their voices have become a symbol of professionalism.

That’s why diversifying their voices is important, says Sherri Williams, a professor who studies race and media at American University. Today’s movies and TV shows are more likely than before to feature Black people in significant roles, according to UCLA’s annual Hollywood diversity report—but the same change hasn’t reached many of the voices that we hear, she says.

“When we think of racial representation, we usually think visually, not phonetically,” Williams says. Widely heard, trusted voices—everything from radio hosts to voice-over actors—have long been perceived as a particular subset of white people, and often male.

“When I think of Siri, I think of a voice that a lot of people hear, this omnipotent voice that’s also a voice of authority,” Williams says. “If we can make the voice of authority one that doesn’t always sound like white people, then that is progress.”

A Diversity of Voices

Apple hasn't commented publicly on who these new voices are meant to represent. In a statement, the company said only that the update is “a continuation of Apple’s longstanding commitment to diversity and inclusion, and products and services that are designed to better reflect the diversity of the world we live in.” The voices are computer-generated, but each is based on recordings from a single voice actor; the company did not provide any details about them.

The new voices will be available to iPhone users when Apple releases iOS 14.5 next week. In the Siri section of the Settings app, the two additions will be labeled “Voice 2” and “Voice 3.” They will join the two existing American-accented voices, along with male- and female-sounding voices meant to represent other English-speaking accents, such as Irish and Indian.

I asked Nicole Holliday, a professor of sociolinguistics at the University of Pennsylvania, for help understanding how people might categorize the new voices. She quickly put together a survey to gauge people’s reactions, gathering a race- and gender-representative sample of respondents.

Of the 470 participants in her survey, a slim majority said that Voice 2 and Voice 3 sound like Black or multiracial speakers. By contrast, about three-fourths of participants said that Voice 1 and Voice 4, the old Siri voices, sound like white speakers. Consumer Reports generated the clips with an iPhone loaded with the beta version of iOS 14.5. Holliday provided the text, a passage frequently used by linguists. (You can hear the passages in the clips below, which we generated separately for this article.)

[Audio: The New Voices of Siri. Apple is releasing two new voices with iOS 14.5. Clips of Voice 1, Voice 2 (new), Voice 3 (new), and Voice 4.]

Brandon Pamplin, a systems administrator in Pittsburgh, downloaded a preview version of Apple’s software update and set up Siri with one of the new voices, which he describes as the voice of a Black woman. “It’s more satisfying to hear,” he says. “I grew up with my mom sounding more similar to this Siri than old Siri.”

“It’s not something I ever knew that I wanted,” says Pamplin, who didn't participate in the survey. “But when it showed up, it’s like—maybe I did want it.”

Virtual Assistants and Identity

From a linguistic perspective, it's not easy to put the new Siri voices in a clear racial category, Holliday says. Many tiny cues in a person's speech offer hints at their background. The new Siri voices have "some voice quality and intonational elements that are more likely to occur in the speech of African Americans," she says, but they're shared by some speakers of other groups.

For example, she says, the voices display more vocal fry—a creaky quality often heard at the ends of words—and a greater range of pitch than the old default voice. According to Holliday, those features are sometimes associated with African Americans, but they're also typical of younger speakers from varied backgrounds.

What's more, it's particularly difficult to assign a race or an accent to a synthetic voice. A person’s speech patterns are the product of all sorts of environmental factors that shape who they are as an individual, including their age and ethnicity, and where they grew up. Siri is ageless and has no parents or hometown. “We’re trying to impute an entire identity onto this voice that’s not a person,” Holliday says.

The new Siri voices are drawing some negative reactions in addition to the positive ones. One person said on Twitter that the new female-sounding voice “seems like Siri is trying to be cool.” Another said that both new voices “sound . . . um . . . a bit airheaded.” And someone asked: “Where are the older, more mature voices that actually sound like someone that has been formally educated? If I wanted to listen to unintelligible teenagers, I’d be on YouTube.”

Holliday says those commenters may be reflecting widespread stereotypes about younger, nonwhite speakers. But those attitudes could soften if technologies like Siri gradually “normalize” a variety of voices, she says, expanding people’s conception of who sounds competent. “If people hear me and say, ‘Oh, you sound like the new Siri,’ maybe that’s not a bad thing,” says Holliday, who is Black. “Our voice assistants should be able to represent the diversity of who we are.”

While Apple will now offer more diverse voices for Siri, the devices still have a way to go in understanding the voices of Black people in everyday life, says Halcyon Lawrence, a Towson University professor who studies speech interfaces. Like other experts I spoke with, Lawrence pointed to a 2020 Stanford study that found that major speech recognition products, including Apple's, misunderstood Black users at nearly twice the rate that they misunderstood white users.

"I am very concerned about representation (who we hear on these devices)," Lawrence wrote in an email. "I am equally concerned about perception (who these devices hear and who they discipline to speak in a particular way)."

Users who update to iOS 14.5 will be asked to choose which Siri they’d like to hear on their iPhone. It will be the first time since the assistant was introduced nearly a decade ago that Siri won’t default to its famous female-sounding voice.

Allen, the PR manager, says he’ll pick the new male-sounding voice. Paul Anthony Webb, a web designer in Silicon Valley, prefers the new female voice.

“Black people, brown people, and everybody else have always had to use these white-sounding voices, because they were the only option,” Webb says. “This is a welcome change. It’s not something I would expect any company to actually care about. Black people tend to be an afterthought.”

Editor's Note: This article has been updated with new recordings of Siri voices, and the inclusion of a statement provided to CR by Apple. It was originally published on April 22, 2021.