Big Tech Backs Away From Supplying Facial Recognition to Police
Amazon, IBM, and Microsoft announce big changes as pressure mounts for regulation of the technology
In the past four days, Amazon, IBM, and Microsoft have announced major shifts in their facial recognition businesses. Amazon and Microsoft say they will temporarily stop providing their software to police departments, and IBM plans to stop working on the technology entirely.
Privacy and racial justice advocates, who argue the technology can contribute to excessive surveillance and mistaken arrests, particularly among darker-skinned people, welcomed the announcements. But they say an emerging regulatory debate may ultimately determine whether the police can use facial recognition, and under what conditions.
“These companies got dragged kicking and screaming to this moment, but it’s still a win for people who are skeptical of facial recognition,” says Justin Brookman, director of privacy and technology policy at Consumer Reports. “However, self-regulation by public shaming is not a long-term strategy. Laws need to be in place to protect people.”
On Monday, Congressional Democrats introduced a bill, the Justice in Policing Act, that would bar police from using facial recognition software to scan body camera footage without a warrant.
Facial recognition is widely used in law enforcement and can be a helpful tool in catching criminals and finding missing or abducted children. Amazon, a leader in developing the technology, didn’t say it would stop working with federal clients, such as the FBI and immigration officials.
Clearview AI, a facial recognition company focused on law enforcement clients, and Japan’s NEC, which provides facial recognition services to governments and private companies around the world, would not comment on their business plans. Both argue that facial recognition can be a useful tool when used responsibly.
A Push for New Laws
Big technology companies and privacy advocates both say they want the federal government to regulate facial recognition.
“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” Amazon announced in a press release. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules.”
Microsoft president Brad Smith made a similar statement at a press event Thursday morning.
“We will not sell facial recognition technology to police departments in the United States until we have a national law in place, grounded in human rights that will govern this technology,” he said.
However, some advocates question the tech industry’s motives in calling for regulation.
“The specifics of Amazon and Microsoft’s statements are telling,” Evan Greer, deputy director of the advocacy group Fight for the Future, said in a statement. “They’ve been calling for the Federal government to ‘regulate’ facial recognition, because they want their corporate lawyers to help write the legislation.”
In Washington state, a recently passed facial recognition law was written by State Sen. Joe Nguyen, who is also a Microsoft employee. Advocates say the law should have provided stronger protections.
A few other states, most notably Illinois, have wide-reaching rules limiting the collection and use of facial recognition, but they mainly regulate private companies, not police departments.
“One of the most disturbing parts about police use of facial recognition is that we wouldn’t know if it’s been used on protesters,” says Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project, an advocacy group based in New York City. “In most jurisdictions, there’s no requirement for police departments to acknowledge when these tools are being used.”
A handful of cities, notably San Francisco, have banned the use of facial recognition by municipal agencies, including the police. Boston is considering similar legislation.
Rogers, the Brooklyn tenant, is skeptical that either tech companies or politicians will adequately address what he sees as excessive police surveillance.
“Even when government officials are forced to pay attention,” they often fail to have “a proper conversation with the communities that are actually affected,” he says. “I have to get involved because if I don’t, the government is going to troubleshoot it from the big stakeholders’ perspective, rather than the perspective of the communities of people who look like me.”