An illustration of Facebook users being identified through facial recognition.

Facebook has agreed to pay $550 million to settle a class-action lawsuit that alleged the company’s use of facial recognition technology violated the Illinois Biometric Information Privacy Act. The case marks one of the largest cash settlements ever reached in a privacy lawsuit.

The Illinois Biometric Information Privacy Act, or BIPA, requires companies to obtain consumers' explicit consent before collecting or sharing biometric information, such as facial recognition or fingerprint scans.

Plaintiffs argued that Facebook violated the law because it failed to get consent before generating scans, or “templates,” of users' faces, which it uses to identify the subjects of photos for tagging suggestions and for some security features.

Facebook also holds a number of patents to use facial recognition for targeted advertising and other purposes, but says it doesn’t currently use the technology in those ways.

“We decided to pursue a settlement as it was in the best interest of our community and our shareholders to move past this matter,” Facebook spokesperson Dina El-Kassaby said in an email.


Facebook declined to provide additional information, but the company has long maintained that it is clear with consumers about how the facial recognition systems work and how consumers can limit their use.

Consumer advocates argue that biometric data is particularly sensitive because if it falls into the wrong hands, there’s nothing consumers can do about it—you can change a password, but you can’t change your face. Facial recognition is also controversial because it can be used to identify and track people in public spaces without their knowledge or consent.

“I’m hopeful that this case is a turning point for privacy litigation. Technology is advancing at a rapid pace, and corporations need to realize that they better tread carefully when it comes to recognizing, tracking and monitoring us,” Paul Geller of Robbins Geller Rudman & Dowd LLP, one of the attorneys for the plaintiffs, said in an email. “This settlement shows that privacy issues are quite real, and consumers have the right and the will to protect their rights in court.”

Facebook did not respond to questions about how its practices may change, but the terms of the settlement require Facebook to obtain full consent from Illinois consumers before collecting biometric information going forward, according to a press release from Edelson PC, another law firm involved in the case.

A Model for Other State Laws

The case serves as the first major test of the Illinois law, which is one of the country's few privacy regulations that give consumers the right to sue a company for privacy violations, a legal concept called a “private right of action.”

In jurisdictions where that right doesn't exist, consumers don't automatically have a right to sue a company for breaking the law. They could still try to sue—but they'd have to prove that they'd been materially harmed by a company's actions. 

“The plaintiff says ‘I was harmed,’ and the defendant says, ‘show me the receipt.’ You often can't point to a dollar and cents consequence,” says Matthew Kugler, an associate professor at the Northwestern Pritzker School of Law. “Even in a data breach case where credit card numbers are stolen, it's sometimes hard to prove that it was the data breach that caused the harm. It’s been a perpetual issue in privacy litigation.”

For that reason, enforcement of privacy laws is usually left to state attorneys general and the Federal Trade Commission, which often lack the time and resources, according to consumer advocates and legal experts.

“The Illinois Biometric Information Privacy Act is a prime example of what a strong privacy law should look like,” says Maureen Mahoney, a policy counsel for Consumer Reports. “The private right of action lets consumers hold companies accountable for violating their rights. Without it, companies often aren’t properly incentivized to comply with the law.”

That problem may be playing out right now in California, Mahoney says. The California Consumer Privacy Act, or CCPA, the country's most stringent privacy law, went into effect on January 1. It includes a private right of action, but only in narrow cases of data breaches that stem from negligent security practices. It doesn’t apply to the broad privacy rules the rest of the law addresses, and that may be encouraging some companies to ignore the requirements—at least for now.

Will Another Payout Fix Facebook?

The $550 million settlement is a historic but relatively small amount in the context of the company’s overall finances. Facebook announced the settlement Wednesday as part of its latest earnings report. The company reported about $21 billion in revenue and more than $7 billion in profit just in the last three months of 2019.

It remains to be seen how many consumers will join the class in the Illinois case, but attorneys for the plaintiffs suggest each individual class member may get $200 or more from the settlement.

However, the numbers could have been even higher if the plaintiffs had won their case at trial.

The Illinois Biometric Information Privacy Act allows for $1,000 to $5,000 in damages for each violation of the law. Because the law hasn’t been tested in court, it’s unclear how alleged violations would be counted. With millions of Facebook users in Illinois, a judgment could theoretically have landed in the billions of dollars.
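To see how quickly the statutory arithmetic escalates, here is a minimal sketch. It assumes one violation per user, and the user count is a round illustrative figure, not a number from the case:

```python
# Illustrative BIPA damages estimate. Assumptions (not from the case):
# one violation counted per user, and a hypothetical 6 million
# affected Illinois users. BIPA provides $1,000 per negligent
# violation and $5,000 per intentional or reckless violation.

def bipa_damages_range(num_users: int,
                       per_negligent: int = 1_000,
                       per_intentional: int = 5_000) -> tuple[int, int]:
    """Return (low, high) total statutory damages in dollars."""
    return num_users * per_negligent, num_users * per_intentional

low, high = bipa_damages_range(6_000_000)  # hypothetical user count
print(f"${low:,} to ${high:,}")  # $6,000,000,000 to $30,000,000,000
```

Even under the lowest per-violation figure, the hypothetical total dwarfs the actual $550 million settlement, which helps explain the uncertainty over who came out ahead.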

“I'm almost not sure who won in this case,” said Northwestern’s Matthew Kugler.

The Illinois lawsuit isn’t the first time Facebook’s facial recognition practices have landed the company in hot water.

A 2019 Consumer Reports investigation found that some Facebook users never received a setting to turn off facial recognition that was available on most Facebook accounts. Last summer, the Federal Trade Commission cited CR's findings in its announcement of a multifaceted settlement with Facebook that included a $5 billion fine for a number of privacy violations.

As part of the FTC settlement, Facebook is required to provide “clear and conspicuous notice” of its use of facial recognition technology, and obtain affirmative express user consent before it uses facial recognition for any new purposes. The company announced a fix for the problem in September of last year. A Facebook spokeswoman later said that the setting had finally rolled out to the affected accounts, which numbered in the tens of millions, according to the FTC.