With Biometrics, Technology’s Double-Edged Nature Is On Full Display

Surveillance is one aspect, but identity verification is more common.

Whether people react negatively or positively to biometrics may depend on the stories they’ve heard.

Did they read about the man in Russia who took out his garbage during a pandemic lockdown, only to have the police show up at his door 30 minutes later? The man had been identified using biometric facial recognition technology. Needless to say, some people might find the story alarming, a Big Brother scenario.

But it’s hardly that simple. What if people had heard about the Las Vegas man who sexually abused a 7-year-old girl? The man was apprehended — and sentenced to 35 years in prison — thanks to similar facial recognition technology. Less Big Brother, more big justice.

Opinions probably vary as much as the types of information considered to be biometric. Merriam-Webster defines “biometrics” as “the measurement and analysis of unique physical or behavioral characteristics (such as fingerprints or voice patterns) especially as a means of verifying personal identity.” The statutes that regulate biometric technology, however, each define the term differently.

Surveillance is one aspect of biometric technology, but identity verification is more common. Many people have used a thumbprint to unlock a smartphone or access a mobile banking application. A company that deals with sensitive or confidential information might use biometric identity verification when its employees access a work platform. U.S. Customs and Border Protection already uses biometrics to move people through airport security.

“The pandemic accelerated an already booming biometric industry as the technology becomes more readily available, more useful, and more popular,” said Nicola C. Menaldo, a partner in the Seattle office of Perkins Coie LLP, who defends clients in litigation and counsels them in areas including biometrics, scraping and web crawling, and artificial intelligence.

Biometrics have a number of pandemic-specific uses, Menaldo said. “For example, technologies that might be able to detect whether someone is wearing a face mask. … There is also a huge need for contactless entry into various places, so I think that has driven demand for products and services that rely on facial recognition and detection.”

Menaldo advises those who use these systems to closely consider the laws and regulations that govern them, which plaintiffs and regulators may interpret in very strict and unforgiving ways. In addition, she observes that best practices when gathering personal information like biometrics often involve “providing appropriate and contextual notice and being transparent about data collection activities.”

Keeping up with the law is not without its challenges. Biometrics are at the center of a lot of legal wrangling. One problem is that the definition varies from law to law, according to James G. Snell, a partner in the Palo Alto office of Perkins Coie and faculty chair for the Practising Law Institute’s Twenty-Second Annual Institute on Privacy and Cybersecurity Law, coming in May.

For example, Snell said, the definition of biometrics in the California Consumer Privacy Act differs from the definition in the state’s data breach statute, which in turn differs from the definition in state regulations, though it’s not yet clear what substantive differences, if any, will arise in the law from these discrepancies. Snell also noted that what’s going on in California is nothing compared with what’s happening in Illinois, where the Illinois Biometric Information Privacy Act has inspired a wave of class action litigation.

According to Menaldo, more than a thousand class actions have been filed under the Illinois law since 2015, and settlements worth tens or hundreds of millions of dollars have been publicly announced. Most of the lawsuits have been filed against small companies with relatively few employees that use fingerprint technology to track when workers clock in and out.

Large companies have not been immune, however. These cases “range from issues involving fingerprints to voice prints and analysis of voice, to scans of facial geometry and what that means under the statute when the data is used for creating AI models and algorithms,” Menaldo said. “It really runs the gamut. … All of these cases are in their early stages, so there isn’t a lot of guidance yet from the courts.”

Similar disputes have arisen at the federal level. In January, the Federal Trade Commission announced a settlement with Everalbum Inc. over “allegations that it deceived consumers about its use of facial recognition technology and its retention of the photos and videos of users who deactivated their accounts.”

The FTC’s press release on the settlement says the company “must obtain consumers’ express consent before using facial recognition technology on their photos and videos. The proposed order also requires the company to delete models and algorithms it developed by using the photos and videos uploaded by its users.”

The Everalbum action did not involve a specific biometric law, according to Rebecca S. Engrav, a partner in the Seattle office of Perkins Coie LLP and faculty for PLI’s upcoming Annual Institute on Privacy and Cybersecurity Law.

The law available to the FTC was Section 5 of the FTC Act. So, the FTC’s “consent order against Everalbum just proceeds under its general unfair and deceptive business practices authority,” Engrav explained. She said that “using its general Section 5 authority is normal for the FTC when encountering new technologies, but what is curious is that in the Everalbum consent order, the FTC crafted out of whole cloth a definition for the term ‘biometric information.’ It defined this concept as ‘data that depicts or describes the physical or biological traits of an identified or identifiable person, … including images.’”

Engrav is concerned that, on its face, this definition appears to sweep any ordinary photo into the concept of “biometric information.” She thinks these issues need a lot more thought before “we go down that path, because photos as technology have been around for decades and decades, and are used by businesses, governments, law enforcement, all kinds of people. And there’s a whole body of law and cultural norms that exist regarding photos, such as distinctions based on where the photo is taken.”

With proper care and caution, Engrav said, “we could probably land at better laws that are more nuanced and really adequately address the harms without needlessly stopping the good.”

A federal law that addresses biometrics could be on the horizon. California has updated the CCPA, and Virginia has passed a data privacy law of its own. Both will largely go into effect in 2023, and other states, like Washington, are considering privacy bills. State action often pushes federal action. “The current view is that there’s not likely to be a federal privacy bill this year. There may be one in 2022, though things are evolving quickly in this area,” Snell said.

He emphasized the need for legislators not to go too broad. One company, he noted, got in trouble for using publicly available photographs of people without their consent to develop algorithms for facial recognition technology. A press account painted this as a violation of privacy.

“I think reasonable minds can differ as to what should have been done or what sort of disclosure should have been made,” Snell said. “But the article suggested that these photos were all jumbled up somehow in the algorithm, and that just isn’t the case.”

In reality, the final algorithms were completely anonymized, containing nothing but data sets of traits.

Engrav also stressed the need to be clear about the nature of biometric technologies before jumping to conclusions. She noted that video surveillance has been used in the United States for decades. While she understands concerns about “massive facial recognition on the scale that we hear may be being done in China, I think we owe it to ourselves and to lawmakers … to really be precise about what’s different and articulate what we think the law should be.”

In essence, Engrav said, “the laws shouldn’t be technology specific. Whatever the laws are about recording of images in public spaces, they should be technology agnostic.” 

Menaldo agreed. “I think what privacy lawyers see all the time is law lagging far behind technological advancement, which creates all sorts of problems and unintended consequences. And I think [a law] that targets a technology—as opposed to a problem or principle—is going to have that issue.”

Facial recognition today, Menaldo said, is nothing like what facial recognition will be 10 years from now. “It’s a fool’s errand to try to legislate based on today’s technology.”



Elizabeth M. Bennett was a business reporter who moved into legal journalism when she covered the Delaware courts, a beat that inspired her to go to law school. After a few years as a practicing attorney in the Philadelphia region, she decamped to the Pacific Northwest and returned to freelance reporting and editing.