The Biometric Battle Goes to Court

As states enact more data protection requirements, business risks grow

August 29, 2019

The use of our biometric data has become such a familiar part of everyday life that we no longer think twice about using a fingerprint or facial recognition to unlock our phones. And while most people are probably aware that face-scanning may be used for security purposes any time they visit a mall, concert, or airport, they likely give little thought to whether and when it is actually happening as they go about their daily lives. Like any new technology, the use of biometrics for identification or security purposes has raised a host of legal and ethical issues. A few states have begun to respond with biometric privacy laws, and as the stakes grow, many more are likely to follow suit.

Biometric Identifiers and Uses

Biometric identifiers are the distinctive, measurable characteristics used to identify individuals, and they may be physiological (related to the body itself) or behavioral. You've likely encountered one or more biometric security protocols without thinking much about it; the most common identifiers include fingerprints, facial recognition, DNA, palm prints, and iris and retinal scans. Other, less-recognized characteristics include the shape of the ear, body odor, the veins in one's hands, facial contortions, and the way someone sits and walks.

Biometric surveillance and security practices are becoming fairly common at national borders, airports, prisons, and other government facilities. Some forms of biometric identification are also regularly used by law enforcement and as a means of physical access control. The private sector is adopting these practices as well, often with some of the more advanced forms of the technology. Examples include voice recognition (currently used by Citibank), heartbeat monitoring (which the British bank Halifax has been exploring), use in cars (Ford is a good example), and biometric passport chips. With each passing day, organizations are finding new ways to use the data our own bodies automatically provide as a means of identification.

Legal and Ethical Issues

There are many obvious advantages to predicating data security on the characteristics of each unique human body. Biometric systems are not susceptible to compromise in the same ways as traditional passwords (which are rarely as arbitrary as they should be) or secret codes and keys that can be guessed or decrypted. Because of its accuracy and uniqueness, biometric information is often viewed as the most efficient and infallible lock for the things we need to keep safe, and safe forever, even if we forget the name of our first pet or childhood best friend (by now, it's safe to assume that everyone knows your mother's maiden name).

But biometric technologies are, like any other technology, susceptible to error and compromise. At bottom, a biometric system compares a live reading of a physiological or behavioral characteristic against stored information, a template generated by an algorithm when the individual first registered, and that comparison is a probability calculation. A biometric match does not represent certain recognition, but rather a probability of correct recognition, while a nonmatch represents a probability, rather than a definitive conclusion, that the individual does not hold the biometric keys to the lock.

It is inevitable that even the best-designed biometric system will sometimes be incorrect or indeterminate, producing both false matches and false nonmatches. The risk of the system granting access to an unregistered person (the "false acceptance rate") or denying access to a registered person (the "false rejection rate") can be significant to an organization's operations, data security, and overall integrity. Accordingly, even though the vulnerability of biometric systems is considered relatively low, any compromise or manipulation of biometric data may have serious consequences. Unlike conventional passwords or security codes, biometric data cannot be changed every 90 days. Once biometric data is accessed by a wrongful actor, that unique information may never again be private.
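To make that tradeoff concrete, the short Python sketch below shows how a single decision threshold applied to match scores produces both error rates. The scores and the 0.70 threshold are hypothetical values chosen only for illustration; they are not drawn from any real biometric system.

# Illustrative sketch: how a decision threshold on match scores yields a
# false acceptance rate (FAR) and a false rejection rate (FRR).
# All scores and the threshold below are hypothetical.

# Similarity scores (0.0 = no resemblance, 1.0 = perfect match) from a
# hypothetical matcher comparing live samples against stored templates.
genuine_scores = [0.91, 0.85, 0.62, 0.88, 0.79]   # registered users vs. their own templates
impostor_scores = [0.20, 0.45, 0.71, 0.33, 0.15]  # other people vs. those same templates

THRESHOLD = 0.70  # scores at or above this value are treated as a "match"

# FAR: fraction of impostor attempts wrongly accepted (false matches).
far = sum(score >= THRESHOLD for score in impostor_scores) / len(impostor_scores)

# FRR: fraction of genuine attempts wrongly rejected (false nonmatches).
frr = sum(score < THRESHOLD for score in genuine_scores) / len(genuine_scores)

print(f"False acceptance rate: {far:.0%}")  # 20%: one impostor slips through
print(f"False rejection rate:  {frr:.0%}")  # 20%: one legitimate user is locked out

# Raising the threshold lowers the FAR but raises the FRR, and vice versa;
# the system trades one kind of error against the other rather than
# eliminating both.

That threshold is the operational lever an organization can adjust, which is why neither error rate can be driven to zero without inflating the other.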

The implications of unauthorized access to, manipulation of, and ultimate use of the biometric data of millions of people are staggering. Biometric data already is freely given when individuals use their fingerprints to unlock mobile devices, use internet-of-things devices, interact on social media, clock in at work, or even apply for admission to the state bar. While many of us consciously use such systems to make daily life easier and more efficient, the widespread collection, use, and analysis of millions of individuals' biometric data by government and private enterprises largely has been criticized as overbroad and lacking sufficient transparency. For example, during Super Bowl XXXV in January 2001, the faces of 100,000 spectators were scanned and compared with pictures of suspected terrorists and criminals using biometric facial recognition. Perhaps even more concerning, ISIS already has used stolen biometric data in the form of counterfeit fingerprints for financial transactions.

People are starting to take notice and take action. As recently as May 2019, San Francisco became the first U.S. city to ban its government from using facial recognition technology. This development was particularly notable because the ban was put into place before the technology was ever actually used by San Francisco agencies, perhaps signaling the start of proactive rather than reactive regulation.

Biometric Privacy Laws and Litigation

The earliest example of a biometric privacy law is the Biometric Information Privacy Act (BIPA), which Illinois passed in 2008. Notably, BIPA does not actually prohibit the use or collection of biometric data; rather, it establishes a series of requirements meant to protect that data. First, BIPA requires private entities that collect biometric information to inform individuals why the data is being collected and how long it will be held, and to obtain a written release from the individual before collecting it. Second, the act states that biometric data cannot be sold, traded, leased, or otherwise used for profit. Third, BIPA provides that biometric data cannot be shared without the individual's consent unless disclosure is required by law. Lastly, and most importantly, the act creates a private right of action that allows any individual aggrieved by a violation to sue. BIPA permits recovery of $5,000 for each intentional or reckless violation and $1,000 for each negligent violation of the act.

BIPA has served as a model for other states considering biometric privacy legislation. Recent Illinois court decisions, however, have fueled a nationwide debate about the inclusion of a private right of action in such legislation. Specifically, it was once thought that an individual must show actual harm in order to have standing to sue under BIPA. The Illinois Supreme Court, however, unanimously disagreed.

In Rosenbach v. Six Flags Entertainment Corp., a mother sued on behalf of her 14-year-old son, alleging violations of BIPA. Her son was asked to scan his thumb in order to receive a season pass at the Gurnee Six Flags location, and neither the boy nor his mother was told why the data was being collected or how long it would be stored. The mother argued that this lack of information regarding the use of her son's fingerprints violated the act. Six Flags argued that it could not be held liable unless the plaintiff could demonstrate an actual injury or adverse effect beyond a mere violation of rights under the statute.

However, the Illinois Supreme Court unanimously held that a person need not have sustained actual damages, and that a private entity that collects biometric information can be found liable for a violation of the statute alone. The court reasoned that "whatever expenses a business might incur to meet the law's requirements…are likely to be insignificant compared to the substantial and irreversible harm that could result if biometric identifiers and information are not properly safeguarded."

The Illinois Appellate Court reiterated the Rosenbach holding when a creative defendant attempted to narrow its application. In Rottner v. Palm Beach Tan Inc., Rottner had purchased a membership from Palm Beach Tan and alleged that the establishment required her fingerprint each time she sought to use its services. She argued that this violated BIPA because she had never been informed of the defendant's biometric data retention policy and was never asked to sign a release. Palm Beach argued that Rosenbach did not resolve whether a party who alleges only the collection of biometric data in violation of the act, with no further injury, may recover damages. The Illinois Appellate Court rejected the argument, stating that "the Illinois Supreme Court was clear" that a violation of any portion of the act alone was sufficient for standing and that no actual damages needed to be shown.

In response to the Rosenbach decision, Illinois Senate Bill 2134 was introduced on Feb. 15, 2019, to amend BIPA by deleting the language that creates the private right of action. Under the proposed amendment, violations arising from a private entity's collection of biometric information would instead be subject to the enforcement authority of the Department of Labor, and an employee or former employee could file a complaint with the department within one year of the alleged violation.

Other states have begun to follow Illinois' lead. Texas and Washington have passed biometric privacy laws, and the California Consumer Privacy Act takes effect on Jan. 1, 2020. Additionally, Arizona, Florida, and Massachusetts have proposed legislation addressing biometric privacy. The debate centers on whether enforcement should rest solely with a state's attorney general or whether the law should create a private right of action allowing individuals to enforce it on their own or through class actions.

Looking Forward

Given the issues inherent in the collection and use of biometrics, insurers must counsel their clients to educate themselves fully about both the latest state laws and best practices. Employers should proactively establish safe practices, using BIPA as a guide, and must continually watch for new developments. Regardless of where in the country your business is located, you can expect increased government enforcement and private litigation as more states enact biometric protections, and it's safe to say that new compliance obligations are also likely. Questions of ethics, privacy, and other risks will only continue to be raised, and we anticipate that the courts will be instrumental in setting the guidelines for responsible collection, storage, and use of biometric data going forward.

About The Authors
Eileen King Bower

Eileen King Bower is a partner in the Chicago office of Clyde & Co. eileen.bower@clydeco.us

Julie Hawkinson

Julie Hawkinson is a partner at the Los Angeles office of Clyde & Co. julie.hawkinson@clydeco.us

Theresa Le

Theresa Le is senior counsel at the San Francisco office of Clyde & Co. theresa.le@clydeco.us

James Moffitt

James Moffitt is an associate at the Chicago office of Clyde & Co. james.moffitt@clydeco.us
