Facial Recognition Technology 

If a person’s birthdate, Social Security number, health records, and bank account numbers are at least somewhat protected by privacy laws, why not images of their face?

For now, facial recognition is used to unlock devices such as iPhones, to match travelers with their passports for airport security, and to verify identity for entry to secure facilities like military bases and research labs. Most of those uses are consensual, limited in scope, and conducted with the individual’s knowledge.

Each face has a unique set of characteristics. When paired with other data accessible through public databases, those characteristics make it possible to assemble a nearly complete composite of an individual, including where that person can be found. Given the information lost in cyber attacks on financial databases like Experian’s, it isn’t difficult to create a realistic fake I.D. or passport using a headshot pulled from a facial recognition database. Add in some of the latest cutting-edge video editing technology and we’re teetering on the edge of a dystopian novel turned reality.

Indeed, many people are finding out too late that their personal photos have been scraped from photo-sharing sites like Flickr to build commercial facial recognition databases – or that a neighbor’s Nest doorbell identifies them as they walk past, or that airlines can access Homeland Security’s facial recognition database in lieu of boarding passes. Even legislators were alarmed about the potential for identity theft when a government contractor’s database of faces and related license plates was hacked and thousands of images were stolen. Suddenly, the Fourth Amendment to the U.S. Constitution, widely interpreted as guaranteeing a right to privacy, as well as the First Amendment’s freedoms of speech and association, are being tested against the latest technologies.

By one estimate, half of all American adults appear in facial recognition databases. That pool includes the 4,000 people New York authorities arrested after using facial recognition to investigate fake driver’s licenses, comparing license photos and finding multiple duplicates.

Laws governing facial recognition

There are currently few laws governing the use of facial recognition databases and technology in the U.S., even in courtrooms. At least 16 states share driver’s license photos with the F.B.I. for their searchable database. The technology has been used at big events like the Super Bowl to scan crowds for wanted criminals and terrorists but is also being used commercially, and usually without the individual’s knowledge or consent.

Europeans’ right to digital privacy – including, possibly, the right to have their images scrubbed from commercial (but not government) databases – is covered under the General Data Protection Regulation (GDPR), but Americans have no equivalent. Even that protection may not extend to law enforcement: as a dispatch published by Time magazine reported, police in a town near London (where as many as 500 security cameras record people’s everyday movements) forcibly photographed a man and gave him a misdemeanor fine for covering his face rather than allowing cameras to capture his image.

In the U.S., California is leading the charge for biometric privacy, covering fingerprint, voice recognition, retinal scan, and facial recognition data. Called the California Consumer Privacy Act (CCPA), it is similar to the GDPR in that it puts individuals in charge of their personal data: data cannot be collected without express consent, and it cannot be retained by companies that have no relationship with the individual. The law takes effect in January 2020. Since its passage, San Francisco has enacted local legislation barring city agencies, including the police, from using facial recognition technology.

Illinois, Washington, and Texas have biometric privacy laws – Illinois’ Biometric Information Privacy Act is the best known – that require express consent for collecting biometric data, secure storage, and eventual destruction of the data. Illinois’ law also allows individuals to sue for damages if such data is misused.

Big data sees opportunities

Companies like Amazon are touting their technology to investors and customers as a tool for verifying the identity of mobile banking users, indexing image files, locating human trafficking victims, and nabbing wanted criminals. More broadly, they could sell businesses the ability to use facial recognition to identify special categories of customers – information that can then be matched with public records and social media to create a profile of each customer’s income, age, and shopping habits.

Pitfalls of facial recognition

In reality, nascent facial recognition technology still produces many erroneous matches, particularly for people of color. It was used to charge a man with a drug crime in Florida even though his attorneys discovered that the photo police obtained through facial recognition had a low probability of being an accurate match. Regardless, more than 200 law enforcement officials in Florida employ such questionable facial recognition matches, which could put people in prison, despite complaints from civil rights organizations like the ACLU.

In another case, a young man was arrested and charged with theft from a Boston Apple store that occurred on a day he was at his high school prom. He filed a $1 billion lawsuit charging that Apple misidentified him as the person who stole items from its stores in New York and Boston. The store’s facial recognition technology matched the thief’s face to the wrong person’s identification, he claims, causing him anxiety and distress when he was accused.