Sydney Kate Watson, asst. arts & living editor

Everyone has had times when being unrecognizable is an advantage. Maybe you are too emotionally drained to speak with anyone, or you have worn your house shoes to the store. More seriously, perhaps you are trying to hide from an abuser or a stalker. The Fourth Amendment guarantees "the right of the people to be secure in their persons." However, thanks to Clearview AI, that is no longer the case. Anonymity is dead unless the court system and Congress act against Clearview AI and other predictive facial recognition vendors.

Thanks to the guidance of Dr. Brian Carroll, I was able to discover and research the predictive facial recognition vendor known as Clearview AI. The company was not widely known to the public until January 2020, when a New York Times article exposed it. To explain the complexity of the issue, I will describe biometrics and what Clearview is doing, touch on Illinois's Biometric Information Privacy Act of 2008 and discuss the importance of the ACLU v. Clearview AI case. This might seem like a complicated issue that does not concern you, but it does. If you enjoy privacy in any capacity, this issue pertains to you.

Biometrics include aspects of your biology such as fingerprints, eye prints and face prints. These identifiers are almost impossible to change; you could undergo complex reconstructive facial surgery, but that is out of the average person's reach. This makes it imperative that these personal identifiers be protected. Clearview AI is an application that scrapes privately owned social media sites for images. The company then uses its algorithm to create a mathematical face print of each face in those images and stores the face prints in a database, one that has already been breached and is subject to further breaches. In total, the company has about three billion images stored in its database.

Through the Clearview AI application, a user can scan someone's face, and the app predicts who that person is by comparing their face print against its database of face prints. The technology is advanced enough to identify people who accidentally photobomb someone else's photo: even if their head is no bigger than a fingernail in the image, Clearview AI can identify them. The application is now sold to law enforcement, but before the company was exposed, anyone could pay for access.
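For readers curious about the mechanics, the sketch below is a toy illustration only, not Clearview's actual software or any real product's code. It assumes, as such systems generally do, that a face image can be reduced to a fixed-length vector (the "face print") and that identification is simply a nearest-neighbor search over a database of those vectors; the embed_face function here is a stand-in for the neural network that would do the real work.

```python
# Illustrative sketch: a toy "face print" matcher. NOT Clearview's code or any
# real library's API; embed_face is a fake stand-in for a face-embedding model.

import numpy as np

EMBEDDING_SIZE = 128  # real systems use fixed-length vectors of roughly this size

def embed_face(image_pixels: np.ndarray) -> np.ndarray:
    """Stand-in for a neural network that maps a face image to a face print.
    Here we just hash the pixels into a deterministic unit vector."""
    rng = np.random.default_rng(abs(hash(image_pixels.tobytes())) % (2**32))
    vec = rng.standard_normal(EMBEDDING_SIZE)
    return vec / np.linalg.norm(vec)

# The "scraped" database: face prints keyed by the name attached to each photo.
database: dict[str, np.ndarray] = {}

def enroll(name: str, image_pixels: np.ndarray) -> None:
    """Store the face print computed from a scraped photo."""
    database[name] = embed_face(image_pixels)

def identify(image_pixels: np.ndarray) -> tuple[str, float]:
    """Predict who a new photo shows: nearest neighbor by cosine similarity."""
    query = embed_face(image_pixels)
    return max(
        ((name, float(query @ face_print)) for name, face_print in database.items()),
        key=lambda pair: pair[1],
    )

if __name__ == "__main__":
    photo_a = np.zeros((64, 64), dtype=np.uint8)  # pretend scraped photos
    photo_b = np.ones((64, 64), dtype=np.uint8)
    enroll("Person A", photo_a)
    enroll("Person B", photo_b)
    print(identify(photo_a))  # -> ('Person A', 1.0): the same face matches itself
```

The point of the sketch is that once the vectors exist, matching a stranger's face against billions of stored face prints is a routine database lookup, which is why the scale of the collection, not the cleverness of the search, is what raises the privacy stakes.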

Clearview AI is doing this without anyone's consent. Clearview claims that because the images are public, using them is its First Amendment right. I argue, however, that its use of predictive technology is a violation of the Fourth Amendment. Yes, you can walk down the street, see different faces and recognize people. Your brain, however, is not running an algorithm that maps distinctive points on each face and then searches a backlog of every face you have seen for a matching face print.

A national Biometric Information Privacy Act (BIPA), modeled on Illinois's BIPA, is also needed to protect Americans' biometrics. Illinois's BIPA is one of the only laws in the country that gives citizens power over what happens to their biometrics. The three major requirements that make it an archetype are: the collector of biometrics must notify the consumer, in writing, that their biometric information is being collected; the collector must inform the consumer of the purpose of the collection, including how the data will be used and how long it will be stored in a database; and the collector must receive written consent from the consumer for the practices listed in that notice.

Clearview AI's actions are questionable. In May 2020, the American Civil Liberties Union (ACLU) filed suit against Clearview AI in Illinois for violating Illinois's BIPA. Clearview AI has asked the court to dismiss the case on First Amendment grounds. The ACLU, which represents undocumented immigrants, sex workers and survivors of abuse, argues that Clearview AI's predictive face print technology is not worthy of First Amendment protection. If the case were dismissed, privacy law in America would change forever, and anonymity would die. The case also has the potential to reach the United States Supreme Court, especially considering that Clearview AI has hired Floyd Abrams, the famed First Amendment lawyer who argued for The New York Times in New York Times Co. v. United States (1971).

On the other hand, if the case does proceed through the court system and is ruled in favor of Clearview AI, our privacy laws will also change for the worse. More companies like Clearview AI will appear, harvesting and using our biometric information without a second thought. The United States will become a place where no one can walk down the street anonymously, stalkers and abusers will gain power, and mobs will be able to identify and target immigrants, sex workers and members of the LGBTQ+ community. No one will be safe in their own skin.

I do not want to live in a world without privacy, and I hope you do not want to either. Please educate yourself on the issue and write to your politicians urging them to pass a national BIPA. If you are interested in learning more about privacy law or Clearview AI, I am always happy to discuss the situation and the case law supporting my position. We cannot let anonymity die in darkness.
