Every day, your movement is tracked. Your purchases are logged, your searches saved. And increasingly, your face is scanned.
Facial recognition technology is becoming more widespread daily, and governments are finding new applications in the midst of the coronavirus pandemic. Privacy International reports that 24 countries have already implemented location tracking to help ensure compliance with quarantines.
Were you thinking that face masks might help protect your privacy? China’s facial recognition algorithms have already figured out a way around them. In January, The New York Times reported that a company called Clearview AI has created a database that makes it possible to snap a photo of a stranger and reveal that person’s identity.
The technology was developed using more than three billion images scraped from public social media accounts by Hoan Ton-That, an Australian who HuffPost revealed has collaborated with anti-immigration alt-right operatives. Elements of Clearview AI are in use by more than 600 law enforcement agencies in North America—including the FBI, Department of Homeland Security, and ICE.
So can we resist the surveillance society? Should we?
Kate Rose says yes.
“I think you have a right to consent to how your information is used, especially if it’s meant to be at some point used against you or used extrajudicially,” says Rose, the cybersecurity analyst and fashion designer who founded Adversarial Fashion, a line of surveillance-resistant clothing. Its wares include masks meant to block facial recognition cameras, and shirts patterned with fake license plates meant to feed bad data into automated license plate readers.
Rose’s concern about extrajudicial use of personal data is more plausible than ever in the age of coronavirus lockdowns.
Politico reported in late March that the Department of Justice has asked Congress to pass a law allowing indefinite detention without trial of U.S. citizens during national emergencies. (The legislation has yet to advance.) Unauthorized movements picked up by surveillance could theoretically be a pretext for such indefinite detention.
“Privacy rights need to be more enshrined,” says Rose, “in terms of protecting your right to any data collected about you [requiring] a warrant before it is used.”
Rose is one of several designers trying to fight surveillance with fashion.
While her license plate shirts and dresses disseminate bad data, other anti-surveillance designers use fashion as a form of obstruction, such as camouflaging makeup or sunglasses that confuse facial recognition systems.
“I really love how people are exploring the different ways to counter surveillance technology and to empower people to do so,” says Electronic Frontier Foundation (EFF) researcher Dave Maass. “But at the end of the day, people should not have to wear a mask or put on face paint or wear, like, complicated t-shirt patterns in order to protect their privacy. Our government should be protecting our privacy.”
Maass and his EFF colleagues successfully lobbied the California legislature to pass a law that, starting in 2020, puts a three-year moratorium on law enforcement's use of facial recognition technology, including by departments that were experimenting with Clearview AI. It means that law enforcement agencies in San Diego County will have to stop using a shared facial identification system available to officers on hand-held tablets.
The San Diego Sheriff’s Department “was one of the first agencies that we identified…using mobile biometric technology…face recognition that they could use from the palms of their hands,” says Maass. The data didn’t stay local. According to Maass, San Diego, a border county, regularly shared access with the federal government, including Border Patrol and ICE.
“And we don’t know how those agencies use that technology. We do know they used it, but we’d have no idea what their purposes were,” says Maass.
San Francisco and Oakland have outright banned the use of facial recognition technology by law enforcement. Some technologists think such bans are overreactions.
"Suspending A.I. [artificial intelligence] facial recognition like San Francisco and Oakland…is idiocy to be honest. And lives will be lost," says Zoltan Istvan, a tech writer and self-described transhumanist who is currently seeking the Libertarian Party's vice-presidential nomination. Istvan believes that humans should celebrate and embrace the disruptive capabilities of technology to modify the human body and experience. He even implanted an RFID chip in his hand that allows him to unlock his front door.
Facial recognition technology “is going to be very useful to the human race,” says Istvan, “but we just kinda got to get over it being creepy.”
Istvan envisions authorities using facial recognition and other artificial intelligence–driven surveillance tools to prevent terrorist attacks by recognizing abnormal behaviors or suspicious individuals in crowds. Or to aid the government in fighting human trafficking.
Governments around the world are deploying other biometric surveillance tools as well, such as gait recognition and scanning for elevated body temperatures to isolate feverish individuals in a pandemic.
“Let us look at what [surveillance] can do for overcoming criminality in our cities. Let us look at what it can do for the overall safety,” says Istvan.
FaceMe is one example of such a security application. The developers originally marketed the software for virtual makeup demonstrations before it evolved into a product serving a wide range of uses, such as logging into apps, entering a secure facility, and identifying intruders. FaceMe's general manager Richard Carriere says the software achieves a precision level of up to 99.58 percent, which he claims makes it the only company outside China or Russia with such accurate results.
Although the majority of the company's clients are in the private sector, it has supplied technology to governments around the world. Carriere agrees with Istvan that facial recognition technology could be a giant boon to public safety while also decreasing the likelihood of police interactions turning violent.
“If I’m a citizen and cops come to me, I’d be very happy for them to know who I am even before they come to me,” says Carriere.
Carriere pledges that the company won’t sell its technology to repressive governments or agencies.
“I’d like to believe that we would only associate ourselves with police forces or law enforcement organizations that are respectful of individual rights,” says Carriere.
But U.S. law enforcement agencies are already showing a lack of accountability in how they use facial recognition technology. The police department in Chula Vista, California, failed to properly report to a federal oversight committee how it was using a facial recognition program, according to a fired whistleblower.
The Chula Vista Police Department declined our interview request.
"Police are very enthusiastic about adopting the technology, but they're not very enthusiastic about doing the due diligence of recording when this technology has been used, when it has been accessed, auditing the use of the technology, doing all the things that you would need to do to protect people's data," says Maass. "They want to collect it all, but they don't really care about protecting it all."
Maass worries about China’s use of facial recognition surveillance in conjunction with a state-run social credit system, which assigns citizens a numerical score based on their behavior. China has also rolled out increased pandemic-related surveillance that monitors for fevers and flags individuals not wearing protective face masks during an outbreak.
“The thing that we can learn from China is that this surveillance, as it continues to grow, is going to be less and less about public safety and more and more about controlling people,” says Maass.
But Istvan believes that it’s possible to deploy facial recognition surveillance without emulating China.
“I think the social credit system that China is using is absolutely awful,” says Istvan. “They’re setting such a bad example for the rest of the world that everyone’s turning their back against A.I. facial recognition. There is a good way to use it.”
Istvan believes that, ultimately, our entire conception of privacy will need to be revised.
“I believe in a society that’s totally transparent, a society where sort of everybody can see what everybody is doing,” says Istvan, who advocates a law requiring body cams that constantly record police officers while on duty and surveillance of all political figures when they are acting in an official capacity. “Privacy, I believe, really does steal our liberty away. It’s transparency that’s going to give us all the freedoms we want.”
“I do think conceptions of privacy are changing, but I think they’re strengthening,” says Maass. “Post–Clearview AI…people are concerned and outraged…and people will probably make different decisions on how they control their data online as a result of it.”
Rose thinks that as the technology becomes more powerful and present, Americans will need to take a page from the protesters in Hong Kong, who have used face masks, encrypted communication, and, most importantly, mass disobedience to resist authoritarian control.
“The…anti-surveillance actions that don’t matter by yourself, when you hit a critical mass of people, matter a lot,” she says, pointing to the ability of Hong Kong protesters to sustain their protest through mass participation and decentralized coordination. “I think that kind of belief in your power, even if you think it might not work 100 percent of the time…you together have this tremendous power.”
Rose's aim isn't just to design clothing that thwarts today's systems but to cultivate a community that continually develops new methods to confound the surveillance state as its tools evolve.
“It’s a really important opportunity for us to try and get as far ahead as we can before we begin playing catch up again,” says Rose.
Produced by Zach Weissmueller and Justin Monticello. Opening graphics by Lex Villena. Camera by James Lee Marsh, John Osterhoudt, Weissmueller, and Monticello. Hong Kong camerawork by Edwin Lee.
Photo credits: “Thermal surveillance,” by Dario Sabljak/agefotostock/Newscom; “Surveillance camera,” Caro/Sorge/Newscom; “Chula Vista facial recognition tablet,” Howard Lipin/TNS/Newscom