This Senate Bill Would Ban Federal Use of Facial Recognition

On Wednesday, Senators Jeff Merkley (D-OR) and Cory Booker (D-NJ) introduced legislation that would place a moratorium on the use of facial recognition by the federal government or with federal funds unless Congress passes regulations for the technology.

The Ethical Use of Facial Recognition Act aims to create a 13-member congressional commission representing interested parties, including law enforcement, communities subjected to surveillance, and privacy experts.

“Facial recognition technology works well enough that it’s a huge temptation for the government to try and track Americans,” Merkley told Motherboard. “This is Big Brother on steroids. I don’t want America to become a police state that tracks us everywhere we go.”

“Facial recognition technology has been demonstrated to be often inaccurate—misidentifying and disproportionately targeting women and people of color,” Booker said in a statement. “To protect consumer privacy and safety, Congress must work to set the rules of the road for responsible uses of this technology by the federal government.”

There is a notable exception in the legislation, however: law enforcement may still use the technology if it obtains a warrant.

“While a good first step, the bill’s exceptions—including allowing police to use this technology with only a warrant—fails to fully account for the realities of this mass surveillance tool,” ACLU Senior Legislative Counsel Neema Singh Guliani said in a press release.

That exception is a huge one, especially given that the close relationships between police departments and commercial vendors will only grow more intimate once the moratorium closes the door on federal funds. Civil liberties organizations have said that Amazon's Rekognition and Clearview AI both feature incredibly powerful facial recognition technologies that pose a threat to marginalized groups, whether or not they are accurate. But that hasn't stopped police departments from adopting the tools, or vendors from making false claims about their accuracy in sales pitches. In the case of Clearview AI, those claims persist even though David Scalzo, whose firm was an early investor in the company, has acknowledged that databases of billions of faces paired with racist and sexist algorithms "might lead to a dystopian future or something."

“I’m very concerned about the databases that are being compiled. Our faces are being sold. I personally would like—if I could wave a magic wand right now, I’d put a complete stop on that as well,” Merkley said. “But I think tackling the Big Brother government aspect and using that as an opportunity for people to come up to speed with how dangerous this technology is, is probably a good way to go about it.”

One notable consequence of the moratorium could be the undermining of some operations by Palantir, the surveillance firm that recently admitted its role in helping Immigration and Customs Enforcement separate families and deport migrants. While the company does not itself offer facial recognition technology, its software is used to analyze facial recognition data and further flesh out the surveillance networks used by local, state, and federal authorities.

“I think it’d be an overreach at this point, both a political overreach and probably a policy overreach, to try to dictate what states do,” Merkley said. “But having the conversation at the national level will hopefully stimulate a lot of other conversations.”
