Amazon touts its Rekognition facial recognition system as “simple and easy to use[1],” encouraging customers to “detect, analyze, and compare faces for a wide variety of user verification, people counting, and public safety use cases.” And yet, in a study[2] released Thursday by the American Civil Liberties Union, the technology managed to confuse photos of 28 members of Congress with publicly available mug shots. Given that Amazon actively markets[3] Rekognition to law enforcement agencies across the US, that’s simply not good enough.

The ACLU study also illustrated the racial bias[4] that plagues facial recognition today. “Nearly 40 percent of Rekognition’s false matches in our test were of people of color, even though they make up only 20 percent of Congress,” wrote[5] ACLU attorney Jacob Snow. “People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that.”

Facial recognition technology’s difficulty detecting darker skin tones is a well-established problem. In February, MIT Media Lab’s Joy Buolamwini and Microsoft’s Timnit Gebru published findings[6] that facial recognition software from IBM, Microsoft, and Face++ has a much harder time identifying gender in people of color than in white people. In a June evaluation of Amazon Rekognition, Buolamwini and Inioluwa Raji of the Algorithmic Justice League found similar built-in bias. Rekognition even got Oprah wrong[7].

“Given what we know about the biased history and present of policing, the concerning performance metrics of facial analysis technology in real-world pilots, and Rekognition’s gender and skin-type accuracy differences,” Buolamwini wrote in a recent letter[8] to Amazon CEO Jeff Bezos, “I join the chorus of dissent in calling Amazon to stop equipping law enforcement.”

Read more from our friends at Wired.com