Man arrested because face recognition can’t tell Black people apart

ACLU Calls on Lawmakers to Immediately Stop Law Enforcement Use of Face Recognition Technology

Robert Williams, a Black man and Michigan resident, was wrongfully arrested because of a false face recognition match, according to an administrative complaint filed today by the American Civil Liberties Union of Michigan. This is the first known case of someone being wrongfully arrested in the United States because of this technology, though there are likely many more cases like Robert’s that remain unknown.

Detroit police handcuffed Robert on his front lawn in front of his wife and two terrified girls, ages two and five. The police took him to a detention center about forty minutes away, where he was locked up overnight in a cramped and filthy cell. Robert’s fingerprints, DNA sample, and mugshot were put on file. After an officer acknowledged during an interrogation the next afternoon that “the computer must have gotten it wrong,” Robert was finally released — nearly 30 hours after his arrest. The government continues to stonewall Robert’s repeated attempts to learn more about what led to his wrongful arrest, in violation of a court order and of its obligations under the Michigan Freedom of Information Act.

Robert is keenly aware that his encounter with the police could have proven deadly for a Black man like him. He recounts the whole ordeal in an op-ed published by the Washington Post and in a video published by the ACLU.

“I never thought I’d have to explain to my daughters why daddy got arrested,” says Robert Williams in the op-ed. “How does one explain to two young girls that the computer got it wrong, but the police listened to it anyway?”

While Robert was locked up, his wife Melissa had to explain to his boss why Robert wouldn’t show up to work the next morning. She also had to explain to their daughters where their dad was and when he would come back. Robert’s daughters have since taken to playing games involving arresting people, and have accused Robert of stealing things from them.

Robert was arrested on suspicion of stealing watches from Shinola, a Detroit watch shop. Detroit police sent an image of the suspect captured by the shop’s surveillance camera to the Michigan State Police, which ran the image through its database of driver’s license photos. Face recognition software that Michigan police purchased from DataWorks Plus combed through the driver’s license photos and falsely identified Robert Williams as the suspect.

Based on the erroneous match, Detroit police put Robert’s driver’s license photo in a photo lineup and showed it to the shop’s offsite security consultant, who had never witnessed the alleged robbery firsthand. The consultant, based only on a review of the blurry surveillance image, identified Robert as the culprit.

“Every step the police take after an identification — such as plugging Robert’s driver’s license photo into a poorly executed and rigged photo lineup — is informed by the false identification and tainted by the belief that they already have the culprit,” said Victoria Burton-Harris and Phil Mayor, attorneys representing Robert Williams, in an ACLU blog post published today. “Evidence to the contrary — like the fact that Robert looks markedly unlike the suspect, or that he was leaving work in a town 40 minutes from Detroit at the time of the robbery — is likely to be dismissed, devalued, or simply never sought in the first place…When you add a racist and broken technology to a racist and broken criminal legal system, you get racist and broken outcomes. When you add a perfect technology to a broken and racist legal system, you only automate that system’s flaws and render it a more efficient tool of oppression.”

Numerous studies, including a recent study by the National Institute of Standards and Technology, have found that face recognition technology is flawed and biased, misidentifying Black and Asian people up to 100 times more often than white people. Despite this, an untold number of law enforcement agencies nationwide are using the technology, often in secret and without any democratic oversight.

“The sheer scope of police face recognition use in this country means that others have almost certainly been — and will continue to be — misidentified, if not arrested and charged for crimes they didn’t commit,” said Clare Garvie, senior associate with Georgetown Law’s Center on Privacy & Technology, in an ACLU blog post.

The ACLU has long warned that face recognition technology is dangerous when right and dangerous when wrong.

“Even if this technology does become accurate (at the expense of people like me), I don’t want my daughters’ faces to be part of some government database,” adds Williams in his op-ed. “I don’t want cops showing up at their door because they were recorded at a protest the government didn’t like. I don’t want this technology automating and worsening the racist policies we’re protesting.”

The ACLU has also been leading nationwide efforts to defend privacy rights and civil liberties against the growing threat of face recognition surveillance, and is calling on Congress to immediately stop the use and funding of the technology.

“Lawmakers need to stop allowing law enforcement to test their latest tools on our communities, where real people suffer real-life consequences,” said Neema Singh Guliani, ACLU senior legislative counsel. “It’s past time for lawmakers to prevent the continued use of this technology. What happened to the Williams family should never happen again.”

Already, multiple localities have banned law enforcement use of face recognition technology as part of ACLU-led campaigns, including San Francisco, Berkeley, and Oakland in California, as well as Cambridge, Springfield, and Somerville in Massachusetts. Following years of advocacy by the ACLU and coalition partners, pressure from Congress, and nationwide protests against police brutality, Amazon and Microsoft earlier this month said they will not sell face recognition technology to police for some time. They joined IBM and Google, which previously said they would not sell a general face recognition algorithm to the government. Microsoft and Amazon have yet to clarify their positions on the sale of the technology to federal law enforcement agencies like the FBI and the DEA.

The ACLU is also suing the FBI, DEA, ICE, and CBP to learn more about how the agencies are using face recognition and what safeguards, if any, are in place to prevent rights violations and abuses. And the organization has taken Clearview AI to court in Illinois over its privacy-violating face recognition practices.

The op-ed by Robert Williams is here: https://www.washingtonpost.com/opinions/2020/06/24/i-was-wrongfully-arrested-because-facial-recognition-why-are-police-allowed-use-this-technology/.

The administrative complaint filed today was first reported by the New York Times: https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html.
