You may have heard that facial recognition systems are racially biased, misidentifying Black and Asian faces 10 to 100 times as often as they misidentify white faces. As those systems are increasingly used in law enforcement, that bias has real-world effects, and you should know the name Robert Williams, and know his story.
Williams is a Black man from Farmington Hills, Michigan, who was arrested for shoplifting on the basis of a facial recognition misidentification so glaring that when police finally put the grainy picture of the shoplifter next to Williams’ actual face, he heard one officer say, “I guess the computer got it wrong.” But that didn’t happen until he had been held overnight, and he still wasn’t released for hours afterward. Williams got legal help from the American Civil Liberties Union, and he’s telling his story to draw attention to the problem.
The ACLU of Michigan is calling on the Detroit Police Department to “stop using facial recognition technology as an investigatory tool,” because “the facts of Mr. Williams’s case prove both that the technology is flawed and that DPD investigators are not competent in making use of such technology.”
The thing is, as Williams writes, “My daughters can’t unsee me being handcuffed and put into a police car. But they can see me use this experience to bring some good into the world. That means helping make sure my daughters don’t grow up in a world where their driver’s license or Facebook photos could be used to target, track or harm them.”
The Detroit police mishandled Williams’ case in a series of other ways. Not only was the algorithm’s identification wrong, but the file police received identifying him as a suspect said, in bold letters, “This document is not a positive identification”; it “is an investigative lead only and is not probable cause for arrest.” Williams’ driver’s license photo was nonetheless included in a photo line-up and picked out by Katherine Johnston, a loss prevention investigator working for Shinola, the store where the shoplifting occurred. But Johnston had seen only the blurry surveillance video of the theft; she was not an in-person witness, not that eyewitness testimony is particularly accurate anyway.
The result is that Williams was arrested in his front yard in front of his wife and two young children. He went through a night that would be traumatic for anyone, but especially a Black man who knows how these stories too often end. “I was patted down probably seven times, asked to remove the strings from my shoes and hoodie and fingerprinted,” he wrote in The Washington Post. “They also took my mugshot. No one would tell me what crime they thought I’d committed. A full 18 hours went by. I spent the night on the floor of a filthy, overcrowded cell next to an overflowing trash can.”
Williams is the first person known to have been falsely arrested because of a flawed, racist facial recognition algorithm. But if he hadn’t heard one of the officers questioning him refer to “the computer,” he might never have known. “I strongly suspect this is not the first case to misidentify someone to arrest them for a crime they didn’t commit,” Clare Garvie, of Georgetown University’s Center on Privacy and Technology, told The New York Times. “This is just the first time we know about it.”
It needs to stop now. This is not a signal to refine these facial recognition programs to be slightly less racist, as Williams pointed out. “Even if this technology does become accurate (at the expense of people like me), I don’t want my daughters’ faces to be part of some government database,” he wrote. “I don’t want cops showing up at their door because they were recorded at a protest the government didn’t like. I don’t want this technology automating and worsening the racist policies we’re protesting. I don’t want them to have a police record for something they didn’t do—like I now do.”