A New California Bill Would Regulate Police Use of Facial Recognition. These Falsely Arrested Black Men Say It's Not Enough

A video surveillance camera hangs from the side of a building in San Francisco on May 14, 2019. (Justin Sullivan/Getty Images)

In 2019 and 2020, three Black men were accused of and jailed for crimes they didn’t commit after police used facial recognition to falsely identify them. Their wrongful arrest lawsuits are still pending, but their cases bring to light how AI-enabled tools can lead to civil rights violations and lasting consequences for the families of the accused.

Now, all three men are speaking out against pending California legislation making it illegal for police to use facial recognition technology as the sole reason for a search or arrest. Instead, it would require corroborating indicators. The problem, critics say, is that a possible face recognition “match” is not evidence — and that it can lead investigations astray even if police seek corroborating evidence.

The state Assembly last month passed Assembly Bill 1814 with a 70–0 vote. On Tuesday, it faced a contentious hearing in the Senate Public Safety Committee.

Such a bill “would not have stopped the police from falsely arresting me in front of my wife and daughters,” Robert Williams told CalMatters in a statement. In 2020, Detroit police accused Williams of stealing watches worth thousands of dollars — the first known instance of false arrest involving facial recognition in the United States — after facial recognition technology matched a surveillance video to a photo of Williams in a state database. Investigators put his photo in a “six-pack lineup” with five others, from which a security guard, who had seen a surveillance image and not the theft itself, selected him.

“In my case, as in others, the police did exactly what AB 1814 would require them to do, but it didn’t help,” said Williams, who is Black. “Once the facial recognition software told them I was the suspect, it poisoned the investigation. This technology is racially biased and unreliable and should be prohibited.”

He added, “I implore California lawmakers to not settle for half measures that won’t actually protect people like me.”

The first facial recognition searches in the United States occurred over two decades ago. The process begins with a photo of a suspect, typically taken from security camera footage. Face recognition on your iPhone is trained to match only your own photo, but the kind used by law enforcement agencies searches databases of mug shots or driver’s license photos that can contain millions of images, and that process can fail in numerous ways. Tests by researchers have shown that the technology is less accurate when attempting to identify people with darker skin and those who identify as transgender. Accuracy has also been shown to decrease when a probe image of a suspect is low quality or when the image in a database is outdated.

After a computer assembles a list of possible matches from a database of images, police pick a suspect from an array of candidates, then show that photo to an eyewitness. Eyewitness testimony has proven to be a leading cause of wrongful convictions in the United States.

Because investigators may use facial recognition to identify possible suspects but ultimately rely on eyewitness testimony, the technology can play a role in a criminal investigation without the accused and defense attorneys knowing about it.

Directives not to treat a possible match by a facial recognition system as the sole basis for an arrest sometimes don’t make a difference. They failed to do so, for instance, in the case of Alonzo Sawyer, a man who was falsely arrested near Baltimore and spent nine days in jail.

Nijeer Parks, who spent nearly a year fighting allegations that he stole items from a hotel gift shop in New Jersey and then nearly hit a police officer with a stolen vehicle, came out in opposition to the California bill in a video posted on Instagram last week. The police “are not going to do their job if the AI is saying ‘It’s him’ already. That’s what happened to me.”

“I got lucky,” he told CalMatters in a phone interview, noting that a store receipt he saved had exonerated him and kept him out of prison. “I don’t want to see anybody sitting in jail for something they didn’t do.”

The attorney for Michael Oliver, a Black man who was wrongly accused of assaulting a high school teacher in Detroit in 2020, is scheduled to testify at Tuesday’s legislative hearing in Sacramento, the American Civil Liberties Union said.

Supporters of the bill include the California Faculty Association and the League of California Cities. The California Police Chiefs Association argues that facial recognition can reduce criminal activity and provide police with actionable leads and that such technology will be important as California looks to host international events such as the 2026 World Cup and the 2028 Summer Olympics in Los Angeles.

“Across the country, real-world examples of law enforcement using [facial recognition technology] to solve major crimes showcases just how important this new technology can be towards protecting our communities,” the association has argued. It cited cases in which it says facial recognition played a role in identifying the guilty, including a newspaper headquarters shooting in Maryland and a rape in New York.

Facial recognition alone should never lead to false arrests, Jake Parker with the Security Industry Association told members of the California Assembly a few weeks ago. That’s why AB 1814 is meant to corroborate investigative leads with evidence, not just a possible facial recognition match.

“There’s a clear need to bolster public trust that this technology is being leveraged accurately, lawfully, and in an effective way that’s also limited and non-discriminatory in a way that benefits our communities,” he said. “So we believe AB 1814 will help bolster this trust, and for that reason, we urge you to support this bill in its current form.”

However, more than 50 advocacy organizations — including the ACLU, Access Reproductive Justice and the Electronic Frontier Foundation — signed a letter last week opposing the bill. They called facial recognition software unreliable, a proven threat to Black men, and a potential threat to protesters, people seeking abortions, and immigrant and LGBTQ communities.

“By allowing police to scan and identify people without limitation, AB 1814 will also increase unnecessary police interactions that too often have the potential to escalate into fatal encounters. This will remain true regardless of how accurate face recognition technology becomes,” the organizations said in a letter. “There is no way for people to find out if facial recognition is used against them and no mechanism to make sure the police comply with the law.”

The author of the bill, San Francisco Democratic Assemblymember Phil Ting, also authored a 2019 bill that initially placed a permanent ban on police use of body camera footage with facial recognition. That was amended to a temporary ban, which ended in January 2023.

Ting told CalMatters he’s uncomfortable with the fact that California currently has no limits on how law enforcement agencies use the technology.

He said in a statement that his bill “simply requires officers to have additional evidence before they can proceed with a search, arrest, or affidavit for a warrant. I believe having a precautionary step can help protect people’s privacy and due process rights while still allowing local governments to go further and pursue their own facial recognition bans.”

Ting’s city of San Francisco became the first major city in the nation to ban face recognition in 2019. However, The Washington Post last month reported that San Francisco police have, on multiple occasions, gone around restrictions by requesting that law enforcement in neighboring cities conduct the searches for them.

Former San Francisco District Attorney Chesa Boudin, who was recalled by voters in 2022, says there have almost certainly been false arrests associated with the use of facial recognition in California, but they would remain unknown to the public unless prosecutors filed charges and the accused later went to trial, or filed a civil lawsuit seeking damages. Often, such cases would be settled out of court.

“We absolutely need a legislative, regulatory framework for these technologies, but I don’t think AB 1814 is adequate in terms of protecting civil liberties or providing meaningful guardrails or safeguards for the use of these new and powerful technologies,” said Boudin, who now directs UC Berkeley’s Criminal Law & Justice Center.

The Senate Public Safety Committee staff has suggested amending the bill to state that judges should not grant warrant applications by law enforcement agencies based solely on a possible match by face recognition technology.

Lawmakers have until the end of the legislative session in August to decide whether to pass AB 1814.
