
Police used it to find suspects, but it created new victims


(InvestigateTV/Atlanta News First) – “I swear to God I ain’t never been to Louisiana.”

On Nov. 25, 2022, DeKalb County police stopped Randal Reid, 29, along a stretch of I-20. According to a background check officers ran during the traffic stop, Reid was wanted out of Louisiana after two Bayou State jurisdictions accused him of stealing purses.

“You got two theft warrants,” a DeKalb officer told Reid. “They’re both in Louisiana. Who would use your name? Who would get you involved?”

“I don’t even know nobody in Louisiana,” Reid responded.

At the time of Reid’s arrest, neither he nor DeKalb police knew that Louisiana law enforcement had used facial recognition technology (FRT) in issuing the warrants. Reid’s attorneys now say Louisiana law enforcement used FRT to link him to crimes he did not commit. Reid was released from jail on Dec. 1, 2022.

FRT is software that analyzes facial features and compares one image to another, either to confirm a match or to return a limited set of similar results. But as more law enforcement agencies use the technology, the search for suspects is leaving behind a trail of new victims.
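For readers unfamiliar with how such systems score a match, the sketch below illustrates the basic idea under broad assumptions: a “probe” photo is converted into a numeric embedding and compared by similarity against a gallery, such as a mugshot database, producing a ranked list of candidates rather than a definitive identification. The embeddings, names, threshold and function names here are hypothetical and do not represent any specific vendor’s software.

```python
# Minimal, illustrative sketch of facial-recognition candidate ranking.
# Embedding vectors, the 0.6 threshold, and candidate names are hypothetical;
# real systems compute embeddings with trained neural networks.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Return the cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_candidates(probe: np.ndarray, gallery: dict, threshold: float = 0.6, top_k: int = 5):
    """Score a probe embedding against a gallery and return the top-k
    candidates whose similarity clears the threshold (a lead, not proof)."""
    scores = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    scores = [(name, s) for name, s in scores if s >= threshold]
    return sorted(scores, key=lambda x: x[1], reverse=True)[:top_k]

# Example with made-up 4-dimensional embeddings (real systems use hundreds of dimensions).
probe_embedding = np.array([0.1, 0.9, 0.3, 0.5])
gallery = {
    "candidate_A": np.array([0.2, 0.8, 0.4, 0.5]),
    "candidate_B": np.array([0.9, 0.1, 0.7, 0.2]),
}
print(rank_candidates(probe_embedding, gallery))
```

Even a high similarity score only means two photos look alike to the algorithm, which is why vendors and researchers describe results as investigative leads rather than evidence of identity.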

Several recent studies show using facial recognition technology may contribute to greater racial disparities in arrests.

‘They made a mistake, it’s OK’

Using facial recognition, police in Detroit linked Robert Williams to security video. The video shows a Black male pocketing watches from a Michigan jewelry store.

Williams, his wife Melissa and attorneys have filed a lawsuit in which they accuse detectives of failing to thoroughly investigate beyond using FRT.


“There was no questioning, no asking for an alibi,” Melissa Williams said.

“Arresting me for absolutely no reason other than whatever you seen on a picture, that’s just not real,” Robert Williams, who now wants the technology banned, said.

InvestigateTV obtained body camera footage that shows police arriving at Williams’ home in Farmington Hills, just outside Detroit, to arrest him. His wife and kids watched from the driveway as he was taken into custody.

Robert Williams can be heard saying, “They made a mistake. It’s okay. I’ll be back in a minute.” He did not come back for at least two days.

‘Where did you get all this from?’

In February 2019, police in Woodbridge, New Jersey, questioned a man in a Dodge Charger parked outside a hotel. The suspect was accused of stealing from the hotel’s gift shop.

As officers questioned the driver, he cranked up his engine. “Cut it off now, don’t move, we’ll shoot,” officers yelled. They repeatedly told the driver to stop and put his hands up. The driver eventually struck a car and came close to hitting an officer.

Court documents revealed that police later ran the actual suspect’s fake driver’s license through FRT software. The results linked Nijeer Parks, who still describes the charges in disbelief, to crimes he did not commit.

“Aggravated assault with a deadly weapon, shoplifting, eluding, my jaw just dropped,” Parks said. “I was like, where did you get all this from?”

In the cases of Reid, Williams and Parks, all three Black men were jailed, and all had been falsely matched by facial recognition.


Studies show racial disparities

A national study found using facial recognition technology “contributes to greater racial disparities in arrests.” The author, Dr. Thaddeus Johnson, is a professor at Georgia State University and also a former police officer.

Had FRT been available to him during his time as a police officer, Johnson said he probably would have used it, but not to make a final determination in a case.

Johnson’s study examined FRT deployment in about 1,100 cities and subsequent arrests in 2016.

The results show that agencies using FRT had a 55 percent higher arrest rate for Black people and a 22 percent lower arrest rate for White people, compared with agencies that did not use facial recognition.

“Bias can be embedded on the very front end,” Johnson said.

According to the report, the contributing factors in racial disparities included:

  • Black people are “overrepresented” in image databases, such as mugshot collections, so they carry a “greater risk of being misidentified.”
  • A lack of racially diverse software programmers and of diverse photos used to train or build algorithms.
  • An absence of federal guidelines on interpreting results.
  • The psychological effect of “workers relying more heavily on shortcuts for time-sensitive, high stakes decisions.”

Meanwhile, more recent data from the National Institute of Standards and Technology shows FRT has a “wide range in accuracy across developers.” For example, systems created in China are more accurate at identifying Asian faces, according to the study.

But among U.S.-developed systems, the highest false positive rates were for people of color. “We can use this technology, but we can’t do it at the expense of inequity and discriminatory policing, whether we mean to or not,” Johnson said.


InvestigateTV confirmed which facial recognition company was used in the arrest of Randal Reid. A spokesperson said the company encourages law enforcement to develop policies and to treat results as a lead or tip, not as a deciding factor in a case.

But some agencies don’t have clearly defined policies, the subject of part two of this special investigation.



