Black man in New Jersey sues after false facial recognition lands him in jail

Wednesday, December 30, 2020
NJ law enforcement arrests innocent man using facial recognition software
A Black man is suing a New Jersey police department after a bad facial recognition match landed him in jail for a crime he did not commit.

WOODBRIDGE, New Jersey -- Law enforcement's use of facial recognition technology is under scrutiny after an innocent man was jailed in New Jersey for a crime he didn't commit.

Critics and studies say the technology is inaccurate and disproportionately harms Black people and other minorities.

Nijeer Parks is suing the police after a bad facial recognition match landed him in jail.

In 2019, Parks was falsely accused of shoplifting and trying to hit an officer with his car, even though he was 30 miles away at the time of the incident.

He spent 10 days in jail and paid thousands of dollars in legal fees. The case was eventually dismissed for lack of evidence.

Parks is now suing the police and the city of Woodbridge, New Jersey, for violating his civil rights, false imprisonment and false arrest.

He is the third person known to be wrongfully arrested based on faulty facial recognition technology.

Parks had been incarcerated before, and even though he was innocent, he considered taking a plea deal so he would not risk a long prison sentence.

"I said, 'No, that's not me.' He turns another paper over and says, 'I guess that's not you either?' I picked that paper up and held it to my face and said, 'This is not me.' I said, 'I hope you don't think all Black people look alike,'" another victim of false facial recognition, Robert Williams, recalled.

In June of this year, Williams spent 30 hours in a Michigan detention facility after police near Detroit incorrectly identified him as the suspect in a shoplifting case.

The only evidence linking Williams to the crime was the false facial recognition.

"I can't even put it into words," Williams said. "It was one of the most shocking things that I ever had happen to me."

Police departments around the country have been using facial recognition technology for the last several years.

A 2019 federal study found that a majority of facial recognition algorithms exhibit racial bias, with higher false-positive rates for Black and Asian faces than for white faces.
