Police Facial Recognition Accuracy

This cluster discusses police use of AI facial recognition, focusing on high false-positive rates that risk wrongful arrests and suspicion, and comparing the technology to error-prone traditional policing methods such as eyewitness identification and lineups.

Trend: 📉 Falling 0.4x
Category: AI & Machine Learning
Comments: 2,843
Years Active: 20
Top Authors: 5
Topic ID: #7257

Activity Over Time

Year    Comments
2007    1
2008    4
2009    14
2010    14
2011    33
2012    49
2013    98
2014    85
2015    168
2016    160
2017    102
2018    234
2019    300
2020    319
2021    368
2022    218
2023    246
2024    233
2025    181
2026    16

Keywords

AI, ML, AAAS, arstechnica.com, nbcnews.com, media.ccc, ID, SWAT, CSI, police, false positives, innocent suspects, innocent people, officers, arrest, suspect

Sample Comments

matheusmoreira · Apr 23, 2021

All sorts of police field tests have significant rates of false positives. People actually get arrested on the basis of such "evidence" all the time. The justice and law enforcement systems essentially operate on the notion that these things are "good enough".

mnm1 · Jul 6, 2019

The system performs much better than the existing tool police have been using since the dawn of their existence: closing one's eyes and randomly picking a picture with your finger. That doesn't mean it performs well, because it outperforms a system that never worked to begin with, like that or like drug dogs. I'm sure soon it'll be treated like it's 100% accurate, just like the dogs, despite its abysmal success rate. But I'm sure that's fine. So what if 81% of the…
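
A quick way to see mnm1's point that outperforming a broken baseline says little: compare the system against blind random picking. This is a minimal Python sketch; the database size is an illustrative assumption, and the 19% accuracy simply reads the comment's truncated 81% figure as the share of wrong matches.

    # Beating a worthless baseline is a low bar. Numbers are illustrative.
    database_size = 10_000
    random_baseline = 1 / database_size   # blind random pick: 0.01% accuracy
    system_accuracy = 0.19                # if 81% of its matches are wrong

    improvement = system_accuracy / random_baseline
    print(f"{improvement:,.0f}x better than random picking")        # 1,900x
    print(f"but still wrong {1 - system_accuracy:.0%} of the time") # 81%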

burnished · Jun 13, 2023

There's already precedent for police and the judiciary misusing technology they don't understand. I suspect the underlying issue is that this person would not trust government employees generally with a tool that issues false positives and requires discretion.

high_na_euv · Oct 21, 2024

AI sucks, but on the other hand, judges and police officers aren't 100% accurate either.

mrweasel · Jul 26, 2018

The problem is that it's only 95% accurate in this case (28 out of 534 falsely identified as criminals). That's a pretty big margin of error when the result is possible arrest. The scope of its use is also in question: sure, if you're looking for someone who kidnapped a child, 95% is good enough. If you use it for any minor violation, you risk harassing a large percentage of the population, who will then need to prove it wasn't them. Some will flat out deny being the guilty party, and there…
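
The base-rate arithmetic behind this worry is worth making explicit: a modest per-comparison error rate becomes mass harassment at population scale. A minimal Python sketch; every number except the 28-out-of-534 rate quoted above is an illustrative assumption.

    # How a ~5% false-positive rate behaves at population scale.
    population = 1_000_000           # people scanned; assumed
    true_suspects = 50               # wanted individuals among them; assumed
    false_positive_rate = 28 / 534   # ~5.2%, the rate cited in the comment
    true_positive_rate = 0.95        # assumed hit rate on real suspects

    false_alarms = (population - true_suspects) * false_positive_rate
    true_hits = true_suspects * true_positive_rate

    # Chance that a flagged person is actually a suspect (precision / PPV).
    precision = true_hits / (true_hits + false_alarms)
    print(f"false alarms: {false_alarms:,.0f}")       # ~52,000 innocents flagged
    print(f"true hits:    {true_hits:,.0f}")          # ~48 real matches
    print(f"P(suspect | flagged) = {precision:.2%}")  # ~0.09%

Under these assumptions, innocent flags outnumber real matches by roughly a thousand to one, which is exactly the "harassing a large percentage of the population" concern.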

JohnMakin · Jan 14, 2025

In this specific example: police coercing witnesses or using shaky evidence in a lineup has always been a known and serious problem, and it has led to a lot of false convictions. This just has the added wrinkle of "AI" giving it more credibility than it should. You can see, even in this story, that the victim tried to say he wasn't really sure, and they basically ignored him. They aren't trying to be "right" or catch the right guy. Someone goes to jail, solved. If you're wrong, let t…

Silhouette · May 16, 2018

"Police add yet another massively inaccurate tool to their arsenal with which they can easily put innocent people behind bars." In what way can this possibly put innocent people behind bars, never mind easily? The article specifically notes that any hits on the system are then checked by real police officers before any further action is taken, and from that point surely the same processes and controls will apply as if an officer thought they'd recognised a person of interest as part of th…

shadowgovt · Mar 7, 2020

If it changes the phenomena that trigger false positives of suspicion towards "was actually present near the scene of the crime" and away from "wrong skin color in the neighborhood," it can be argued the tech has improved people's lives.

air7 · Jun 25, 2020

The problem is not with the technology but with how it's used. A medical test is also not 100% error-proof, which is why a professional needs to interpret the results, sometimes conducting other tests or disregarding it completely. A cop stopping someone who resembles a criminal for questioning seems like a good thing to me, as long as the cop knows there's a reasonable chance it's the wrong guy.
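
The medical-test analogy maps directly onto Bayes' theorem: how much a flag should raise suspicion depends on the prior chance that a scanned person is the suspect at all. A minimal sketch; all rates are assumptions chosen purely for illustration.

    # Bayesian reading of the medical-test analogy. All rates are assumed.
    def posterior(prior: float, sensitivity: float, fpr: float) -> float:
        """P(real match | system flags a match), via Bayes' theorem."""
        p_flag = sensitivity * prior + fpr * (1 - prior)
        return sensitivity * prior / p_flag

    prior = 1e-4        # chance a random scanned person is the suspect
    sensitivity = 0.99  # true-positive rate of the system
    fpr = 0.01          # false-positive rate of the system

    print(f"P(match | flagged) = {posterior(prior, sensitivity, fpr):.2%}")
    # ~0.98%: a flag is a lead worth checking, not proof of identity.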

tippytippytango · May 25, 2025

False positives for this tech are absurdly high, and law enforcement treats it like it's perfect. That's enough of a reason to make it illegal.