CrimsonPhantom
CUSA Curator
Posts: 42,044
Joined: Mar 2013
Reputation: 2401
I Root For: NM State
Location:
|
Facial Recognition Software Leads to Mistaken Arrest of Georgia Man
Quote:Anyone who has ever seen a Hollywood dystopian action flick knows that the more automated life becomes, the closer we are to huddling in giant, underground cities while an elite few humans battle the machines for dominance of the planet.
When it comes to law enforcement activities, the use of artificial intelligence takes on a particularly sinister feel. San Francisco recently suspended its killer robot program after Americans discovered it had an actual killer robot program. Robotic aids, not unlike those sent in to examine and contain bombs, had been approved for use against violent suspects in some “extreme situations.” The thought of sending a robot in to assess and react to a very human situation seems like a very good way to make some very bad mistakes. Machines are not infallible, after all, and they cannot replicate the nuances and complicated processes of human judgment and interaction.
One Georgia man recently discovered the dark side of facial recognition technology when he was arrested on a warrant from Louisiana. Randall Reid, 28, was picked up in DeKalb County, Georgia, last November. Authorities had connected him to a string of purse thefts in Jefferson Parish and Baton Rouge, Louisiana.
Randall insisted he’d never been to Louisiana in his life, and didn’t even know what “Jefferson Parish” was. He couldn’t have done it. The problem is, the computer said he did.
“They told me I had a warrant out of Jefferson Parish. I said, ‘What is Jefferson Parish?’” Reid said. “I have never been to Louisiana a day in my life. Then they told me it was for theft. So not only have I not been to Louisiana, I also don’t steal.”
Facial recognition software connected surveillance images to Reid’s Georgia identification records, and an arrest warrant was issued by Baton Rouge authorities. Georgia authorities executed the warrant and jailed the Georgia man.
Reid was later released after authorities noticed significant discrepancies between the two men. Reid had a mole on his face, while the suspect did not. There was also at least a forty-pound difference between the men. The one feature they had in common was that both men are black.
That fact has renewed fears around the inherent danger in relying on technology in law enforcement situations. Facial recognition software is known to misidentify black people and other minority groups at a much higher rate than white people.
An MIT study of three commercial gender-recognition systems found they had error rates of up to 34% for dark-skinned women — a rate nearly 49 times that for white men.
A Commerce Department study late last year showed similar findings. Looking at instances in which an algorithm wrongly identified two different people as the same person, the study found that error rates for African men and women were two orders of magnitude higher than for Eastern Europeans, who showed the lowest rates.
It isn’t just the racial vulnerabilities that make the technology creepy. It’s just creepy. Recently, a New York City lawyer was turned away from a holiday Rockettes performance after facial recognition technology in the lobby identified her as a lawyer for a firm involved in bringing lawsuits against the parent entertainment company, Madison Square Garden Entertainment. The woman was only accompanying her daughter’s Girl Scout troop for a day of fun, but had to leave the group after being flagged by the software.
One might say the show had the right to reject people who could be there gathering information to bolster a lawsuit, but using facial recognition software to do it creates more problems than it solves, particularly when it comes to privacy.
China already uses facial recognition to crack down on citizens in nearly every aspect of life. That thought alone should be enough to give any American pause when it comes to the technology.
Link
|
|
01-03-2023 01:56 PM |
|
UofMstateU
Legend
Posts: 39,267
Joined: Dec 2009
Reputation: 3586
I Root For: Memphis
Location:
|
RE: Facial Recognition Software Leads to Mistaken Arrest of Georgia Man
It's all based on how the models for their AI are built. I can make their AI software recognize a steaming pile of crap as being Joe Biden*.
|
|
01-03-2023 02:05 PM |
|
Hambone10
Hooter
Posts: 40,342
Joined: Nov 2005
Reputation: 1293
I Root For: My Kids
Location: Right Down th Middle
|
RE: Facial Recognition Software Leads to Mistaken Arrest of Georgia Man
Quote:An MIT study of three commercial gender-recognition systems found they had error rates of up to 34% for dark-skinned women — a rate nearly 49 times that for white men.
Is there something about darker skin tones that is more problematic for AI?? Especially if combined with make-up??
Or is there something inherently racist in the code? I know some will argue the latter, but if that is the case it should be fairly easily vetted and removed.
The idea should be not to get more white men wrong, but to get more black women right.
It is truly amazing what make-up and technique can do these days to create false images. The gap between what the eye sees (perhaps deep facial cuts and sharp features) and what the AI detects (the actual, more curved and soft features) can really be scary. Just look at some extreme before/after make-up videos.
|
|
01-03-2023 03:35 PM |
|
UofMstateU
Legend
Posts: 39,267
Joined: Dec 2009
Reputation: 3586
I Root For: Memphis
Location:
|
RE: Facial Recognition Software Leads to Mistaken Arrest of Georgia Man
(01-03-2023 01:56 PM)CrimsonPhantom Wrote: Randall insisted he’d never been to Louisiana in his life, and didn’t even know what “Jefferson Parish” was. He couldn’t have done it. The problem is, the computer said he did.
This is a VERY bad look if law enforcement believes this is what the computer did. In AI and recognition, the computer is NEVER 100% certain of any match, therefore the computer never says "this is the guy." The computer says "this guy is about an XX% match to this other guy."
They should NEVER be arresting anyone on a potential AI match alone. They should first verify whether the match could even be legit. For instance, one of these guys has a mole on his face, and the other doesn't. AI will still match the person with or without a mole, because that one feature won't disqualify it as a match; it will just lower the probability of a match. So what law enforcement should have done is look at the surveillance video in LA and compare it to the ID photos in GA. They would have seen that one has a mole and one doesn't, and that one guy is 40 pounds heavier than the other, and they would quickly have built up enough evidence to say it's not a match.
What I would like to know is what the computer's probability of a match was. Was it 99.9%? (That would mean their software is junk.) Was it 80%? Or 50%? Or lower? We may find out it's not the software that has an issue with race, but whoever's job it was to spot-verify the photo against the surveillance video.
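To make the point concrete: typical face-recognition pipelines reduce each face to an embedding vector and compare candidates by a similarity score against a tunable threshold; the system never outputs a yes/no identity. Here's a minimal toy sketch of that idea — the embeddings, threshold value, and random "faces" below are all made up for illustration, not any real vendor's system:

```python
import math
import random

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 128-d embeddings standing in for a model's output.
random.seed(0)
suspect = [random.gauss(0, 1) for _ in range(128)]
# A different person whose embedding happens to land near the suspect's.
candidate = [x + random.gauss(0, 0.6) for x in suspect]

score = cosine_similarity(suspect, candidate)
THRESHOLD = 0.80  # operating point chosen by the deployer, not by the math

# The system reports a score; a human should decide what to do with it.
print(f"match score: {score:.2f}")
print("flag for human review" if score >= THRESHOLD else "below threshold")
```

The key design point is that THRESHOLD is a policy choice: set it low and you get more false matches like Reid's; set it high and you miss real ones. Either way, a score above threshold is supposed to trigger human verification, not an arrest.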
|
|
01-03-2023 04:30 PM |
|
Native Georgian
Legend
Posts: 27,619
Joined: May 2008
Reputation: 1042
I Root For: TULANE+GA.STATE
Location: Decatur GA
|
RE: Facial Recognition Software Leads to Mistaken Arrest of Georgia Man
(01-03-2023 04:30 PM)UofMstateU Wrote: We may find out it’s not the software who has an issue with race, but whoever's job it was to spot-verify the photo to the surveillance video.
Excellent point. And a related point: was that verification process done by LEOs in DeKalb County, GA, or in Louisiana, or both?
I don’t know the exact percentages, but law enforcement in DeKalb County, GA, is predominantly Black.
In Louisiana, it depends on the exact jurisdiction.
|
|
01-03-2023 06:06 PM |
|