
Tue Aug 13, 2019, 07:00 PM

Facial recognition software mistook 1 in 5 California lawmakers for criminals, says ACLU

Source: LA Times

California Assemblyman Phil Ting has never been arrested, but he was recently mistaken for a criminal.

He's not surprised.

Ting (D-San Francisco), who authored a bill to ban facial recognition software from being used on police body cameras, was one of 26 California legislators who were incorrectly matched with a mug shot in a recent test of a common face-scanning program by the American Civil Liberties Union.

About 1 in 5 legislators was erroneously matched to a person who had been arrested when the ACLU used the software to screen their pictures against a database of 25,000 publicly available booking photos. Last year, in a similar experiment done with photos of members of Congress, the software erroneously matched 28 federal legislators with mug shots.

Read more: https://www.msn.com/en-us/news/technology/facial-recognition-software-mistook-1-in-5-california-lawmakers-for-criminals-says-aclu/ar-AAFKmZA?li=BBnbfcL

10 replies, 849 views


Replies to this discussion thread
10 replies (author, day, post)
Reply Facial recognition software mistook 1 in 5 California lawmakers for criminals, says ACLU (Original post)
Yo_Mama_Been_Loggin Tuesday OP
R Merm Tuesday #1
flor-de-jasmim Tuesday #2
Mike_DuBois Tuesday #3
BootinUp Tuesday #4
marble falls Tuesday #5
bluedigger Tuesday #6
keithbvadu2 Tuesday #7
EarthFirst Tuesday #8
Toorich Wednesday #9
EX500rider Wednesday #10

Response to Yo_Mama_Been_Loggin (Original post)

Tue Aug 13, 2019, 07:04 PM

1. If Nunes was one of the five,

then I would have to say it was accurate.



Response to Yo_Mama_Been_Loggin (Original post)

Tue Aug 13, 2019, 07:06 PM

2. So they are face + character recognizers!



Response to Yo_Mama_Been_Loggin (Original post)

Tue Aug 13, 2019, 07:10 PM

3. But was it wrong?

Just sayin'...



Response to Yo_Mama_Been_Loggin (Original post)

Tue Aug 13, 2019, 07:18 PM

4. So only 20% accurate then...drum hit!



Response to Yo_Mama_Been_Loggin (Original post)

Tue Aug 13, 2019, 07:36 PM

5. They were all Republicans, so it can accurately recognize a criminal, just not specific ones.



Response to Yo_Mama_Been_Loggin (Original post)

Tue Aug 13, 2019, 07:39 PM

6. I'd have set the over/under at 2/5.



Response to Yo_Mama_Been_Loggin (Original post)

Tue Aug 13, 2019, 09:53 PM

7. It identified the other 4 correctly as criminals?



Response to Yo_Mama_Been_Loggin (Original post)

Tue Aug 13, 2019, 11:00 PM

8. Such a fantastic headline...

What horrific implications; however...



Response to Yo_Mama_Been_Loggin (Original post)

Wed Aug 14, 2019, 09:47 AM

9. Judging by the surnames and apparent skin tones...

... it seems that over 50% of the system's misidentifications of the politicians involved nonwhites. Hummmmmm.



Response to Yo_Mama_Been_Loggin (Original post)

Wed Aug 14, 2019, 05:55 PM

10. I'd have to know what percent confidence threshold they had it set at.

Amazon said it could not immediately comment on the most recent ACLU test, but has previously disputed that the Rekognition software was unreliable, questioning the group's methods of scanning members of Congress. In its developer guide, Amazon recommends using a 99 percent confidence threshold when matching faces, and criticized the ACLU for using a lesser bar (the factory setting for the software, according to Matt Cagle, an attorney with the Northern California chapter of the ACLU) when testing it.
If it was set at 80%, then it performed as it should.
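The point about thresholds can be sketched in a few lines. This is an illustrative toy, not the actual Rekognition API: the match scores and mug-shot names below are made up, and the only real figures are the two thresholds from the article (the 80% factory default versus Amazon's recommended 99%).

```python
# Toy illustration of how a match-confidence threshold changes results.
# Scores and mug-shot IDs are hypothetical; 80 is the software's factory
# default threshold and 99 is the level Amazon's developer guide recommends.

def filter_matches(matches, threshold):
    """Keep only candidate matches at or above the given similarity score."""
    return [(name, score) for name, score in matches if score >= threshold]

# Hypothetical candidate matches for one probe photo: (mug-shot ID, similarity %).
candidates = [
    ("mugshot_0412", 86.3),
    ("mugshot_1077", 81.9),
    ("mugshot_2250", 99.4),
]

# At the 80% factory default, all three candidates count as "matches".
print(filter_matches(candidates, 80))

# At the recommended 99% threshold, only the strongest candidate survives.
print(filter_matches(candidates, 99))
```

So the same scan of the same photos can report three hits or one, depending entirely on where the operator sets the bar.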
