General Discussion
Over 1,000 AI Experts Condemn Racist Algorithms That Claim to Predict Crime
Technologists from MIT, Harvard, and Google say research claiming to predict crime based on human faces creates a "tech-to-prison pipeline" that reinforces racist policing.
https://www.vice.com/en_us/article/889xyb/over-1000-ai-experts-condemn-racist-algorithms-that-claim-to-predict-crime
Over 1,000 technologists and scholars are speaking out against algorithms that attempt to predict crime based solely on a person's face, saying that publishing such studies reinforces pre-existing racial bias in the criminal justice system. The public letter has been signed by academics and AI experts from Harvard, MIT, Google, and Microsoft, and calls on the publishing company Springer to halt the publication of an upcoming paper. The paper describes a system that the authors claim can predict whether someone will commit a crime based solely on a picture of their face, with 80 percent accuracy and no racial bias.
"There is simply no way to develop a system that can predict criminality that is not racially biased, because criminal justice data is inherently racist," wrote Audrey Beard, one of the letter's organizers, in an emailed statement. The letter calls on Springer to retract the paper from publication in Springer Nature, release a statement condemning the use of these methods, and commit to not publishing similar studies in the future.
This is not the first time AI researchers have made these dubious claims. Machine learning researchers roundly condemned a similar paper released in 2017, whose authors claimed the ability to predict future criminal behavior by training an algorithm with the faces of people previously convicted of crimes. As experts noted at the time, this merely creates a feedback loop that justifies further targeting of marginalized groups that are already disproportionately policed. "As numerous scholars have demonstrated, historical court and arrest data reflect the policies and practices of the criminal justice system," the letter states. "These data reflect who police choose to arrest, how judges choose to rule, and which people are granted longer or more lenient sentences [...] Thus, any software built within the existing criminal legal framework will inevitably echo those same prejudices and fundamental inaccuracies when it comes to determining if a person has the face of a criminal."
The letter is being released as protests against systemic racism and police violence continue across the US, following the deaths of Breonna Taylor, George Floyd, Tony McDade, and other Black people killed by police. The technologists describe these biased algorithms as part of a "tech-to-prison pipeline," which enables law enforcement to justify discrimination and violence against marginalized communities behind the veneer of objective algorithmic systems. The worldwide uprisings have revived scrutiny of algorithmic policing technologies such as facial recognition. Earlier this month, IBM announced it would no longer develop or sell facial recognition systems for use by law enforcement. Amazon followed by putting a one-year moratorium on police use of its own facial recognition system, Rekognition. Motherboard asked an additional 45 companies whether they would stop selling the technology to cops, and received mostly non-responses.
snip
6 replies, 1009 views
Over 1,000 AI Experts Condemn Racist Algorithms That Claim to Predict Crime (Original Post)
Celerity, Jun 2020
The Magistrate (95,247 posts)
1. This Sounds Like Lombroso Reborn, Ma'am
The idea that a 'criminal type' could be detected by measurement of skull shape, facial features, and the like. Nonsense then and nonsense now, though one might plead ignorance for the nonsense of the 19th century. That is not possible today.
Midnight Writer
(21,751 posts)
2. A single Roger Stone may skew the phrenology data for decades.
2naSalit
(86,577 posts)
4. LOL!
No doubt about it!
delisen
(6,042 posts)
3. Pseudo-science
2naSalit
(86,577 posts)
5. Eugenics of the 21st century. ...nt
appalachiablue
(41,131 posts)
6. 'A tech-to-prison pipeline'
Brave new dystopia