SENTENCING SOFTWARE

JOHN KAYLEE JIMENEZ
JOHN PHILIP MONERA
LEONIL OCANA

SENTENCING SOFTWARE: SOFTWARE USED BY JUDGES IN SENTENCING HEARINGS TO ASSESS THE RISK OF RECIDIVISM.

COMPAS - CORRECTIONAL OFFENDER MANAGEMENT PROFILING FOR ALTERNATIVE SANCTIONS

SOME OF THE ETHICAL DILEMMAS

• In May 2016, the investigative news organization ProPublica claimed that COMPAS is biased against black defendants. Northpointe, the Michigan-based company that created the tool, released its own report questioning ProPublica’s analysis. ProPublica rebutted the rebuttal, academic researchers entered the fray, The Washington Post’s Wonkblog weighed in, and even the Wisconsin Supreme Court cited the controversy in its recent ruling that upheld the use of COMPAS in sentencing.

• ProPublica points out that among defendants who ultimately did not reoffend, blacks were more than twice as likely as whites to be classified as medium or high risk (42 percent vs. 22 percent). Even though these defendants did not go on to commit a crime, they are nonetheless subjected to harsher treatment by the courts. ProPublica argues that a fair algorithm cannot make these serious errors more frequently for one race group than for another.

DISTRIBUTION OF DEFENDANTS ACROSS RISK CATEGORIES BY RACE. BLACK DEFENDANTS REOFFENDED AT A HIGHER RATE THAN WHITES, AND ACCORDINGLY, A HIGHER PROPORTION OF BLACK DEFENDANTS ARE DEEMED MEDIUM OR HIGH RISK. AS A RESULT, BLACKS WHO DO NOT REOFFEND ARE ALSO MORE LIKELY TO BE CLASSIFIED HIGHER RISK THAN WHITES WHO DO NOT REOFFEND.
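The disparity ProPublica describes above is, in effect, a comparison of false positive rates across groups: among people who did not reoffend, how many were still labeled medium or high risk. Below is a minimal sketch of how such a rate could be computed; the records are made-up illustrative data, not ProPublica's dataset or analysis code.

```python
# Sketch of the metric described above: among defendants who did NOT
# reoffend, what share was labeled medium or high risk, by race.
# These records are hypothetical examples, not real COMPAS data.

records = [
    # (race, risk_label, reoffended)
    ("black", "high",   False),
    ("black", "medium", False),
    ("black", "low",    False),
    ("black", "high",   True),
    ("white", "low",    False),
    ("white", "low",    False),
    ("white", "high",   False),
    ("white", "medium", True),
]

def false_positive_rate(rows, group):
    """Share of non-reoffenders in `group` labeled medium or high risk."""
    non_reoffenders = [r for r in rows if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1] in ("medium", "high")]
    return len(flagged) / len(non_reoffenders)

for group in ("black", "white"):
    rate = false_positive_rate(records, group)
    print(f"{group}: {rate:.0%} of non-reoffenders labeled medium/high risk")
```

ProPublica's fairness argument is that this rate should not differ markedly between racial groups, which is exactly the comparison the loop above makes.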



Using AI in investigations and sentencing could potentially help save time and money.



COMPAS’s algorithms have been found, overall, to be no more effective than human judgment. A study conducted by a researcher at Dartmouth College determined that humans were able to predict whether a defendant would reoffend about as accurately as COMPAS. Another study, by Rutgers University, reaffirmed COMPAS’s notably low success rates, especially when predicting the likelihood that someone who committed a violent crime would reoffend.
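As a rough illustration of the kind of comparison the Dartmouth study made, the sketch below scores two sets of predictions against the same observed outcomes; the lists are hypothetical, not data from the study or from COMPAS.

```python
# Compare human and algorithmic predictions against observed outcomes.
# All three lists are made-up illustrative data.

observed         = [True, False, True, False, False, True, False, True]  # did reoffend?
human_predicted  = [True, False, False, False, True, True, False, True]
compas_predicted = [True, True, True, False, False, False, False, True]

def accuracy(predictions, outcomes):
    """Fraction of predictions that match the observed outcomes."""
    correct = sum(p == o for p, o in zip(predictions, outcomes))
    return correct / len(outcomes)

print(f"human accuracy:  {accuracy(human_predicted, observed):.0%}")
print(f"COMPAS accuracy: {accuracy(compas_predicted, observed):.0%}")
```

"Just as well" in the studies above means the two accuracy figures come out roughly equal when computed this way over the same set of defendants.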

• COMPAS involves inputting the answers to over 100 questions about a person’s history, covering a variety of subjects, including criminal offenses, family, and even social life.



The resulting risk scores are intended as a means to guide courts in their sentencing.
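COMPAS's actual model is proprietary, so the sketch below is purely hypothetical: it only illustrates the general shape of a questionnaire-driven risk score, with made-up fields, weights, and cut-offs that are not Northpointe's.

```python
# Purely illustrative: map questionnaire answers to a risk category.
# Fields, weights, and thresholds are hypothetical, not COMPAS's model.

answers = {
    "prior_offenses": 3,
    "age_at_first_arrest": 19,
    "family_criminal_history": True,
    "stable_employment": False,
}

def risk_category(a):
    """Combine weighted answers into a score, then bucket it."""
    score = 0
    score += 2 * a["prior_offenses"]
    score += 3 if a["age_at_first_arrest"] < 21 else 0
    score += 2 if a["family_criminal_history"] else 0
    score += 2 if not a["stable_employment"] else 0
    if score >= 9:
        return "high"
    if score >= 5:
        return "medium"
    return "low"

print(risk_category(answers))  # prints "high" for these hypothetical answers
```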

SOURCES

• https://gcn.com/articles/2018/01/18/recidivism-prediction-software-flaws.aspx

• http://si410wiki.sites.uofmhosting.net/index.php/Criminal_sentencing_software

• https://www.washingtonpost.com/news/monkey-cage/wp/2016/10/17/can-an-algorithm-be-racist-our-analysis-is-more-cautious-than-propublicas/?noredirect=on
