Gero - Blog Post 3

        In “Are Algorithms Value-Free,” Professor Johnson argues that algorithms are never value-free and that assuming their objectivity has real consequences. For example, the COMPAS program (Correctional Offender Management Profiling for Alternative Sanctions) is used by judges to assess bail decisions and recidivism risk (Johnson, page 14). Each defendant is assigned a risk score that often sways the judge’s ruling (Johnson, page 17). COMPAS has been regarded as biased against African Americans because it “was almost twice as likely to falsely label black defendants as future criminals than white defendants, while often mislabeling white defendants as low risk at a higher rate than black defendants” (Johnson, page 14). Professor Johnson suggests that instead of aiming for a value-free, “fair” algorithm, an approach that has in practice produced biased results, it is essential to incorporate value-laden considerations (Johnson, page 16). 

            These value-laden considerations relate to our discussion of Cheryl Harris’s argument on Affirmative Action in “Whiteness as Property.” Harris claims that Affirmative Action is necessary because the “neutrality” of admissions decisions depends on notions of colorblindness, such as those underlying Bakke’s argument in Regents of the University of California v. Bakke (Harris, page 1771). Harris earlier asserted that embracing the norm of colorblindness actually protects the property interest in whiteness, under which “race is a color and color does not matter” (Harris, page 1768). Upholding colorblindness as the ideal is problematic because, in the words of Neil Gotanda, “color blindness is a form of race subordination in that it denies the historical context of white domination and Black subjugation” (Harris, page 1768). 

            In applying Harris’s critique of colorblindness to the COMPAS program, Harris would suggest that value-laden considerations actively account for race instead of trying to eliminate it as a factor in assigning a risk score. This may manifest as automatically lowering the risk scores assigned to Black defendants to correct for algorithmic bias. Determining exactly how much a score should be lowered presents an issue in itself, and there may be other ways for COMPAS to account for race. It is especially important to recognize and apply Harris’s argument in the context of the criminal justice system, given the past and ongoing harm it inflicts on Black communities. The goal should not be to eradicate race as a factor, but to understand and account for algorithmic and human cognitive biases. 
