First Step Act Marks Progress, But With Biased Algorithms

President Trump with Paul Cell of the International Association of Chiefs of Police at the signing of the First Step Act in November

Two weeks ago, at a Martin Luther King Jr. Day event in New York City, Representative Alexandria Ocasio-Cortez (D-NY) made headlines after arguing that algorithms can be racist. She was right.

“Algorithms are still made by human beings, and those algorithms are still pegged to basic human assumptions,” Ocasio-Cortez said. “They’re just automated assumptions. And if you don’t fix the bias, then you are just automating the bias.”

In other words, algorithms are created by humans and trained on human data, and humans are biased. Because the criminal legal system perpetuates institutional racism instead of remedying it, algorithms are appealing: they seem impartial, rooted purely in math and science. That illusion of impartiality is why algorithms have proliferated as substitutes for human decision-making, from policing to pretrial bail to sentencing — decisions about who ought to be free and who ought not to be. When left unmonitored, however, algorithms simply pick up where police officers and judges left off and continue acting with bias against low-income people of color.
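
To see how bias gets automated, consider a minimal sketch in Python. Every number and name here is invented for illustration: two groups offend at identical rates, but one is policed more heavily, so a model that learns from arrest records rather than from behavior assigns that group a higher "risk."

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Two groups with IDENTICAL true offense rates.
group = rng.integers(0, 2, n)       # group 0 and group 1
offended = rng.random(n) < 0.10     # same 10% base rate for everyone

# Biased data collection: group 1 is policed more heavily,
# so its offenses are far more likely to end up on record.
caught_prob = np.where(group == 1, 0.60, 0.20)
arrested = offended & (rng.random(n) < caught_prob)

# A naive "risk model" that simply learns the historical arrest
# frequency per group -- an automated assumption, nothing more.
for g in (0, 1):
    print(f"group {g}: learned risk = {arrested[group == g].mean():.3f}")
# group 0: learned risk ~ 0.020
# group 1: learned risk ~ 0.060
# Identical behavior, a threefold difference in predicted "risk."
```

The model never sees race or policing intensity; it inherits them silently through the arrest records it is trained on.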

Despite this lack of accountability and transparency around algorithms and the decisions they make, they have been central to recent criminal justice reform. On December 21, 2018, President Donald Trump signed the Formerly Incarcerated Reenter Society Transformed Safely Transitioning Every Person Act (First Step Act), breaking an impasse on sorely needed criminal justice reform. The First Step Act aims to reduce time spent in prison through a risk and needs assessment system paired with evidence-based recidivism reduction programs. These provisions will enable incarcerated people to earn time credits toward early release — but not all of them.

The risk and needs assessment system uses algorithms to determine who can use these credits towards early release by evaluating each prisoner’s recidivism risk. Simply put, this system bases time spent in prison on crimes that have not been committed yet. Like most features of the criminal legal system, it is likely to disproportionately impact low-income people of color.
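
Mechanically, the gatekeeping works something like the sketch below. The scoring function is hypothetical (the act left the actual tool's design to the Department of Justice), but the gate itself tracks the statute: only people assessed at the lowest risk levels may apply their earned credits toward early release.

```python
def risk_score(prior_convictions: int, age: int) -> int:
    """Hypothetical score, weighted toward criminal history
    the way most real risk tools are."""
    return 2 * prior_convictions + (2 if age < 25 else 0)

def risk_tier(score: int) -> str:
    """Bucket the score into tiers (cutoffs invented for illustration)."""
    if score <= 1:
        return "minimum"
    if score <= 4:
        return "low"
    return "high"

def may_apply_credits(tier: str) -> bool:
    # Earned credits only shorten time in prison for people the tool
    # places in the bottom tiers; everyone else earns but cannot spend.
    return tier in ("minimum", "low")

# Three prior convictions gates a person out before any individual
# circumstance is considered: risk_score(3, 30) == 6 -> "high" tier.
print(may_apply_credits(risk_tier(risk_score(3, 30))))  # False
```

Note what the inputs are: history, not conduct. A person's past record, itself shaped by biased policing, decides whether future good behavior can count.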

An algorithm that prevents a prisoner from using these credits because of previous criminal history or predicted recidivism risk fails to consider that low-income people of color, particularly low-income black people, are more likely to be incarcerated even though they are not more likely to commit crimes. People of color make up 67 percent of the prison population, and one in three black men will be imprisoned in his lifetime, yet there is no meaningful difference in the rates at which people of different races break the law. Black people and white people use drugs at similar rates, for example, but black people are imprisoned for drug offenses at nearly six times the rate of white people, and nonviolent drug offenses account for 49 percent of convictions in the federal prison system. That is not to say that anyone should be incarcerated for a nonviolent drug offense. The point is that algorithms, by nature, cannot fully account for these racial disparities; what they can do is make statistical generalizations, substituting group history for individual circumstances.

Beyond just making generalizations, a 2016 ProPublica study found that a widely used risk assessment tool in Florida erred along racial lines: black defendants were nearly twice as likely as white defendants to be falsely labeled high risk. Furthermore, the tool was wildly unreliable — only 20 percent of the people it predicted would commit violent crimes again actually did. A 2018 study by researchers at Dartmouth College found that one of the most commonly used risk assessment tools is “no more accurate or fair than predictions made by people with little or no criminal justice expertise.” Never mind that these tools are developed by for-profit companies whose revenues depend on the maintenance of a criminal legal system so flawed that it requires such software in the first place.
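
ProPublica's finding is about asymmetric errors, which is easy to state in code. In the sketch below the confusion-matrix counts are invented, chosen only to land near the error rates ProPublica reported; the point is that a single tool can produce sharply different false positive rates (labeled high risk, then did not reoffend) for black and white defendants.

```python
# Invented per-group confusion-matrix counts, shaped to approximate
# the error rates ProPublica reported for the tool it studied.
groups = {
    "black": {"fp": 805, "tn": 990, "fn": 532, "tp": 1369},
    "white": {"fp": 349, "tn": 1139, "fn": 461, "tp": 505},
}

for name, c in groups.items():
    fpr = c["fp"] / (c["fp"] + c["tn"])  # flagged high risk, did not reoffend
    fnr = c["fn"] / (c["fn"] + c["tp"])  # flagged low risk, reoffended
    print(f"{name}: false positive rate {fpr:.0%}, "
          f"false negative rate {fnr:.0%}")
# black: false positive rate 45%, false negative rate 28%
# white: false positive rate 23%, false negative rate 48%
```

A tool can look "accurate" on average while its mistakes fall almost entirely in one direction for one group: black defendants over-punished by false alarms, white defendants under-flagged.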

Despite its inherent biases, the First Step Act’s risk and needs assessment system is a cornerstone of the act’s aim of reducing time spent in prison, and similar programs have been implemented around the nation. Such sentencing reforms are essential, and they are estimated to free thousands of unnecessarily incarcerated people; it is equally essential, though, to remain conscious of the individuals and whole communities falling through the cracks.

As long as institutional racism exists, algorithms will continue perpetuating it, both inside and outside of the criminal legal system — unless critical, thoughtful humans counteract these biases. Better yet, we might question common societal assumptions about crime and prison, and why incarceration is even equated with justice.

 
Opinion | Kalley Huang