California’s Prop 25 would replace cash bail with algorithms, but questions around fairness and transparency remain

As a California voter, I know state propositions can be confusing. Even with good intentions, if you don’t read carefully and do your research, you can make decisions you come to regret. Proposition 25 is a prime example.

A “no” vote on Prop 25 would repeal SB 10, a law the California state legislature passed in August 2018 to replace cash bail with algorithmic risk assessment. The law was scheduled to take effect in October 2019, but implementation was put on hold after the cash bail industry, in an effort to survive, qualified the referendum that became Prop 25.

A “yes” vote upholds SB 10 and puts you in line with Governor Gavin Newsom, Congressperson Karen Bass (D-CA), and the Service Employees International Union (SEIU). A “no” vote puts you in the company of Human Rights Watch, California’s ACLU affiliates, and the NAACP, as well as the Republican Party of California.

If Prop 25 passes, California would be the first state in the country to enact such a policy, sounding the death knell for the local cash bail industry. According to a 2011 study, the United States and the Philippines are the only two nations that rely on cash bail to ensure people show up for a criminal trial. Naturally, cash bail puts people with financial means at a considerable advantage.

Efforts to replace the cash bail system with an algorithm are part of a broader criminal justice reform push. Advocates see algorithms as an impartial option that removes the financial burden of cash bail, which disproportionately impacts poor people. Under SB 10, most people arrested for a misdemeanor would be released within 12 hours without any assessment; exceptions to that rule include repeat offenders and people accused of violent crimes. Those who don’t qualify for automatic release would undergo an algorithmic risk assessment, and only people deemed high risk could be held before arraignment.

Because the risk assessment approach eliminates bail money entirely, supporters say it can correct social and economic inequities. That’s particularly important in California, where income inequality is rising.

People in jail before trial make up a sizable part of the U.S. incarcerated population. A Prison Policy Initiative study released earlier this year found that of the nearly 2.3 million people incarcerated nationwide, 460,000 are in jail awaiting trial. In California, the rate of pretrial incarceration runs above the national average.

California courts played a role in the reform that led to the passage of SB 10. In her 2016 State of the Judiciary address, Chief Justice Tani Cantil-Sakauye said, “We must not penalize the poor for being poor.” SB 10 was informed by a pretrial detention reform workgroup Cantil-Sakauye assembled. Following conversations with 40 stakeholder organizations, the group declared cash bail “unsafe and unfair” in 2017 and issued 10 recommendations, including one advocating use of a pretrial risk assessment algorithm. Cantil-Sakauye described SB 10 as a bill that transforms “an outdated, unfair, and unsafe system into one that not only effectively protects the due process rights of the accused but also public safety.”

While these arguments sound compelling, those opposing Prop 25 fear algorithms will automate oppression and reinforce existing bias. Unfortunately, such concerns are not novel. Racial bias was documented years ago in algorithms that aim to predict recidivism, like COMPAS. And biases have been found in a variety of other algorithms that heavily impact people’s lives — from academic grading and remote testing algorithms to those used in hiring, health care, and lending.
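
To make that finding concrete: the core of the COMPAS critique was disparate error rates, meaning the tool flagged people who did not go on to reoffend as high risk at very different rates depending on race. Here is a minimal sketch of that kind of check, using fabricated records rather than real COMPAS data:

```python
# Sketch: checking a risk tool for disparate false positive rates.
# All records here are fabricated for illustration; this is not COMPAS data.

def false_positive_rate(records, group):
    """Share of non-reoffenders in `group` whom the tool flagged as high risk."""
    non_reoffenders = [r for r in records if r["group"] == group and not r["reoffended"]]
    if not non_reoffenders:
        return 0.0
    flagged = sum(1 for r in non_reoffenders if r["flagged_high_risk"])
    return flagged / len(non_reoffenders)

# Hypothetical audit records: group label, tool's flag, actual outcome.
records = [
    {"group": "A", "flagged_high_risk": True,  "reoffended": False},
    {"group": "A", "flagged_high_risk": True,  "reoffended": True},
    {"group": "A", "flagged_high_risk": True,  "reoffended": False},
    {"group": "A", "flagged_high_risk": False, "reoffended": False},
    {"group": "B", "flagged_high_risk": True,  "reoffended": False},
    {"group": "B", "flagged_high_risk": False, "reoffended": False},
    {"group": "B", "flagged_high_risk": False, "reoffended": True},
    {"group": "B", "flagged_high_risk": False, "reoffended": False},
]

for g in ("A", "B"):
    print(f"group {g}: false positive rate = {false_positive_rate(records, g):.2f}")
# A tool can have reasonable overall accuracy while one group's
# non-reoffenders are flagged far more often; that disparity was
# the heart of the COMPAS critique.
```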

In a worst-case scenario, criminal justice applications of AI could combine tech like facial recognition, predictive policing, and risk assessment tools and lead to a vicious cycle of automated discrimination. More individuals, particularly people of color, would end up in jail not because they pose a credible threat to society or are a flight risk, but because AI trained with dirty data has reinforced and potentially accelerated existing inequities.
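
The mechanism behind that cycle is easy to state: if a model is trained on arrest records and its output directs more enforcement toward the places it flags, the extra enforcement produces more arrest records there, which inflates the model’s next round of predictions. Here is a toy simulation of that loop, with invented numbers and a deliberately crude allocation rule, in which two neighborhoods with identical true offense rates drift apart in the data:

```python
# Toy simulation of a predictive-policing feedback loop. All numbers and the
# allocation rule are invented for illustration; this models no real system.

TRUE_OFFENSE_RATE = 0.10                    # identical in both neighborhoods
recorded_arrests = {"A": 12.0, "B": 10.0}   # small initial gap in the records

for step in range(5):
    # "Model": concentrate patrols wherever more arrests are already on record.
    leader = max(recorded_arrests, key=recorded_arrests.get)
    patrols = {n: (70.0 if n == leader else 30.0) for n in recorded_arrests}
    # More patrols in a neighborhood means more of its offending gets recorded,
    # even though the underlying offense rates never differ.
    for n in recorded_arrests:
        recorded_arrests[n] += patrols[n] * TRUE_OFFENSE_RATE
    share_a = recorded_arrests["A"] / sum(recorded_arrests.values())
    print(f"step {step}: share of recorded arrests in A = {share_a:.3f}")
# The gap in the records widens every step: "dirty data" reinforcing itself.
```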

Evolving attitudes about pretrial risk assessment analysis

After the passage of SB 10 in summer 2018, AI, tech, and legal experts released analyses of risk assessment algorithms. Among them:

A 2019 analysis of risk assessment tool usage throughout the United States found that nine out of 10 pretrial detention agencies in the country use some form of risk assessment algorithms to guide pretrial detention. The report also found that 28 states use them to guide parole decisions and 20 states use them for sentencing. Overall, the review found no clear evidence that risk assessment tools reduce prison populations or racial inequality in the criminal justice system and cautioned that proving positive social justice outcomes would require additional study.

Earlier this year, the Media Mobilizing Project and MediaJustice collected data about risk assessment tools in virtually every U.S. state. They found that one in three U.S. counties currently uses a pretrial assessment tool, but they could not find evidence that the tools reduce racial disparities due to a lack of access to data.

In April 2019, a collective of nonprofits and tech giants called the Partnership on AI released a report cautioning against the use of risk assessment tools. The policy recommendation was specifically made in response to SB 10 in California and the First Step Act passed by Congress. The report cites validity issues, data sampling bias, and bias in statistical predictions among the reasons for its recommendation.

In July 2019, a diverse group of AI, data, and legal experts concluded that serious technical flaws undermine risk assessment algorithms’ accuracy and validity, due in large part to their use of historical data. In a letter sent to legislators and judicial leaders in California and Missouri, the group wrote, “Risk assessments that incorporate this distorted data will produce distorted results. These problems cannot be resolved with technical fixes. We strongly recommend turning to other reforms.”

In December 2019, a group of tech and policy experts working with Aspen Tech Policy Hub explored ways to successfully implement risk assessment tools in California. Part of the Aspen Institute, the Aspen Tech Policy Hub invites people from tech and government backgrounds to create tech policy solutions using a model akin to a startup accelerator.

The trio found that monitoring pretrial risk assessments will require data from four or five sources, including the sheriffs’ departments that manage county jails, the courts, and the National Crime Information Center database. Each of these is maintained independently today, making data collection an arduous task. The group also found that many counties cannot afford to employ technical experts to maintain the data systems.

The group looked at small and large counties across California and found that no single technical solution would be scalable to all counties. The researchers reported that disconnected systems made data collection for risk assessment tools too time-consuming and expensive for some government agencies to manage. They also found no statewide agreement or comprehensive plan for how algorithmic systems were to be administered.

“There are no standards in place, to our knowledge. This means that the outcomes, implicit bias, successes, and failures cannot be accurately and efficiently tracked statewide. There is no standardized or comprehensive method/tool/system to measure if these systems have been implemented correctly, are being administered responsibly, or if outcomes and risks are being assessed uniformly,” Aspen Tech Policy Hub fellows Allison Day, Anil Dewan, and Karissa McKelvey told VentureBeat in an email.

“Our research led us to believe that this would cause auditing to be under-resourced and inefficient for the majority of counties, leading to inadequate evaluation overall. Without timely and standardized oversight, these pretrial risk assessment tools could increase incarceration rates in some counties, rather than reduce them, as well as perpetuate the biases inherent in the data being used to inform release decisions.”

Assessments for assessment algorithms

Under SB 10, the Judicial Council will have to report its evaluation of risk assessment algorithms to the Governor and state legislature every two years.

About a year ago, lawmakers passed SB 36 in an attempt to put guardrails around the evaluation of risk assessment algorithms in California. Written in part by SB 10 coauthor Sen. Robert Hertzberg, along with the Ella Baker Center for Human Rights, SB 36 requires the California Judicial Council to publish predictive accuracy data related to race or ethnicity, gender, income level, and offense type, starting in December 2021.

The execution of SB 36 requirements is left to each of California’s 58 counties, but the law requires the Judicial Council to release annual reports documenting data collected for risk assessment algorithms statewide.
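
To give a rough sense of what that kind of reporting involves, the sketch below breaks a tool’s predictive accuracy out by a single demographic attribute. The records and field names are hypothetical; neither SB 36 nor the Judicial Council prescribes an implementation:

```python
# Sketch: a subgroup accuracy breakdown of the kind SB 36 calls for.
# All records are hypothetical; real reporting would also cover gender,
# income level, and offense type, with far larger samples per subgroup.
from collections import defaultdict

# (reported race/ethnicity, tool predicted reoffense, person actually reoffended)
records = [
    ("group_1", True, True), ("group_1", True, False), ("group_1", False, False),
    ("group_1", False, True), ("group_2", True, True), ("group_2", False, False),
    ("group_2", False, True), ("group_2", True, False),
]

correct = defaultdict(int)
total = defaultdict(int)
for group, predicted, actual in records:
    total[group] += 1
    correct[group] += int(predicted == actual)

for group in sorted(total):
    print(f"{group}: accuracy = {correct[group] / total[group]:.2f} (n={total[group]})")
```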

SB 10 also requires some Judicial Council reporting, but it leaves serious questions unanswered. For example, SB 10 does not appear to clearly define how pretrial risk assessment algorithms will be assessed for fairness. That critical work of defining rules for pretrial risk assessment falls to the Judicial Council, which is also charged with creating an approved list of algorithms that county pretrial service departments can use to assess the flight and public safety risk of people accused of crimes.

Before SB 10 was challenged by Prop 25, the Judicial Council had begun receiving public comment on rules and standards for risk assessment algorithms. But a Judicial Council spokesperson told VentureBeat that the work was put on hold after Prop 25 made its way onto the ballot.

Lingering questions

A survey of risk assessment tools in the U.S. released earlier this year found that virtually every California county already uses a risk assessment algorithm. The survey also found that the state is using nearly every kind of assessment tool available, from the Public Safety Assessment (PSA) to tools developed by pretrial services agencies in Ohio and Virginia.

The 2019 California state budget set aside $75 million for a two-year pilot program covering pretrial projects in 16 small, medium, and large counties. As part of that pilot, the Judicial Council is charged with assessing bias and any disparate impact of pretrial programs, but it has not yet shared any detailed data or findings.

SB 36 also requires independent, external audits. Such outsourcing of algorithm evaluations is part of a trend developing in private industry and government alike. Researchers from 30 organizations recently published a paper urging the AI community to create an external audit market as a way to put ethics principles into action. The researchers also advocated the creation of bias bounties, much as the cybersecurity sector offers cash rewards to people who discover bugs in code.

These extra validation steps are reassuring, but the big question Prop 25 leaves me with is how the state plans to define fairness. As we head to the polls today, we don’t know exactly how risk assessment algorithms will be evaluated if Prop 25 passes or what metrics will be used to assess fairness. We don’t know if the state-approved list of acceptable algorithms will force counties to adopt common risk assessment tools and standardize their systems or if the patchwork that exists today will continue, making standardized evaluation elusive.
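
Part of the difficulty is that common statistical definitions of fairness can disagree about the same tool. As an illustration (with invented data, not anything the state has adopted), the sketch below shows a tool that flags two groups at identical rates, satisfying one popular criterion, while wrongly flagging one group’s non-reoffenders far more often, violating another:

```python
# Two common fairness criteria applied to the same hypothetical tool output.
# Each record: (group, flagged high risk by tool, actually reoffended).
records = [
    ("X", True, True), ("X", True, True), ("X", False, False), ("X", False, False),
    ("Y", True, False), ("Y", True, False), ("Y", False, True), ("Y", False, False),
]

def flag_rate(group):
    """Demographic parity asks: is each group flagged at the same overall rate?"""
    rows = [r for r in records if r[0] == group]
    return sum(1 for r in rows if r[1]) / len(rows)

def false_positive_rate(group):
    """Equalized odds asks (in part): are non-reoffenders flagged equally often?"""
    rows = [r for r in records if r[0] == group and not r[2]]
    return sum(1 for r in rows if r[1]) / len(rows)

for g in ("X", "Y"):
    print(f"group {g}: flag rate = {flag_rate(g):.2f}, "
          f"false positive rate = {false_positive_rate(g):.2f}")
# Both groups are flagged at the same 0.50 rate, so demographic parity holds,
# yet group Y's non-reoffenders are flagged at 0.67 versus 0.00 for group X.
```

Results in the fairness literature show that, outside of special cases, several such criteria cannot all be satisfied at once when groups’ underlying reoffense rates differ, which is why the choice of metric matters so much.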

With the exception of bail businesses, most people seem to agree that the cash bail system is outdated and ineffective and should end. But given the recent history of algorithmic bias and a much longer history of bias in criminal justice, you don’t have to make up horror stories to understand what could go wrong or why automation makes people apprehensive. There is plenty of evidence that algorithms are capable of automating bias, as Ruha Benjamin details with her concept of a “New Jim Code.”

And algorithms aren’t the only path to reforming the cash bail system. For example, in response to the pandemic, California courts earlier this year adopted a statewide emergency bail schedule that effectively sets bail at $0 for most misdemeanors.

Final thoughts

A few weeks before SB 10 became law in 2018, more than 100 faith, human rights, legal, and privacy organizations banded together in opposition. In a shared statement, the group argued that risk assessment tools haven’t been proven to slow or curtail racial injustice in the criminal justice system.

The group wrote that nobody should use these tools at all, but added that if they are used, they should be subject to meaningful community oversight. The public should see a complete list of testing processes, the risk factors algorithms can consider, and the weights assigned to those factors. And the final risk score should be transparent and open to challenge by defendants and their legal counsel.

The coalition also advocates that data gathered by pretrial service organizations be audited by external data scientists. The concerns behind these recommendations are not new. In the weeks after the passage of SB 10, advocates from the Electronic Frontier Foundation (EFF) criticized California lawmakers for passing a bill without first establishing methods for scrutinizing bias, fairness, and accuracy. The group also concluded that risk assessment tools aren’t ready for public use and cautioned that if they are made law, steps must be taken to mitigate bias.

EFF legislative activist Hayley Tsukayama told VentureBeat that SB 36 addressed some issues raised in 2018. But she said many important terms are still undefined and issues surrounding replicability and transparency remain unanswered.

If Prop 25 passes today and SB 10 finally takes effect, its implementation will be a work in progress. If voters want real reform, they will need to keep advocating after Election Day for transparency best practices, robust public input, and replication and validation of results.

