Algorithms are everywhere. Students use ChatGPT to write essays. TikTok curates each user’s feed based on their interests. But algorithms don’t just surface cute cat videos; they can seriously harm people and perpetuate systemic inequities. Algorithms often make bad or misleading predictions because they base conclusions about one person on data about other people. The Minnesota Pretrial Risk Assessment Tool (MNPAT) is an algorithm that Minnesota’s criminal courts have been using since 2018. Because of flaws in its design, Minnesota’s jails are unnecessarily full of people awaiting trial, people who are supposed to be treated as innocent until proven guilty.
After someone has been arrested and booked into a jail in Minnesota, the MNPAT generates a risk score for that person based on a variety of personal characteristics. Judges then rely on that score to determine conditions of release and a monetary bail amount. The risk score is supposed to estimate the likelihood that someone charged with a crime will (1) be rearrested before trial or (2) fail to appear in court before trial. But the algorithm doesn’t accurately predict either of those things.
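To make this concrete, here is a deliberately simplified sketch, in Python, of how a points-based pretrial risk tool operates. Every factor name, weight and cutoff below is invented for illustration; these are not the MNPAT’s actual inputs or scoring rules:

```python
# Hypothetical sketch of a points-based pretrial risk score.
# The factors, weights and cutoffs are invented for illustration;
# they are NOT the MNPAT's actual inputs, weights or bands.

HYPOTHETICAL_WEIGHTS = {
    "prior_arrests": 2,             # points per prior arrest on record
    "prior_failures_to_appear": 3,  # points per prior failure to appear
    "unemployed": 2,                # flat points if unemployed/underemployed
}

def risk_score(person: dict) -> int:
    """Sum weighted points for each factor present in the person's record."""
    score = 0
    score += HYPOTHETICAL_WEIGHTS["prior_arrests"] * person.get("prior_arrests", 0)
    score += HYPOTHETICAL_WEIGHTS["prior_failures_to_appear"] * person.get("prior_failures_to_appear", 0)
    if person.get("unemployed", False):
        score += HYPOTHETICAL_WEIGHTS["unemployed"]
    return score

def risk_category(score: int) -> str:
    """Map a raw score onto coarse release-recommendation bands."""
    if score <= 2:
        return "low"       # e.g., release on recognizance
    if score <= 6:
        return "moderate"  # e.g., release with conditions
    return "high"          # e.g., monetary bail or detention

# Two people who differ only in employment status get different scores.
employed = {"prior_arrests": 1, "unemployed": False}
unemployed = {"prior_arrests": 1, "unemployed": True}
print(risk_score(employed), risk_category(risk_score(employed)))      # 2 low
print(risk_score(unemployed), risk_category(risk_score(unemployed)))  # 4 moderate
```

Note how, in this toy version, employment status alone is enough to move an otherwise identical person into a higher band.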
In fact, the Minnesota judicial branch itself found that the version of the MNPAT used from 2018 through 2023 was not sufficiently predictive of pretrial failure, particularly for people categorized as higher risk. In other words, it was not very good at predicting whether someone would be rearrested or fail to appear before trial, the very things it was designed to predict. The judicial branch also found that the MNPAT had “negligible” predictiveness for Black and Native American people in Minnesota. For almost six years, Minnesota’s criminal courts used an inaccurate and discriminatory risk assessment tool.
In response to these findings, the courts approved a new version of the MNPAT that took effect in January 2024. The new version is supposed to minimize the racial and socioeconomic bias that plagued its predecessor. But the algorithm still assigns higher risk scores to people who are unemployed or underemployed, penalizing people in poverty for circumstances largely outside their control, circumstances for which people with more resources would never be penalized.
The MNPAT also still suffers from the “garbage in, garbage out” problem: when an algorithm is built on biased data, the predictions it produces are biased, too. The systemic and individual racism that have shaped policing, prosecution, housing and employment throughout Minnesota’s history are embedded in the data underlying the MNPAT, which means the tool itself is racially biased. Yet because judges and prosecutors think of the MNPAT as “objective,” they allow it to influence whether a person gets out of jail.
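A toy calculation, with entirely invented numbers, illustrates how that bias propagates. Suppose two hypothetical groups engage in the same behavior at the same rate, but one is policed twice as heavily; any tool that counts recorded arrests will rate that group as twice as risky on paper:

```python
# Hypothetical sketch of "garbage in, garbage out": if one group is
# policed more heavily, identical behavior produces more *recorded*
# arrests, and any tool scored or trained on arrest records inherits
# that skew. All numbers are invented for illustration.

true_offense_rate = 0.10  # same underlying behavior in both groups

# Assumed enforcement intensity: Group B is policed at twice the rate.
arrest_probability = {"A": 0.20, "B": 0.40}

def expected_recorded_arrests(group: str, population: int) -> float:
    """Expected arrests on record = offenses that occur * chance each is policed."""
    return population * true_offense_rate * arrest_probability[group]

for group in ("A", "B"):
    print(group, expected_recorded_arrests(group, population=1000))
# A 20.0
# B 40.0  -> twice the "risk" on paper, for identical behavior
```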
Moreover, the new tool does nothing to address the larger problems with pretrial risk assessments generally. Risk assessments work by taking information about the accused person and comparing it with data about groups of people to determine whether that person looks more like Group A or Group B. But predicting an individual’s behavior from group generalizations is both difficult and a violation of every Minnesotan’s right to due process under our Constitution.