Stopping the next Derek Chauvin: Minneapolis to invest in software to flag problem cops
Minneapolis wants to spend $1.25 million on new technology to help identify warning signs of bad policing. But the city has tried this before and failed.
Derek Chauvin had done it before. Three years before encountering George Floyd in front of Cup Foods, the veteran officer struck 14-year-old John Pope with a flashlight and pinned him down by the neck for 20 minutes. That same year, Chauvin pressed his knee into Zoya Code's throat, in what a recent lawsuit from Code calls Chauvin's "signature move."
Despite this pattern of violent policing, Minneapolis police kept Chauvin on the streets — and even entrusted him with training rookies in the field.
Now, Minneapolis is looking to invest in new software designed to raise red flags at the first signs of officers displaying patterns of dangerous conduct. With help from a Pohlad Foundation grant, the city will spend a projected $1.25 million over the next five years to purchase and maintain a new "early intervention system": data-collection technology that gives police the opportunity to address conduct before it escalates to a catastrophic incident.
Early warning systems are not new; American law enforcement agencies have been using them in some form for 40 years. But with evolving technology, the latest generation of developers is selling automated software far more advanced than the pen-and-paper systems of the '80s and '90s. And these systems come with big promises for city governments looking to avoid costly lawsuits or facing crises over accountability.
Ron Huberman, CEO of Chicago-based Benchmark Analytics, calls his product "the Holy Grail of police reform" for its data-driven approach to addressing police conduct. Benchmark launched in 2017 and is now one of the leaders in the industry. Its website calls its product a "revolutionary, all-in-one solution" in a time when "policing in America is at a crossroads."
Some police agencies have adopted Benchmark's system to help boost public trust, such as the one in Harvey, Ill., which is trying to repair its reputation after a corruption scandal led to an FBI raid on the police station.
Yet even the most cutting-edge technology isn't capable of eliminating police misconduct entirely, said Seth Stoughton, who has studied early warning technology as a professor at the University of South Carolina School of Law.
"[They] are not Minority Report," he said, a reference to the Philip K. Dick story about mutants who see crime before it occurs. "They're not predicting with complete certainty that this officer's going to do something wrong in the future."
And Minneapolis tried and failed to effectively implement similar systems over the decade preceding Chauvin's murder of Floyd, a record that reveals the technology's greatest weakness: the humans using it.
Red flags to 'save careers'
For some U.S. police departments, early intervention technology has become mandatory.
After a yearlong Justice Department investigation found a pattern of dangerous and racist behavior in the Chicago Police Department, the city signed a consent decree agreeing to a series of wholesale reforms. The decree mandated that Chicago police leadership adopt early intervention technology, which "enables them to proactively identify at-risk behavior by officers under their command."
The University of Chicago has helped develop a new system to follow the prescriptions of the agreement, such as providing police brass with a data dashboard of common metrics and allowing for "peer group analysis" to ferret out problem units within the police force.
Minneapolis could soon face a similar directive. The city is negotiating a consent decree with state human rights officials over illegal policing and is anticipating another from the Justice Department in coming months.
In recent months, police and IT officials have been meeting to develop desired features for an early intervention system, and the city plans to send out a request for proposals in the next couple of months, said Minneapolis Police Commander Chris Granger.
Granger said the department wants a system capable of deep data analysis that can pull from the city's disparate computer systems and offer comparative analysis of one officer against another. The department also wants a system that can learn, with enough nuance to identify priority cases of officers "truly at risk of poor performance and in need of intervention," and to store data on past incidents from initial alert to resolution.
Each intervention system is different, but a typical one will weigh factors like excessive use of force — a head strike or a few punches or kicks in a short window of time may trigger a red flag — disciplinary action, alcohol or other substance-related incidents, and citizen-generated complaints.
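At its simplest, that kind of red flag can be thought of as a weighted tally of incidents over a rolling time window. The Python sketch below is a minimal, hypothetical illustration of that logic; the incident categories, weights and threshold are invented for illustration, and commercial systems rely on far more elaborate, proprietary models.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical incident record. A real deployment would pull these from
# use-of-force reports, internal-affairs files and complaint databases.
@dataclass
class Incident:
    officer_id: str
    kind: str          # e.g. "head_strike", "punch_or_kick", "citizen_complaint"
    occurred: date

# Illustrative weights and threshold only; vendors treat their actual
# scoring models as proprietary.
WEIGHTS = {
    "head_strike": 5,
    "punch_or_kick": 2,
    "substance_incident": 3,
    "disciplinary_action": 3,
    "citizen_complaint": 1,
}
FLAG_THRESHOLD = 8           # score at which a supervisor is alerted
WINDOW = timedelta(days=90)  # rolling window of recent conduct

def should_flag(incidents: list[Incident], today: date) -> bool:
    """Return True if an officer's recent incidents cross the threshold."""
    recent = (i for i in incidents if today - i.occurred <= WINDOW)
    return sum(WEIGHTS.get(i.kind, 0) for i in recent) >= FLAG_THRESHOLD
```

Under this toy scoring, for example, a head strike plus a couple of punches inside the 90-day window would push an officer over the threshold and trigger an alert, mirroring the red-flag behavior described above.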
Huberman, of Benchmark, said a small percentage of police officers account for 87% of "major adverse investigations." The chances of such an incident grow exponentially when high-risk cops work together. The system can alert police supervisors early enough to intervene through focused training, staggering officers' shifts or other "non-disciplinary" means that can save money in the long run.
Some developers call their products officer-support systems, which also measure "compassion fatigue" or other mental health troubles in police agencies.
Vector Solutions-Acadis, a company based in Indiana, advertises its early intervention technology's ability to also flag positive behavior for police leadership, which "ignites motivation and continued success," according to its website.
The system is meant not as a component of internal affairs but as a tool for "saving careers," said Paul Boulware, a retired police commander who now works as a project consultant for the company's Guardian Tracking software.
Vector counts 1,400 clients, including EMS, fire and 911 centers, across all 50 states and in Canada. Boulware said the software is designed to help police create a "collective pool of knowledge" for leadership to better manage officers. Features also include officer-to-officer compliments and flags for officers who may need counseling after responding to difficult calls.
But even with "all the tools in the world," he said, "if there's not people actively engaged inside the department, it's not going to work."
Previous efforts 'died on the vine'
Some police departments see downsides to early intervention technology, said Stoughton.
Creating a paper trail on problem officers could become a liability if an officer ends up in a lawsuit, because it may give the appearance that the department didn't do enough to stop the misconduct, he said.
A police department's culture may also reject the technology if officers — especially at the top — see it as a punitive digital babysitter.
Stoughton compared early intervention technology to body-worn cameras: if officers don't turn the cameras on, they don't make a difference. In the same way, failing to enter data accurately and consistently is one example of how human error can distort the results and thwart a system's effectiveness, he said.
"Even assuming you have perfect input, there is a human factor of: what do you do with the output?" he said. "Because you can't over-rely on it, you can't under-rely on it. And that's not a problem technology can solve."
The human component is where Minneapolis has failed in the past.
In 2015, when the department was led by Chief Janeé Harteau, a Justice Department audit found significant "gaps" in an early intervention system the city had installed six years earlier.
Police staff didn't buy into the system or agree on what constituted "problematic behavior," according to the Justice Department report. They also perceived it as a human resources-type "officer wellness" program, rather than a tool for accountability and risk management. And the "lack of automation" prevented electronic flagging of behaviors of concern in a systematic manner.
"MPD should develop a new, prevention-oriented EIS that incorporates broad stakeholder input, improves officer performance, manages risks, provides a continuum of interventions and is supported by an automated information system," the Justice Department audit concluded.
Afterward, the city created a steering committee comprising police leaders, a city attorney, police union officials, city staff and members of the public to devise the parameters of a more effective system. The committee determined the system should track use-of-force factors like punches, kicks and police-dog bites, along with lawsuits and complaints to internal affairs or the Office of Police Conduct Review, according to meeting minutes from 2017. It would also track positive performance metrics, like letters or community feedback, media stories and performance reviews.
David Bicking, a member of the steering committee, said he's "virtually certain" the system they devised would have flagged Chauvin, who had at least 16 misconduct complaints filed against him.
But the city never implemented it — at least not the way they recommended it, said Bicking. "Just like so many things, the ball was dropped. It died on the vine."
Though the police department told human rights investigators that it had invested in developing early intervention technology, the state found the system to be "non-existent," according to findings released in April.
The human rights investigation found that the lack of an early intervention system prevented police from identifying officers who needed support, such as one who said he was "paranoid" of Black men.
One high-ranking police official described the early intervention system as a mere "spreadsheet." Another said they were unaware one still existed.
Yet another described the MPD's early intervention system as "a unicorn," indicating that it was imaginary.