AI Could Turn the Police State Into a Nightmare

Mason Mohon | @mohonofficial

In the 2002 Steven Spielberg film Minority Report, a special police unit prevents crimes before they happen based on the premonitions of psychics. Once again, reality imitates art. But instead of psychics, police are using artificial intelligence (AI) to predict future crimes. This won’t decrease crime, though. Rather, it’s a police state nightmare in the making.

The U.S. Prison State

The United States has the highest incarceration rate of any country on Earth. Although it holds only 4.4% of the world’s population, American prisons hold 22% of the world’s prisoners. This is for a few reasons:

  • The United States’ war on drugs, started in 1971, sweeps up swaths of nonviolent offenders year after year. Since 1980, arrests for drug possession have increased by nearly 200%. On top of that, a disproportionate number of those convicted for drug crimes are people of color, even though statistics do not show that they use drugs at higher rates than white people.
  • Ridiculous financial regulations land many others in prison. Recently, a New Jersey man was indicted simply for trading Bitcoin. Charlie Shrem was imprisoned because the Bitcoin he sold was used to buy drugs on the Silk Road. Ross Ulbricht is serving a double life sentence for making a website.
  • 39% of the prison population is incarcerated either for nonviolent crimes or for longer than any reasonable sentence.

Clearly, the United States has a serious prison problem. We are locking up way too many people, and using AI will not make the problem any better.

Predictive AI

Predictive AI is now on the scene as a potential crime-reduction measure. But rather than reduce crime, it will most likely lead to a nightmarish police state where “innocent until proven guilty” goes out the window. With this technology, you may never get the chance to prove your innocence: if the algorithm says you’re guilty, you’re guilty.

This technology could easily lead to certain populations being unfairly targeted because of historically biased enforcement of the law. People of color do not use drugs at higher rates than white people, yet police disproportionately target them for drug crimes. Trained on data showing that more people of color have been locked up for drug offenses, an AI could conclude that they are more likely to be criminals, as the sketch below illustrates. A racist AI would obviously be a terrible addition to our justice system.
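To make that feedback loop concrete, here is a minimal, hypothetical Python sketch (a toy simulation, not any real policing system): two groups use drugs at exactly the same rate, but one is patrolled twice as heavily, so the arrest records any model would be trained on make that group look twice as “criminal.”

```python
import random

random.seed(0)

# Toy assumption: both groups use drugs at the SAME underlying rate.
TRUE_USE_RATE = 0.10
# Toy assumption: group B is stopped/searched twice as often as group A.
PATROL_RATE = {"A": 0.10, "B": 0.20}

def simulate_arrest_rate(group, n=100_000):
    """Fraction of group members who end up with a drug arrest on record."""
    arrests = 0
    for _ in range(n):
        uses_drugs = random.random() < TRUE_USE_RATE
        stopped = random.random() < PATROL_RATE[group]
        if uses_drugs and stopped:
            arrests += 1
    return arrests / n

for group in ("A", "B"):
    # A naive risk model trained on these records equates arrest history
    # with criminality, so group B scores twice as "risky" despite
    # identical behavior.
    print(f"group {group}: recorded arrest rate = {simulate_arrest_rate(group):.2%}")
```

Running this prints an arrest rate of roughly 1% for group A and 2% for group B. If police then respond to the model’s higher scores by patrolling group B even more heavily, the next round of training data is even more skewed, and the bias compounds.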

Thankfully, experts in artificial intelligence are recognizing this problem. Researchers from MIT, Harvard, Princeton, NYU, UC Berkeley, and Columbia shared a letter earlier this month detailing the potential issues with predictive AI. The letter reads:

Today’s pretrial risk assessments are ill-equipped to support judges in evaluating and effectively intervening on these specific risks, because the outcomes that these tools measure do not match the risks that judges are required by law to consider.

The experts realize that crime-predicting AI in its present form is problematic. But technology develops rapidly, and it’s not hard to imagine a better, more “fleshed-out” actuarial artificial intelligence filling the gaps left by current tools. Technology that determines who is guilty, who is not, and who will be guilty in the future sets a terrible precedent for justice and hands the police state a powerful new tool. We need to avoid such programs.
