Police departments all over the world are testing algorithms that will help them increase surveillance and highlight known criminals
In Steven Spielberg’s classic 2002 film Minority Report, a specialized police department called the PreCrime unit was able to arrest murderers before they could commit their crimes.
Less than 12 years later, this approach is leaving the realm of science fiction and rapidly becoming a reality. But don’t worry: nobody is arresting people for crimes they didn’t commit. Yet.
Don’t even think about it
Instead, police departments all over the world are testing algorithms that will help them increase surveillance and highlight known criminals.
In Germany, police are testing a new system called “PreCobs”, a contraction of ‘Pre-Crime Observation System’ and a direct reference to Minority Report’s “PreCogs”. Developed by the Institute for Pattern-Based Prediction Technique in Oberhausen, the system analyzes the locations of past crimes to predict areas where crimes are more likely to occur in the future, allowing police to increase surveillance there.
The system is currently being tested in Munich and Nuremberg, and according to Bavarian interior minister Joachim Herrmann, the results of an earlier test in Zurich “look promising: 86 percent of the predictions were correct.”
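PreCobs’s actual model is proprietary, but the idea it describes, flagging areas with clusters of recent crimes for extra surveillance, can be sketched as simple grid-based hotspot counting. The coordinates, cell size, and threshold below are invented for illustration:

```python
from collections import Counter

# Hypothetical past incident coordinates (longitude, latitude) -- illustrative only
incidents = [(13.40, 52.52), (13.41, 52.52), (13.40, 52.52),
             (13.38, 52.51), (13.40, 52.53), (13.40, 52.52)]

CELL = 0.01  # grid cell size in degrees (assumed)

def to_cell(lon, lat):
    """Snap a coordinate to its grid cell."""
    return (round(lon / CELL), round(lat / CELL))

# Count incidents per cell; cells at or above the threshold get extra patrols
counts = Counter(to_cell(lon, lat) for lon, lat in incidents)
hotspots = [cell for cell, n in counts.items() if n >= 3]
```

A real system would weight recent incidents more heavily and account for the “near-repeat” effect, where one burglary raises the short-term risk in neighboring cells, but the output is the same in spirit: a short list of places to watch.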
The German system is meant to identify likely crime locations, but other systems take the analysis down to individual criminals. The London Metropolitan Police Service (MPS) is testing a system that analyzes individuals’ criminal histories and social media activity to predict how likely they are to commit a crime in the future. Developed by Accenture to help predict whether gang members would reoffend, the testing phase involved analyzing four years of crime data and checking whether the system accurately predicted the crimes committed in the following fifth year.
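Accenture has not published its model, but the back-testing procedure described, training on four years of data and validating against the fifth, is a standard temporal holdout. A toy sketch with invented records and a deliberately naive baseline predictor:

```python
# Hypothetical records: (year, offender_id, reoffended) -- illustrative only
records = [(2010, "a", True), (2011, "b", False), (2012, "a", True),
           (2013, "c", False), (2014, "a", True), (2014, "b", False)]

# Train on the first four years, hold out the fifth for validation
train = [r for r in records if r[0] < 2014]
test = [r for r in records if r[0] == 2014]

# Naive baseline "model": predict an offender reoffends if they ever did before
reoffenders = {oid for _, oid, hit in train if hit}
predictions = [(oid in reoffenders) for _, oid, _ in test]
actuals = [hit for _, _, hit in test]
accuracy = sum(p == a for p, a in zip(predictions, actuals)) / len(test)
```

The point of the split is that the model never sees the final year during training, so the accuracy figure reflects genuine forecasting rather than memorization.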
In a similar approach, the US cities of Baltimore and Philadelphia are testing a platform developed by criminologist Richard Berk, a professor at the University of Pennsylvania, to help parole officers decide how much supervision released prisoners will require. Currently, officers must assess a criminal’s personal records and make a judgment call; the machine learning algorithm seeks to make a more accurate assessment of each individual’s risk level and to help set bail amounts. The system, Berk said in an interview, would make “many fewer mistakes” than the old procedures.
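Berk’s models are reportedly trained on large historical datasets; the sketch below is not his method, only a loose illustration of how a risk score might be mapped to supervision tiers. Every feature, weight, and threshold here is made up:

```python
import math

# Hypothetical feature weights -- NOT Berk's actual model, purely illustrative
WEIGHTS = {"prior_arrests": 0.3, "age_at_release": -0.05, "violent_history": 1.2}
BIAS = -1.0

def risk_score(record):
    """Linear score squashed to a 0-1 pseudo-probability (logistic)."""
    z = BIAS + sum(WEIGHTS[k] * record[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def supervision_level(record, high=0.7, low=0.3):
    """Map the score to a supervision tier a parole officer might review."""
    p = risk_score(record)
    if p >= high:
        return "intensive"
    if p <= low:
        return "minimal"
    return "standard"

offender = {"prior_arrests": 7, "age_at_release": 24, "violent_history": 1}
level = supervision_level(offender)
```

Even in this toy form, the design question the real systems face is visible: the thresholds and weights, not the math, decide who gets flagged, and those choices encode policy.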