Until now, we have been busy solving crimes after they happened. But what if we could predict them before they happen?
In any case, this is the scenario staged in the 2002 film Minority Report. As a reminder, the film, directed by Steven Spielberg, based on a short story by Philip K. Dick and starring Tom Cruise, depicts a dystopian world where an artificial intelligence is capable of predicting crimes in advance. Today, that fiction has edged closer to reality with the arrival of a new kind of artificial intelligence that anticipates crime. The concept is not entirely new, but the latest news is that an artificial intelligence expert has created an algorithm that can predict where and when a crime will happen with 80 to 90% accuracy.
The AI has been tested in 8 major US cities.
The new algorithm has been tested in several major US cities, and the results are particularly encouraging. It was developed by a team led by Professor Ishanu Chattopadhyay of the University of Chicago. The researchers trained the AI on crime data recorded in Chicago between 2014 and 2016, then divided the city into zones approximately 300 meters across.
The algorithm was then able to anticipate crime levels in each of these geographic areas during the weeks following the analysis. The researchers subsequently extended the tests to seven other major US cities: Atlanta, Austin, Detroit, Los Angeles, Philadelphia, Portland, and San Francisco. The results were equally encouraging.
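To give a rough idea of the approach described above, here is a minimal illustrative sketch in Python: it divides a city into ~300-meter tiles, aggregates incidents into weekly counts per tile, and makes a naive moving-average forecast for the next week. This is purely a hypothetical toy example, not the researchers' actual model, which uses far more sophisticated statistical inference; the function names and the 4-week window are our own assumptions for illustration.

```python
from collections import defaultdict

TILE_SIZE_M = 300  # approximate tile size reported for the study


def tile_of(x_m, y_m, tile_size=TILE_SIZE_M):
    """Map a point (meters from an arbitrary city origin) to a grid tile."""
    return (int(x_m // tile_size), int(y_m // tile_size))


def weekly_counts(events, tile_size=TILE_SIZE_M):
    """Aggregate (x_m, y_m, week) incident records into per-tile weekly counts."""
    counts = defaultdict(lambda: defaultdict(int))
    for x, y, week in events:
        counts[tile_of(x, y, tile_size)][week] += 1
    return counts


def forecast_next_week(counts, current_week, window=4):
    """Naive baseline: predict each tile's next-week level as the mean
    of its counts over the last `window` weeks."""
    forecast = {}
    for tile, by_week in counts.items():
        recent = [by_week.get(w, 0)
                  for w in range(current_week - window + 1, current_week + 1)]
        forecast[tile] = sum(recent) / window
    return forecast


# Toy synthetic incidents: (x meters, y meters, week index)
events = [(120, 450, w) for w in range(1, 5)] + [(900, 900, 2), (950, 920, 3)]
counts = weekly_counts(events)
pred = forecast_next_week(counts, current_week=4)
# Tile (0, 1) saw one incident every week, so its forecast is 1.0;
# tile (3, 3) saw incidents in only two of the four weeks, so 0.5.
```

The design choice worth noting is the spatial discretization: by forecasting counts per fixed-size tile rather than per individual, a model of this shape predicts *where* crime levels may rise, not *who* might offend.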
What if AI led to the arrest of innocent people?
However, an efficient AI is not always synonymous with progress. In this case, the convincing results have not reassured many observers. On the contrary, many worry about the potential damage such a tool could cause.
“People are concerned that this will be used as a tool to jail people before they commit crimes. That won’t happen, because it doesn’t have the ability to do that.”
For example, Chicago police already tested an artificial intelligence of the same kind a few years ago. The authorities used the tool to try to identify suspects and potential victims of shootings. The results were particularly disappointing, as they were clearly discriminatory: 56% of the city’s Black men between the ages of 20 and 29 appeared on the list generated by the algorithm.
A socio-political instrument to reduce crime levels
Ishanu Chattopadhyay, lead author of the research published in Nature Human Behaviour, sought to reassure the public about his creation. It is not intended for police use; rather, it would be a socio-political tool meant to predict and reduce crime levels in specific geographic areas.
“It simply predicts an event at a specific location. It doesn’t say who will commit the event, or the exact dynamics or mechanics of the events. It cannot be used in the same way as in the movie Minority Report.”
Risks of discrimination or racism?
He also said he had taken precautions to prevent racism or discrimination by the algorithm.
“My teammates and I have said repeatedly that we don’t want this to be used purely as a predictive policing tool. We want policy optimization to be the main use of the algorithm. We want mayors or administrators to use the generated model to run simulations and inform policy.”
In any case, we still know little about how this technology will be applied in the real world. For now, the tool remains in a testing phase.