An artificial intelligence sifting through crime data can predict the location of crimes in the coming week with up to 90 percent accuracy, but there are concerns about how systems like this could perpetuate bias
June 30, 2022
An artificial intelligence can now predict the location and rate of crime across a city a week in advance with up to 90 percent accuracy. Similar systems have been shown to perpetuate racist bias in policing, and the same could be true in this case, but the researchers who created this AI argue that it can also be used to expose those biases.
Ishanu Chattopadhyay at the University of Chicago and colleagues created an AI model that analyzed historical crime data from Chicago, Illinois, from 2014 to the end of 2016, then predicted crime levels for the weeks following this training period.
The model divided the city into squares about 300 meters wide and predicted the probability of certain crimes occurring in each square a week in advance, with an accuracy of up to 90 percent. It was also trained and tested on data from seven other major US cities, with a similar level of performance.
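The study's model learns temporal patterns of events per tile; as a rough illustration of the grid-based setup described above, here is a minimal sketch that bins point events into ~300-meter tiles and ranks tiles by historical event counts. All function names and the toy data are invented for illustration, not taken from the paper.

```python
# Hypothetical sketch of the grid-and-rank setup; the actual study
# learns event sequences per tile rather than using raw totals.
from collections import Counter

TILE_METERS = 300  # approximate tile width used in the study

def tile_of(x_m, y_m, tile=TILE_METERS):
    """Map a location (in meters on a local grid) to a tile index."""
    return (int(x_m // tile), int(y_m // tile))

def top_tiles(events, k=3, tile=TILE_METERS):
    """Naive baseline: rank tiles by total historical event count.

    events: list of (x_m, y_m, week) tuples.
    """
    totals = Counter()
    for x, y, _week in events:
        totals[tile_of(x, y, tile)] += 1
    return [cell for cell, _ in totals.most_common(k)]

# Toy data: (x meters, y meters, week number)
events = [(50, 50, 0), (80, 70, 1), (950, 40, 0), (60, 90, 2)]
print(top_tiles(events, k=2))  # tile (0, 0) has 3 events -> [(0, 0), (3, 0)]
```

A real system would replace the count-based ranking with a model of each tile's event history, which is what lets it forecast a specific upcoming week rather than just flag chronically high-crime tiles.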
Previous attempts to use AIs to predict crime have been controversial because they can perpetuate racial bias. In recent years, the Chicago Police Department has trialed an algorithm that created a list of people deemed most at risk of being involved in a shooting, either as a victim or as a perpetrator. Details of the algorithm and the list were initially kept secret, but when the list was eventually released, it turned out to include 56 percent of Black men in the city aged between 20 and 29.
Chattopadhyay admits the data used by his model is also biased, but says efforts have been made to reduce the effect of bias and the AI does not identify suspects, only potential crime scenes. “It’s not Minority Report,” he says.
“The resources for law enforcement are not infinite. So you want to make the most of it. It would be great if you knew where murders will take place,” he says.
Chattopadhyay says the AI’s predictions could more safely be used to inform high-level policymakers, rather than being used directly to allocate police resources. He has made the data and algorithm used in the study public so that other researchers can examine the results.
The researchers also used the data to look for areas where human bias influences policing. They analyzed the number of arrests following crimes in Chicago neighborhoods of different socioeconomic levels. The analysis found that crimes committed in wealthier areas resulted in more arrests than those in poorer areas, suggesting bias in the police response.
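The comparison described above amounts to computing an arrest rate per neighborhood group and contrasting them. A minimal sketch, with entirely invented data and field names, might look like this:

```python
# Hypothetical illustration of the bias check: compare the fraction of
# recorded crimes that led to an arrest across neighborhood groups.
# The records below are toy data, not the study's dataset.

def arrest_rate(records):
    """records: list of dicts with an 'arrested' boolean."""
    if not records:
        return 0.0
    return sum(r["arrested"] for r in records) / len(records)

crimes = [
    {"group": "wealthier", "arrested": True},
    {"group": "wealthier", "arrested": True},
    {"group": "wealthier", "arrested": False},
    {"group": "poorer", "arrested": True},
    {"group": "poorer", "arrested": False},
    {"group": "poorer", "arrested": False},
]

for group in ("wealthier", "poorer"):
    subset = [r for r in crimes if r["group"] == group]
    print(group, round(arrest_rate(subset), 2))
```

In this toy example the wealthier group's rate (0.67) exceeds the poorer group's (0.33); the study's finding was a gap of this kind in real Chicago data.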
Lawrence Sherman of the Cambridge Centre for Evidence-Based Policing, UK, says he is concerned about the study's use of both reactive and proactive policing data: crimes that are recorded because people report them, and crimes that are recorded because police go looking for them. The latter type of data is highly susceptible to bias, he says. "It may reflect deliberate discrimination by police in certain areas," he says.
Journal reference: Nature Human Behaviour, DOI: 10.1038/s41562-022-01372-0