Tom Cruise using a predictive policing system in 'Minority Report'
This is not science fiction. The NYPD and the Miami police department have now contracted with a company called HunchLab to help them institute what they call "predictive policing intelligence." On the HunchLab website, they describe their service like this:
HunchLab is a web-based predictive policing system. Advanced statistical models automatically include concepts such as aoristic temporal analysis, seasonality, risk terrain modeling, near repeats, and collective efficacy to best forecast when and where crimes are likely to emerge. This all lets you focus on one thing: responding.
In other words, using their pre-existing data on arrests and crime, the technology is going to predict new locations for crime so that police can be there to respond before it happens.
I only have one question, and of course it's rhetorical, because we all know the answer: does this system account for widespread racism in policing? If the data the NYPD and the Miami police department give HunchLab to predict future crimes is skewed by wrongful arrests and illegal detentions, and given that racism in policing has never been properly documented on any massive scale, then we can reasonably assume that the predictive policing technology will simply predict more racist police interventions. This is wrong and unethical on a hundred different levels.
How will it account for the reality that an NYPD detective testified under oath that he and others fabricated charges against innocent people to meet quotas? Will it account for the racist reality that in some places a higher share of white people pulled over by police are found with drugs and contraband, yet a higher percentage of African Americans pulled over by those same police end up arrested? If the data the system uses is based on arrests, which it likely is, and not on the actual presence of drugs that would warrant an arrest, we can already determine that this system will do nothing but advance more racist policing.
Will it count arrests like that of Kalief Browder, who spent three years in jail and was then released without ever being charged with a crime? Will it account for incidents like the one in Tulsa, Oklahoma, where a 53-year-old white security guard with illegal marijuana in his bag shot an unarmed young man in his own neighborhood but wasn't arrested for either the shooting or the marijuana?
Will it predict crimes by police, like when Bill "Robocop" Melendez illegally beat a man and planted drugs on him? Will it predict sexual assaults, like when NYPD Sergeant Michael Iscenko tossed his own semen on an administrative assistant in the office? Will it predict moments like when a North Charleston, South Carolina, officer murdered Walter Scott? This is bogus.
Andrew Ferguson, a law professor at the University of the District of Columbia, raised a series of ethical questions about predictive policing systems more than a year ago, as he saw them coming to New York when "stop and frisk" policies were being struck down. Free to create new systems and policies without any true level of public input, police are given a ridiculous amount of freedom to do what's right in their own eyes. We must stand against these systems and advocate for greater public input and oversight in all matters of public safety.