CHIRAQ

I went back to my hometown of Chicago for 24 hours earlier this week. I didn’t bring a gun because I don’t need to bring a gun. You can find ’em in any dumpster or playground around the city.

For the past few years, Chicago has been a mess. The cops have been using a software program that is allegedly not racist — although seems pretty racist to me — to predict new patterns of gun violence and warn previously convicted felons not to get any ideas.

The program is called “The Heat List.”

(Clearly it’s working well because Chicago is doing pretty well. Gun violence is down. Recidivism is at an all-time low. Oh, wait.)

The program is fairly small, and it’s one of the many methods that the Chicago Police Department is using to combat crime. Instead of coming down harder on drug cartels and being politically courageous in a tough climate, it’s easier to ask a computer to do what cops, judges, mayors and parents should be doing for their communities.

Cowards. All of them.

Speaking of cowardice, a demoted executive in Chicago shot his CEO and then committed suicide. Garry F. McCarthy, Chicago’s police superintendent and the guy in charge, said, “This is basically a personal thing.”

It’s as if the only crimes that can be prevented — or even warrant any police attention — are crazy gun crimes that happen in Englewood or Austin. Except McCarthy doesn’t seem to give a rip about those, either, and blames multiple other factors (including misreported crime statistics) for what’s happening on the ground.

What a jerk.

You can’t blame the citizens of Chicago for arming themselves and inadvertently creating more gun violence, because it’s not like anyone, from the police department to the mayor’s office, can offer better protection beyond an algorithm.

(An algorithm that’s clearly not working, by the way.)

So you want to talk about 21st century policing and predictive analytics? Well, here is what we’ve learned from human resources: you can lead a horse to water, but you can’t make him drink. Much of the promise of “predictive analytics” is just that — a promise. The results from much of the software on the market are obvious and inane. It is easy to predict that an employee will quit her job because most people quit jobs. It’s harder to understand why someone would quit and take a job for less pay. Software still doesn’t answer tough, nuanced questions.
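To see why “most people quit jobs” makes attrition predictions look smarter than they are, here’s a toy sketch. The data is invented for illustration, and the “model” is just a majority-class baseline, not any vendor’s actual product:

```python
# Toy illustration: a "predictive" attrition model that simply guesses the
# majority class. All data below is made up for demonstration purposes.

# 1 = employee left within five years, 0 = stayed. Because most people
# eventually change jobs, the base rate of leaving is high.
attrition = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]

# The "algorithm": always predict that the employee will quit.
predictions = [1] * len(attrition)

correct = sum(p == a for p, a in zip(predictions, attrition))
accuracy = correct / len(attrition)

print(f"Baseline accuracy: {accuracy:.0%}")  # prints "Baseline accuracy: 80%"
```

A model that never looks at a single feature scores 80% here, which is the kind of “result” that sounds impressive in a sales deck but explains nothing about why any particular person leaves.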

Far too many people believe that someone or something — your dad, the CEO, a computer program — can be an omniscient narrator and explain the unexplainable. Unfortunately, computers aren’t omniscient. No algorithm will solve complex problems related to deviant human behavior.

And no algorithm will save Chiraq.


  1. While the tendency to deflect accountability has been around for a while, software and the analytics promises that come with it seem to make it easier these days.

    Just because you have data doesn’t mean you know what to do with it, and just because you have an algorithm that suggests a certain probability doesn’t guarantee it will happen, nor does it suggest actions to take in the moment.

    It feels like we (human-type people) are doing everything we can to avoid a difficult conversation. Only now, instead of saying, “HR said I had to,” people can say, “The predictive model said I had to.”

    (By the way…thanks for bringing back comments!)
