From Slate:

Police departments have long been in the data game, with such efforts as CompStat. But there's a new twist: They're not just using statistics to assess the past. Now they're trying to predict the future. In November 2009, the National Institute of Justice held a symposium on "predictive policing," to figure out the best ways to use statistical data to predict micro-trends in crime. The Los Angeles Police Department then won a $3 million grant from the Justice Department to finance a trial run in predictive methodology. (The grant, like the rest of the 2011 federal budget, is pending congressional approval.) Other police departments are giving predictive policing a shot, too, from Santa Cruz, which recruited a Santa Clara University professor to help rejigger its patrol patterns, to Chicago, which has created a new "criminal forecasting unit" to predict crime before it happens….

Predictive policing is based on the idea that some crime is random—but a lot isn't. For example, home burglaries are relatively predictable. When a house gets robbed, the likelihood of that house or houses near it getting robbed again spikes in the following days. Most people expect the exact opposite, figuring that if lightning strikes once, it won't strike again. "This type of lightning does strike more than once," says [UCLA anthropology professor Jeffrey] Brantingham. Other crimes, like murder or rape, are harder to predict. They're more rare, for one thing, and the crime scene isn't always stationary, like a house. But they do tend to follow the same general pattern. If one gang member shoots another, for example, the likelihood of reprisal goes up….

Data-driven law enforcement shows that the criminal mind is not the dark, complex, and ultimately unknowable thing of Hollywood films. Instead, it's depressingly typical—driven by supply, demand, cost, and opportunity. "We have this perception that criminals are a breed apart, psychologically and behaviorally," says Brantingham. "That's not the case."
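
The near-repeat pattern Brantingham describes, where a burglary temporarily raises the risk for the same house and its neighbors, is easy to turn into a toy risk score. The article doesn't say which model the LAPD or Santa Cruz trials actually use, so the Python sketch below is only an illustration: each recent nearby burglary adds a boost that decays with elapsed time and distance, and the decay scales are invented for the example.

```python
"""Illustrative sketch of the 'near-repeat' burglary idea.

Not the model used in the actual trials; just a toy risk score in which
every past burglary contributes a boost to nearby locations that fades
with distance and with time. The decay constants are made-up values.
"""

import math

DECAY_DAYS = 7.0      # assumed time scale: elevated risk fades over roughly a week
DECAY_METERS = 200.0  # assumed spatial scale: nearby houses share the elevated risk


def near_repeat_risk(location, now, past_burglaries):
    """Score a location's burglary risk from recent, nearby incidents.

    location: (x, y) in meters; now: current time in days;
    past_burglaries: list of ((x, y), t) events with t <= now.
    """
    risk = 0.0
    for (bx, by), t in past_burglaries:
        elapsed = now - t
        dist = math.hypot(location[0] - bx, location[1] - by)
        # Each incident contributes a boost that decays exponentially
        # in both elapsed time and distance.
        risk += math.exp(-elapsed / DECAY_DAYS) * math.exp(-dist / DECAY_METERS)
    return risk


# Toy example: a house 50 m from a burglary two days ago scores far higher
# than one near a burglary that happened three weeks ago and farther away.
events = [((0.0, 0.0), 0.0), ((500.0, 0.0), -19.0)]
print(near_repeat_risk((50.0, 0.0), 2.0, events))    # relatively high
print(near_repeat_risk((500.0, 100.0), 2.0, events)) # much lower
```

Patrol allocation in this toy setup would just mean ranking grid cells by this score each day and sending extra patrols to the top few, which is the "rejiggered patrol patterns" idea in spirit, though the real systems are certainly more sophisticated.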