Most Americans have probably never heard of “predictive policing,” even if their lives have already been deeply affected by it. And one of the nation’s largest police forces, the Chicago Police Department, thanks to federal funding, is now helping to drive policing into territory previously dreamed of only in science fiction: the ability to essentially predict who will be the next perpetrator, or the next victim, of a crime.

To minority groups, the mere thought of such a police tactic immediately raises all kinds of questions about civil rights and racial profiling.

At the heart of this trend in how police departments operate are complicated computer algorithms that run on highly detailed crime data. In the last couple of years, police departments around the country, from Los Angeles to Memphis, Tenn., to Charleston, S.C., have started forecasting future crimes, many of them thanks to federal funding. In the case of L.A., the algorithm being used once helped predict earthquake aftershocks.

About half of the LAPD's divisions, or precincts, are crunching the numbers, but no one there is quite as successful at making it work as Capt. Sean Malinowski, who oversees the "Foothill" division in the San Fernando Valley.

“I’m leading L.A. with a 29 percent reduction year to date in crime,” Malinowski told Fox News Latino.

During roll call, police officers are given forecast maps that highlight the 500-by-500-foot areas – about one city block – where auto and property thefts are most likely to take place, down to the hour.

Malinowski worked with a team of academics at UCLA to develop an algorithm that he has fed with seven to 10 years of local crime data. It works because, like most people, criminals are creatures of habit, and the algorithm helps find their patterns, he said.
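The article doesn't spell out the math behind the UCLA model, which grew out of earthquake-aftershock forecasting. As a rough illustration of the underlying idea only — scoring a grid of 500-by-500-foot cells by recency-weighted historical incident counts — here is a minimal, entirely hypothetical Python sketch; every name and number in it is invented for illustration:

```python
from collections import Counter

CELL_FT = 500  # forecast grid resolution: 500-by-500-foot cells, about a block

def cell_of(x_ft, y_ft):
    """Map an incident's coordinates (in feet) to its grid cell."""
    return (x_ft // CELL_FT, y_ft // CELL_FT)

def hotspot_scores(incidents, decay=0.9):
    """Score each cell by recency-weighted incident counts.

    `incidents` is a list of (x_ft, y_ft, days_ago) tuples drawn from
    historical crime reports (not arrests); recent events count more.
    """
    scores = Counter()
    for x, y, days_ago in incidents:
        scores[cell_of(x, y)] += decay ** days_ago
    return scores

# Toy history: three thefts clustered in one block, one elsewhere.
history = [(120, 450, 1), (300, 90, 2), (480, 250, 10), (2600, 2600, 1)]
ranked = hotspot_scores(history).most_common()
print(ranked[0][0])  # the cell covering the cluster: (0, 0)
```

The decay factor captures the "creatures of habit" intuition Malinowski describes: a cell stays "hot" for a while after incidents occur there, then cools off as its history ages.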

Malinowski then sends cops to spend extra time in those high-risk zones, where they are likelier to either catch a crime in progress or deter it from occurring.

“When the officers don’t get the maps, they are asking for them,” he said.

Malinowski pointed out that he does not use arrest data when formulating his algorithm, only what crime occurred and where.

“I think you focus on the place,” he explained. “There is a fine line when you start forecasting about individuals. Chicago may have figured it out.”

Many people do seem to believe that the Chicago Police Department has figured out a way to apply predictive policing to the question of not just the place where crimes will be committed, but the people who will be breaking the law.

But others are wary.

“It’s one thing to predict where a crime is likely to happen,” Andrew Guthrie Ferguson, an assistant law professor at the University of the District of Columbia, said, “but to predict who? That’s really your ‘Minority Report’ world.”

The “Heat List”

In “Minority Report,” the futuristic 2002 Steven Spielberg/Tom Cruise thriller, a special police unit found a way to harness the power of three clairvoyants in order to arrest people for crimes they had yet to commit.

Instead of “precogs,” the Chicago Police Department, or CPD, has Andrew Papachristos, the Yale researcher whose studies of the Garfield Park and Lawndale neighborhoods found homicide rates three times the Chicago average. What’s more, he found that perpetrators and victims shared some behavioral traits — they had been victims of previous shootings, had long arrest records and associated with others who shared those characteristics.

“It’s just like sharing needles,” Papachristos told the Chicago Tribune last year. “It puts you at risk because of the behaviors of your friends and associates.”

Using an experimental algorithm developed by Papachristos, the CPD has developed what it calls the “heat list” — roughly 400 people in the city deemed most likely to be involved in violent crime, either as victim or offender.

One by one, the people on this list are paid a house visit by CPD officers, who let them know that their lives are at risk — and that the cops are keeping an eye on them.

But the program is much more than a local phenomenon. It’s funded by a federal grant from the National Institute of Justice, and its ambitions are universal. As Commander Jonathan Lewin, who is in charge of information technology for the CPD, told the technology website The Verge, the program “will become a national best practice. This will inform police departments around the country and around the world on how best to utilize predictive policing to solve problems. This is about saving lives.”

A Self-Fulfilling Prophecy

Civil rights and privacy advocates aren’t so sure. They’re concerned that predictive policing will result in the loss of Fourth Amendment rights — specifically, the prohibition against unreasonable searches and seizures.

“If a predictive tip says a particular area is going to be the place of a burglary,” said Ferguson, who has written extensively in law reviews about the topic, “and you show up and see someone holding a bag outside of a house — normally that’s not enough to stop someone for a burglary. But with a predictive tip maybe it is enough for reasonable suspicion.”

He continued: “You better be sure that algorithm is accurate. You better be sure the data is good, and that it is updated and makes sense. Otherwise, you could be reducing civil liberties and Fourth Amendment protections for certain people on bad information and bad data.”

Probably the most common criticism of predictive policing comes from activists who say the algorithms give police a smokescreen of seemingly objective, neutral results, when in fact police and researchers control which crime or arrest data is fed into the algorithm, and thereby shape those results. The science then becomes a justification for racial and ethnic profiling of minorities, the argument goes: if the crime data is skewed unfairly toward Latino and black populations, the algorithm will support more police scrutiny of their neighborhoods.

“It ends up being a self-fulfilling prophecy,” said Hanni Fakhoury, staff attorney at the Electronic Frontier Foundation, a nonprofit digital civil liberties organization. “The algorithm is telling you exactly what you programmed it to tell you. ‘Young black kids in the south side of Chicago are more likely to commit crimes,’ and the algorithm lets the police launder this belief. It’s not racism, they can say. They are making the decision based on what the algorithm is, even though the algorithm is going to spit back what you put into it. And if the data is biased to begin with and based on human judgment, then the results the algorithm is going to spit out will reflect those biases.”
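Fakhoury's feedback-loop argument can be illustrated with a toy simulation — entirely hypothetical, not any department's actual system. Two areas have identical true offense rates, patrols are allocated in proportion to each area's recorded incidents, and only patrolled offenses get recorded, so a small initial skew compounds:

```python
import random

def simulate(rounds=50, true_rate=0.3, seed=1):
    """Two areas with identical true offense rates; patrols follow the data.

    Each round, 10 patrol units are split in proportion to each area's
    recorded incident count, and an offense only enters the dataset if a
    patrol is present to observe it.
    """
    random.seed(seed)
    recorded = {"A": 5, "B": 1}  # initial skew, e.g. a historically over-policed area
    for _ in range(rounds):
        total = recorded["A"] + recorded["B"]
        for area in recorded:
            patrols = round(10 * recorded[area] / total)
            # offenses occur at the same true rate everywhere,
            # but only patrolled offenses are recorded
            for _ in range(patrols):
                if random.random() < true_rate:
                    recorded[area] += 1
    return recorded

counts = simulate()
print(counts)  # area A's recorded count dwarfs area B's, despite equal true rates
```

The divergence is driven entirely by the patrol allocation, not by any difference in underlying behavior — which is the "laundering" Fakhoury describes.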

Critics of predictive policing point to marijuana arrests as an example of biased crime data. White Americans smoke marijuana at about the same rate as blacks and Latinos, yet whites are prosecuted and convicted for marijuana crimes at disproportionately lower rates. If that arrest data is fed into computer algorithms to help predict where crime will happen in the future, the result will be more targeted police patrols in minority neighborhoods.

Malinowski of LAPD — whose division includes 250,000 residents, most of whom are Latino — observed that predictive policing hasn’t replaced good, old-fashioned policing. But he added that Fourth Amendment infringement is a “legitimate concern.”

The New "Stop and Frisk"?

Biased data is just what Foster Maer, senior litigation counsel at Latino Justice PRLDEF, an advocacy group in New York City, is afraid of. Maer is part of the team that has been fighting against New York City’s controversial "Stop and Frisk" program, which, last fall, a federal judge found unconstitutionally targeted black and Latino men.

At the height of the policy in 2011, New Yorkers were stopped by police more than 685,000 times; 87 percent of those stopped were black or Latino.

“Stop and Frisk was targeted and that has produced racially biased conviction rates and arrests,” Maer said. “Race is not a direct part of the algorithm, but because the data is racially biased, the names that come out will be racially biased. Predictive Policing will replicate and expand the program at an individual level like we are seeing in Chicago.”

Though New York City has not adopted predictive policing, it may do so following the nomination of Bill Bratton as police commissioner.

Bratton led the LAPD from 2002 to 2009 and is frequently credited with implementing predictive policing there. A former chief of the New York City Transit Police, Bratton is also credited with bringing CompStat to New York in 1995, a computerized system that maps crimes and crunches weekly data to place cops most effectively where crime is happening. It is generally considered a precursor to predictive policing.

Between 1990 and 2011, the homicide rate in New York City declined by 80 percent, robbery by 83 percent, burglary by 86 percent and car theft by 94 percent, according to the New York Times.

“If I were someone like Bill Bratton,” Ferguson said, “I would have a press conference and say, ‘You know what? Don’t worry about Stop and Frisk. We have a new solution: It’s called predictive policing, and I’ve been touting it since I was in L.A.’”

If you listen to law enforcement voices like that of LAPD Police Chief Charlie Beck, predictive policing is already ushering in the next era of policing.

“It’s here to stay, as long as the data seems to work,” Ferguson explained. “It’s not just the future, it’s the present.”

Bryan Llenas currently serves as a New York-based correspondent for Fox News Channel (FNC) and a reporter for Fox News Latino (FNL). Follow him on Twitter @BryanLlenas
