Get Informed

Subscribe to our newsletters for regular updates, analysis and context straight to your email.


LAPD retreats from ‘predictive policing,’ for now


What you’ll read today

  • Spotlight: LAPD retreats from ‘predictive policing,’ for now

  • ‘They sent him to his cell to die’

  • ‘I can’t afford it and I never will be able to’

  • New York has become the first major U.S. city to provide free phone calls from jails

  • No reduction in court appearances after Philadelphia DA stops seeking cash bail for many offenses

  • Louisiana bill would make death penalty drugmakers secret

In the Spotlight

LAPD retreats from ‘predictive policing,’ for now

A recent episode of the philosophy podcast “Hi-Phi Nation” began at a weekly meeting held by the Los Angeles Police Department’s commissioners, a board of five civilian volunteers appointed by the mayor. That day, they were voting on whether to approve a charitable donation to reconfigure a conference room into a Community Safety Operations Center (CSOC). It sounded innocuous. It was not. Protesters shouted “shame!” until they were ejected. Barry Lam, an associate professor of philosophy at Vassar College and the host of the podcast, said the funding in question was controversial because it represented “a small but symbolic step in LA’s ongoing move toward predictive policing technologies.” Police say that big data is a chance to replace the prejudice of human judgment with impartial data and algorithms. For opponents, algorithmic objectivity is a cover for an “efficiency tool to target, incarcerate, and control racial minorities in a rapidly gentrifying city.” [Barry Lam / Hi-Phi Nation]

“For years, critics have lambasted data-driven programs—which use search tools and point scores—saying statistics tilt toward racial bias and result in heavier policing of black and Latino communities,” writes Mark Puente for the Los Angeles Times. A recent study examined such data programs in Chicago, New Orleans, and Maricopa County, Arizona, concluding that “dirty data” led to biased policing and unlawful predictions. [Mark Puente / Los Angeles Times]

Sarah Brayne, an assistant professor of sociology, embedded herself for years with the LAPD, studying how these new technologies are changing the relationship between the police and the community. One part of the system quantifies civilians according to risk, premised on the idea that a small percentage of high-impact people are disproportionately responsible for most crime. But it tracks a lot more than that. Police use index cards to write down information that comes up in interviews with civilians, not just criminal activity. [Sarah Brayne / American Sociological Review]

The cards list people who happen to be in the car with someone during a stop, someone across the street, a neighbor who walked by. This information is entered into the system daily, and officers can then run the names through Palantir software, which can then give a social network map for an individual, “who in the past they’ve been seen with, cars they’ve driven, where in the neighborhood they’ve been stopped at and so forth,” Lam explains.

In order to determine risk, a system assigns points based on certain factors: five points for someone who was arrested with a handgun, for example, or has a conviction for a violent crime, and one point for every consensual police interview. But this means that if officers stop you and ask for information, and you willingly give it to them, and this happens five times, you will not only be in the system, you will have the same score as a person caught with a gun. “You can see how that might turn into somewhat of a self-fulfilling prophecy [or] a feedback loop,” Brayne said. “Where if you’re going out and specifically seeking out the people with high points values, and then you go and stop those people, and then that increases their points value.” [Barry Lam / Hi-Phi Nation]
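The scoring scheme described above can be sketched in a few lines. This is a minimal illustration using only the point values mentioned in the reporting; the event names, data structure, and function are hypothetical, not the LAPD’s actual system.

```python
# Minimal sketch of the point-based risk scoring described above.
# Point values come from the article; the event names and structure
# are hypothetical, not the LAPD's actual implementation.

POINTS = {
    "gun_arrest": 5,            # arrested while carrying a handgun
    "violent_conviction": 5,    # conviction for a violent crime
    "consensual_interview": 1,  # each consensual police interview
}

def risk_score(events):
    """Sum the points for every recorded event on a person's record."""
    return sum(POINTS[event] for event in events)

# A person arrested with a gun scores 5 points...
armed = risk_score(["gun_arrest"])

# ...and so does a person who was merely stopped and interviewed
# five times, which is the equivalence the article highlights.
interviewed = risk_score(["consensual_interview"] * 5)
```

Because higher scores attract more stops, and each stop can add another point, the feedback loop Brayne describes is built into the arithmetic itself.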

“Algorithms like these are often deployed in secret, making it impossible for the public to scrutinize them,” Ben Green, a Ph.D. candidate in applied mathematics, wrote for the Boston Globe recently. “Police in New Orleans quietly used predictive policing algorithms for several years without ever announcing they were doing so. Even members of the City Council were left in the dark.” Chicago’s police department has resisted repeated calls to disclose how its algorithm tries to predict gun violence. “By providing the appearance of a value-neutral solution to policing issues without addressing the underlying problems, these algorithms grease the wheels of an already discriminatory system,” Green writes. “They may make policing more efficient for officers, but they don’t evaluate whether the current system actually helps address social disorder.” Studies by the RAND Corporation “found no statistical evidence that these programs actually reduce crime.” [Ben Green / Boston Globe]

In Los Angeles, big data has backed down. Police Chief Michel Moore announced last month that he plans to scrap the program, bowing to criticism from community groups and a 52-page audit by the inspector general, which “found that the department’s data analysis programs lacked oversight and that officers used inconsistent criteria to label people as ‘chronic offenders,’” reports the Los Angeles Times. It found that 44 percent of those labeled chronic offenders had either zero or one arrest for a violent offense. Many had been accused only of nonviolent crimes. The points system and tracking database had been suspended in August, after an uproar among civil liberties groups. [Mark Puente / Los Angeles Times]

But even if these algorithms were more accurate, they might not be justified. Philosopher Renée Bollinger questions the idea that the likelier something is statistically, the more justification we have for treating it as true. In a recent article, she uses the example of John Hope Franklin, the preeminent Black historian who, on the eve of being presented with the Presidential Medal of Freedom, hosted a celebratory dinner party at an exclusive club. All the other Black men present were uniformed attendants. A woman saw Franklin, mistook him for an attendant, and asked for her coat. Statistically, she was perhaps justified in making that assumption, but she could not be certain, and the cost of her error was very high. [Renée Bollinger / Synthese]

In some cases, Bollinger writes, “the severity of the harm of a single, isolated mistake suffices to explain the wrong involved in unjustified acceptance.” In others, however, the wrong arises from the pattern of repeatedly exposing people to risks of harm over time. Often, those in charge of implementing policies such as big data policing or stop-and-frisk focus only on the potential harm they prevent, and have no conception of the harm they inflict.

Big data policing has a fundamental problem beyond the possibility of mistakes: the premise that some people simply are “criminals” who need to be weeded out. Worrying only about mistaking an innocent person for a person prone to criminality misunderstands how people’s behavior is shaped by their environments. Constantly being stopped by police is part of that environment: it can make people feel alienated, as though society does not want them, and some may become less likely to participate in society in a law-abiding way. Meaningful community investment, by contrast, might encourage law-abiding participation.

Anytime police arrest a person, or even detain her for a while in the street, they are imposing a penalty. They take her freedom, her time, and her dignity. As law professor Adam Kolber reminds us in his article “Punishment and Moral Risk,” if we as a society want to harm people through punishment, we need to find morally permissible ways to do so. For some, that means a high degree of certainty that the person morally “deserves” the punishment, and that the punishment is reasonable. For others, it means certainty that the punishment will make society safer. Big data policing, which ensnares plenty of people with no criminal involvement and has not been shown to increase safety, so far fails on both counts.

Stories From The Appeal

Photo illustration by Elizabeth Brown. Photo Courtesy of the Westchester Government Instagram

‘They Sent Him to His Cell to Die.’ Rashad McNulty entered a guilty plea in a series of federal gang indictments in New York that have been criticized as racist and overly punitive. But before McNulty was even sentenced, he died in jail. Now, his family is seeking justice. [Aaron Morrison]

‘I Can’t Afford It and I Never Will Be Able To.’ Florida is poised to pass a law that imposes a ‘poll tax’ on thousands of formerly incarcerated people. [Kira Lerner]

Stories From Around the Country

New York has become the first major U.S. city to provide free phone calls from jails: Mayor Bill de Blasio announced that the city has now fully implemented the law passed by the City Council last August. “With free phone calls, we’re … ensuring that people in custody have the opportunity to remain connected to their lawyers, families and support networks that are so crucial to re-entry into one’s community,” he said. Previously, people were charged 50 cents for the first minute of a phone call and 5 cents per additional minute. Now, people can make free 21-minute calls every three hours to anywhere in the country. In Connecticut, lawmakers are considering a bill that would provide free calling for people in the state’s prison system, which would make it the first state to do so. [Associated Press]
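For a sense of scale, the old per-minute pricing can be computed directly. The helper below is hypothetical; only the rates come from the article.

```python
def old_call_cost(minutes):
    """Cost in dollars under the old pricing: 50 cents for the
    first minute, 5 cents for each additional minute."""
    if minutes <= 0:
        return 0.0
    return 0.50 + 0.05 * (minutes - 1)

# A full 21-minute call (the length now provided free) used to cost:
cost = old_call_cost(21)  # $1.50
```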

No reduction in court appearances after Philadelphia DA stops seeking cash bail for many offenses: Megan Stevenson, a law professor and economist from George Mason University, has studied cash bail in Philadelphia and found, in addition to a large racial disparity, that “pretrial detention leads to longer sentences, increased probability of pleading guilty, and higher court fees imposed on defendants,” as the Philadelphia Inquirer’s editorial board wrote. But in evaluating the impact of District Attorney Larry Krasner’s bail reforms, she found “fewer innocent defendants held, no reduction in court appearances, and fewer opportunities for class and racial disparities—that’s the outcome of not seeking bail.” [Editorial Board / Philadelphia Inquirer]

Louisiana bill would make death penalty drugmakers secret: “Death penalty supporters in the Louisiana Legislature are trying to shroud the source of the state’s execution drugs in secrecy, a move intended to make it easier” to put people to death, writes Bryn Stole for The Advocate. Prison officials have struggled for years to obtain the drugs necessary for executions after the pharmaceutical companies that manufacture them started refusing to sell to prisons that executed people. Several other states have enacted similar laws. Some states, including Texas, have “turned to compounding pharmacies—specialty shops that mix their own medications from raw materials—to obtain the cocktail of drugs necessary for executions.” [Bryn Stole / The Advocate]

Thanks for reading. We’ll see you tomorrow.

Have a tip for The Appeal? Write to us at tips@theappeal.org. A good tip is a clear description of newsworthy information that is supported by documented evidence.
