Nearly two years after transparency legislation was introduced, a hearing on the oversight of the NYPD’s vast arsenal of surveillance equipment will convene today in the City Council’s Committee on Public Safety. City Council member Vanessa Gibson of the Bronx introduced the bill, the Public Oversight of Surveillance Technologies (POST) Act, in February 2018. It would require the police to publicly disclose descriptions and policies for the use of surveillance gear such as automated license plate readers, drones, cell-site simulators capable of tracking and intercepting mobile communications, predictive policing technology, and gunshot detectors.
The NYPD has amassed a sophisticated array of technical capabilities in part because it has chosen to spend millions in post-9/11 Homeland Security grants on surveillance. The NYPD now maintains a networked system of thousands of surveillance cameras, radiation detectors, gunshot sensors, and license plate readers that feed into the Lower Manhattan Security Coordination Center several blocks away from One Police Plaza. Dubbed the Domain Awareness System (DAS) and developed in conjunction with Microsoft, it provides the NYPD with the ability to track New Yorkers through the streets, subway system, public housing, and highways within city limits.
The NYPD has also purchased and used technology such as drones, cell-site simulators, and radiation-detection vans with no public oversight or disclosure, which often remain secret until the department chooses to publicize the technology or information is pried loose by the media or litigation. The department has the ability to unilaterally withhold contracts from the City Comptroller over security concerns, and it routinely denies Freedom of Information Law requests for records about such systems. The NYPD canceled a multimillion-dollar contract with the data-mining firm Palantir Technologies, key details of which only became public in 2017 after BuzzFeed News uncovered the dispute between the private firm and the NYPD.
A previous iteration of the POST Act died in committee in 2017 in the face of opposition from NYPD officials and Mayor Bill de Blasio. A de Blasio staffer told The Appeal that the mayor has not changed his position on the legislation, and Council Speaker Corey Johnson, who is widely expected to run for mayor in 2021, has not taken a public position on the bill despite endorsing the previous version.
During a June 2017 hearing on the earlier iteration of the bill, senior members of the NYPD—including commissioner for legal affairs Larry Byrne; John Miller, the department’s deputy commissioner of intelligence and counterterrorism; and its then-chief of detectives Dermot Shea, who now serves as commissioner—made numerous misrepresentations under oath about the NYPD’s surveillance capabilities, technology acquisition procedures, use policies, and oversight.
The current bill has over half the City Council—30 members—as co-sponsors, and the endorsements of the Black, Latino/a and Asian Caucus, the Brennan Center for Justice, the New York Civil Liberties Union, and the Surveillance Technology Oversight Project. But de Blasio’s opposition and the lack of support from Johnson may prove fatal.
Community groups, public defenders, and representatives of communities directly impacted by NYPD surveillance are expected to speak at today’s hearing. Officials from the NYPD have also been invited to testify.
Other municipalities have taken more aggressive steps to reveal and regulate the use of surveillance technology by law enforcement agencies. Oakland, California; Seattle; San Francisco; San Diego; Santa Clara County, California; Somerville, Massachusetts; and Detroit have all passed laws to tighten the use policies for surveillance cameras, facial recognition software, cell-site simulators, predictive policing technology and gunshot detectors.
Many of these municipalities now have permanent committees to oversee and determine the purchase and use policies for new technology acquisitions by local law enforcement. By comparison, New York City’s POST Act would only require public disclosure of such technologies and use policies. The NYPD would still retain the ability to draft and enforce privacy and use policies for surveillance technology, making the proposed legislation far weaker than bills enacted elsewhere.
As the debate around the POST Act has dragged on, several of the NYPD’s high-tech intelligence and evidence-gathering methods have come under fire. A gang database used to track tens of thousands of overwhelmingly Black and Latinx New Yorkers with alleged or admitted ties to street gangs throughout the city has been the target of a sustained campaign by activists who are pushing for more transparency and regulation of how the department determines who is included on the list. Last Thursday, Legal Aid announced a campaign called #EraseTheDatabase NYC to abolish the gang database. Both city and state legislators have requested a formal audit of the database by the NYPD’s inspector general. Criticism of similar gang databases in California and Cook County, Illinois, has led to legislative and policy reforms.
The NYPD’s collection, retention, and use of other information it has gathered about New Yorkers, such as mugshots optimized for facial recognition and DNA samples, are also being scrutinized. Photographs of children have ended up in the department’s mugshot system, and hundreds of African American men had their DNA samples taken by detectives during the hunt for the suspect in the 2016 rape and murder of Karina Vetrano in Howard Beach.
The department’s DNA database has grown by a third in recent years to include over 80,000 profiles, including at least 30,000 people who have not been convicted of or charged with a crime. Practices like surreptitiously obtaining and storing the DNA of children have prompted state legislators to call for regulation of the NYPD’s collection of genetic material.
Shea defended both practices while he was in charge of the NYPD’s detective bureau.
The department still opposes the legislation. “The bill, as currently proposed would literally require the NYPD to advertise on its website the covert means and equipment used by undercover officers who risk their lives every day. No reasonable citizen of New York City would ever support that,” John Miller, the deputy commissioner of Intelligence and Counterterrorism, wrote in an email to The Appeal. Miller said that the bill, in its current form, would “directly endanger the lives of undercover police officers, cooperating witnesses and our citizens who would be the victims of violent crimes that would not be prevented.”
Earlier this year, key members of the city’s automated decision systems task force resigned in frustration over intransigence and opacity by city officials, a refusal by city agencies to turn over data or provide information about what algorithmic decision systems were in use, and a narrowing of the task force’s goals. Furthermore, the final language of the task force’s authorizing legislation included an exemption that barred the committee from examining the NYPD’s predictive policing system, even though the Brennan Center sued for and obtained documentation about the department’s design and implementation of a crime forecasting technique that has drawn criticism.
In advance of today’s hearing, advocacy organizations are ramping up their community education and outreach efforts surrounding the proliferation of automated decision-making systems and algorithmic tools by city agencies, including the NYPD.
On Dec. 7, over 150 people attended a workshop on algorithmic technologies co-hosted by the NAACP Legal Defense and Educational Fund and 17 other groups and held at the Riverside Church in Manhattan’s Morningside Heights neighborhood. Among the technologies discussed were the NYPD’s predictive policing algorithm, gang databases, and software used to monitor the social media of activists and students. Rashida Richardson, the director of policy research at the AI Now Institute, said these systems are shrouded in secrecy and the public is not given the chance to ask critical questions.
“What is the problem with the data being used, whose worldview is being projected by that data, and how is that affecting all of us?” she said. “We don’t live in an equal world, we don’t live in a society where we’re all treated the same.”