After A Long Fight, Facial Recognition Technology Is In One New York School District

A small school district spent $1.4 million to equip surveillance cameras with the technology.

In 2015, a security consultant approached the Lockport City School District in western New York, offering a free threat assessment. The assessment was free, but the recommended course of action was far from it: the purchase and installation of a camera system equipped with facial recognition technology, at a cost of $1.4 million. The district followed the recommendation, and in 2017, New York State Education Department authorities approved the project.

The decision was met with local skepticism and resistance, drew criticism from the New York Civil Liberties Union, and was eventually suspended as state authorities reviewed the implications of the project. Last month, after changes to address the state’s concerns were approved by the school board, the technology was turned on, making the Lockport school district the first in the state to use facial recognition software. Other school districts have also expressed interest in doing so.

In June, when the outcome was still up in the air, Jim Schultz, a local parent, columnist for the Lockport Union-Sun and Journal, and one of the earliest and most vocal critics of the plan, described his concerns in an opinion article in the New York Times. He noted that the school district planned to use money from a $4 million state grant for technology upgrades. He wrote: “While high-technology security is among the allowed expenditures under the Smart Schools Bond Act, it’s doubtful that facial-recognition technology is what voters had in mind. Neighboring districts invested their money in iPads and faster internet, while we bought spy cameras.”

Schultz highlighted what opponents of facial recognition around the country and the world have rallied against—the potential for error, the amplification of racial bias, and the privacy implications of a surveillance system that stores data about individuals’ movements and activities over a period of time. These concerns take on heightened importance in a school setting, given that students of color are disproportionately pushed out of school and into the criminal legal system. (Nearly 30 percent of students in the Lockport City School District are students of color, according to information from the state Department of Education website.)

In 2018, John Curr of the Buffalo chapter of the NYCLU criticized the proposal in an interview with the Buffalo News. “Tracking every move of students and teachers is not the best way to make them feel safe at school and can expose them to new risks, especially for students of color who are already over-policed in the classroom.” He continued: “Facial recognition software can be highly inaccurate, especially when it comes to identifying young people and people of color. This plan sets a dangerous precedent for constant surveillance of young people and risks exposing data collected about students and educators to misuse by outsiders or law enforcement.”

Schultz looked at how the use of the technology, even if narrowly tailored at first, could slowly expand. “The technology’s potential is chilling,” he wrote. “When Mr. Olivo [the consultant] was pitching the system, he explained that it would have the capacity to go back and create a map of the movements and associations of any student or teacher the district might choose. It can tell them who has been seen with whom, where and how often.”

“Even though district officials promised to never use the software in that way,” Schultz continued, “if we have learned anything from the privacy breaches at Facebook and elsewhere, what matters is not what those in charge promise but what an intrusive technology has the capacity to do.”

Around the world, the contest between privacy and racial justice advocates on one hand and proponents of facial recognition technology as a law enforcement tool on the other has yielded mixed results. The cities of San Francisco, Somerville, Massachusetts, and Oakland, California, have all banned law enforcement use of the technology, citing civil liberties and racial justice concerns.

In London, the Metropolitan Police announced last month that cameras equipped with facial recognition technology will be installed across the city, in popular shopping and tourist locations, reported The Verge. The stated purpose is to “scan for faces contained in ‘bespoke’ watch lists, which the Met says will predominantly contain individuals ‘wanted for serious and violent offenses.’” Previous use of facial recognition technology by United Kingdom police had been limited to trials, including at concerts and football matches.

In China, the government has used the technology to track members of the Uighur minority. The New York Times reported in April that “it is the first known example of a government intentionally using artificial intelligence for racial profiling,” according to experts. More than a million ethnic Uighurs and other Turkic minorities have been placed in what the government calls re-education camps, where they have been subjected to a program of forced labor and brainwashing.

In the United States, Wired reported last year that at least eight other school districts in the country have deployed the technology. Hundreds of law enforcement departments across the country now use facial recognition software, as do airports and even entertainment venues.

After the technology was activated in the Lockport schools’ surveillance system last month, Schultz said the district “turned our kids into lab rats in a high-tech experiment in privacy invasion.”

In his article last year, he described Lockport as “a beautiful small town that sits astride the Erie Canal just a short drive from Niagara Falls. It is a place where the usual debates are about things like where to shoot off our Fourth of July fireworks, not about artificial intelligence aimed at students.” His daughter told him, “It’s creepy that these cameras can watch you and can figure out who you are. We don’t even know who is watching us.” He remarked, “Being spied on like dissidents is not part of the high school experience that any of us would want for our children. Not here, not anywhere.”