‘This Doesn’t Look Good’: Records Show LBPD Failed to Properly Document Over 1,000 Facial Recognition Searches
LA County Sheriff Alex Villanueva and candidate Robert Luna oversaw improper searches.
Newly obtained records show that in recent years the Long Beach Police Department has run over a thousand facial recognition searches without indicating any suspected criminal activity, flouting established transparency rules critical to the internal auditing and oversight of law enforcement’s use of this highly invasive technology.
Instead of crimes, most of these searches simply referenced the 2020 George Floyd demonstration — a constitutionally protected activity.
The “detailed search audit” of the facial recognition queries was reviewed by FORTHE after being turned over by the Los Angeles County Sheriff’s Department (LASD) in response to a 16-month-old public records request submitted by CheckLBPD.org, a local police watchdog organization.
The records cover the period between 2009 and 2020 and show the information the LBPD disclosed about each of the more than 4,000 facial recognition searches it ran through a vast mugshot database maintained by the LASD. The database, known as the Los Angeles County Regional Identification System (LACRIS), can be accessed by 64 law enforcement agencies within the county and contains millions of booking photos dating back to the mid-1990s, as well as other biometric data. Once an image of a person’s face ends up in the LACRIS database, there is no way to remove it, even if that person is never charged with a crime.
Each time a law enforcement agency conducts a facial recognition search through LACRIS, they are required to fill out a request form, which must list, among other information, a criminal case number associated with each query, according to the sheriff’s LACRIS use policy. This request form must then be approved by LACRIS personnel.
Yet the logs show that between June and November 2020, there were 711 instances in which the LBPD conducted facial recognition searches through LACRIS and listed only “PDPROTEST” — or some variation of this entry — as a reason for the search without disclosing a criminal case number.
These searches came in the months following the anti-police-brutality uprising that saw an estimated 5,000 demonstrators take to the streets of downtown Long Beach on May 31, 2020. A comparatively smaller group of people also vandalized and looted businesses.
FORTHE has previously reported that there was a major uptick in the LBPD’s use of facial recognition technology in the months following the protest.
According to the LBPD, the “overwhelming number of concurrent criminal events involving hundreds of suspects” that day led to the creation of a Looting Task Force. To follow up on leads, investigators from the task force ran facial recognition searches on images taken from security footage that captured suspects who “did not have distinct case numbers.”
Instead, police said, “a broad label was utilized in these searches.”
However, legal experts say that because these searches were tagged with such a generic label and are not associated with a criminal case, it is impossible to independently determine whether the LBPD stayed within constitutional bounds when running images of faces through LACRIS.
“If they were monitoring people who were going to these protests under the theory that these people might be suspicious or they might suspect them of a crime simply because they had gone to this protest, that’s pretty illegal,” said Mohammad Tajsar, a senior staff attorney at the American Civil Liberties Union of Southern California. “We definitely need to know more, obviously, but this doesn’t look good for the City of Long Beach.”
Other law enforcement agencies have been caught using facial recognition technology on Black Lives Matter protesters in the past, including in Baltimore and Florida, where police tagged their search requests with “possible protest organizer” and “associates.”
Greg Buhl, the lead researcher at CheckLBPD.org, said the LBPD’s shoddy recordkeeping should make anyone who attended the protest wary about whether they were being surveilled.
“The LBPD hasn’t given protestors any conclusive reason to believe their faces weren’t run through LACRIS, either as departmental practice or on an officer’s individual initiative,” said Buhl.
He noted that the broad label the LBPD chose to input when running these facial recognition searches — “PDPROTEST” — did not refer to a crime but instead to the protest.
“At best the ‘PDPROTEST’ search term shows a preoccupation with being targeted by protests, insufficient oversight at the city and county levels, and a violation of LACRIS policy,” said Buhl. “At worst, it shows violations of the First Amendment by targeting a constitutionally protected protest. With the available data, it is impossible for anyone to say definitively whether this was a legitimate investigation of looting.”
According to the LBPD’s most recent figures, investigations by the Looting Task Force resulted in 78 arrest warrants and an additional 79 cases submitted to the LA County district attorney or city prosecutor.
The audit logs also showed that another 161 LACRIS facial recognition searches run by the LBPD between 2019 and 2020 had “Suspect ID” recorded as the search reason — an even more mystifying entry — and, again, no associated criminal case number.
A Long Beach police spokesperson could not provide an explanation for these entries, saying only that they were under review and that the LBPD will be “implementing a refresher training to ensure that members adhere to Department policy.”
In another 141 LACRIS searches, the criminal case number and search reason were left entirely blank. In total, at least 1,016 searches — not counting missing person, training, and test entries — did not have criminal case numbers or references to a suspected crime attached to them.
In a recent survey of which city departments use facial recognition technology, the LBPD wrote, “LACRIS and Long Beach Police Department policy strictly governs the use of FRT and is limited to investigation of active criminal cases.” Additionally, police said that all department employees must successfully complete LACRIS training before they gain access to the system.
The facial recognition searches in question came prior to a special order issued last year by former LBPD Chief Robert Luna limiting use of the technology to instances when police have a “reasonable suspicion the person has committed a crime” or when someone is unable to identify themselves due to a “physical or mental incapacitation or defect, or death.”
The special order also explicitly prohibits officers from using facial recognition technology in any way that would “violate the First, Fourth, and Fourteenth Amendments of the US Constitution.”
Tajsar said that the lapse in proper documentation by the LBPD shows that police should not be entrusted to internally regulate their own use of such powerful surveillance technologies.
“The police departments who empower themselves with this kind of technology should not be trusted to regulate themselves,” he said. “And the only way the public can be assured that police departments are not going to abuse their extraordinary power is to prevent the police departments from having extraordinary power to begin with.”
Lack of Oversight
LACRIS personnel are responsible for reviewing facial recognition search requests submitted by outside law enforcement agencies to ensure they comply with the established use policy. However, the LBPD’s search log sheds light on the lax oversight of this powerful database.
Governing LACRIS is a little-known entity called the Remote Access Network (RAN) Board, which is made up of seven members. It is chaired by a member of the county Board of Supervisors and also includes the sheriff, the county district attorney, the Los Angeles chief of police, a second chief selected by all the other police chiefs within the county, the mayor of Los Angeles, and a member-at-large chosen by the other board members.
LA County Sheriff Alex Villanueva, who sat on the RAN board while LACRIS approved the LBPD’s “PDPROTEST” searches, is running for reelection. Meanwhile, Luna, under whose watch the LBPD submitted those search requests, is one of his opponents and would sit on the RAN board if elected.
“The LBPD falling short of what is required is not surprising,” said Buhl. “But the lack of oversight from LACRIS, an agency created by state law and overseen by a board that includes LA County Supervisor Hilda Solis, LA County District Attorney George Gascón, and LA Mayor Eric Garcetti, truly is.”
At a RAN board meeting last year, a representative from Solis’ office read a statement on behalf of the supervisor promising transparency around LACRIS: “I am committed to ensure that we keep our law enforcement agencies accountable to the public and their operations remain transparent to the public and that extends to the tools we provide to law enforcement. We need to make sure that the tools available are being used not only in a transparent manner, but one of accuracy, to minimize false prosecution and arrests.”
LACRIS, which is staffed by sheriff’s department personnel, did not respond to our questions before press time.
Long Beach Commission Calls for Citywide Moratorium on Facial Recognition Tech
Last month, the Long Beach Technology and Innovation Commission issued a full-throated recommendation to put the brakes on any current or future use of facial recognition technology by the city.
The commission also advocated for the creation of an “independent commission that possesses authority and oversight” over algorithmic and surveillance technology and a “framework for vetting and continuously monitoring all surveillance technologies” that collect personally identifiable data.
Commissioners published a white paper on facial recognition technology in January stating that there are “underlying civil rights, racial equity and justice, and privacy concerns” with the use of the technology.
The LBPD maintains that the technology is an important investigative tool used to generate leads. The police department cited a January 2020 case where an elderly couple was robbed and assaulted, resulting in major injuries. Detectives later uncovered an image of the suspect.
“Through FRT, the still image provided investigative leads that detectives used along with additional corroborating evidence. Collectively, this evidence led to the arrest and conviction of the suspect to a 20-year prison term,” police said.
But despite anecdotal success stories from law enforcement agencies, Tajsar said that there is ample evidence that the algorithms that underlie facial recognition systems and the databases they draw from are inherently racist and classist.
“People who are arrested and who will end up getting convicted of crimes are disproportionately poor and people of color. And they are not a representative sampling of what we might otherwise consider deviant populations,” he said.
A federal study of 189 facial recognition algorithms found that the software was 10 to 100 times more likely to misidentify nonwhite faces than white ones. Additionally, researchers with MIT and Microsoft found that leading facial recognition software had trouble identifying women and people of color, misclassifying Black women nearly 35% of the time. Meanwhile, the study found that white men were nearly always identified correctly.
One of the co-authors of the paper, Joy Buolamwini, said in a TED Talk that these algorithmic biases can lead to law enforcement agencies mislabeling someone as a suspect, especially people of color.
This very scenario played out in Detroit two years ago when a Black man was wrongfully arrested by police and accused of shoplifting based on a facial recognition mismatch, according to a New York Times report. Robert Julian-Borchak Williams spent 30 hours in custody before making bail.
The privacy of the general public is also a major concern. In a letter calling for the complete ban of facial recognition at the federal level, the ACLU wrote, “When combined with existing networks of surveillance cameras dotting our urban and suburban landscapes, FRT algorithms could enable governments to track the public movements, habits, and associations of all people, at all times — merely with the push of a button.”
Notably, the LBPD’s current special order on facial recognition opens the door for just this type of situation: “The Long Beach Police Department may elect to integrate the use of facial recognition technology with its public safety video surveillance.”
Before she was elected to the Long Beach City Council in 2020, Sixth District Councilmember Suely Saro said she believed facial recognition and other surveillance technology should be banned by the city.
“I am deeply concerned about the uses and potential abuses of facial recognition and surveillance technology. Surveillance is antithetical to a free society, and we should be very skeptical of attempts to implement mass surveillance schemes. Our default position on these technologies should be that they should not be used,” Saro said in a radio interview aired on 99.1 KLBP and conducted by FORTHE. Saro is also the chair of the City Council Public Safety Committee.
San Francisco, Oakland, Berkeley, Santa Cruz, Boston, and New Orleans have all banned or severely limited the use of facial recognition technology by law enforcement in the past few years. However, lobbying from surveillance technology companies has pushed some jurisdictions to rethink these bans.
Last week, members of the Technology and Innovation Commission presented their policy recommendations to the city’s Equity and Human Relations Commission. Civil liberties advocates hope the recommendations will ultimately be brought forth by the city council for a vote.
“One thing is clear about this situation,” said Buhl. “It shows the need for a surveillance oversight ordinance in Long Beach and new limits on how our police can use facial recognition databases.”