
New surveillance oversight law keeps communities safe and redefines tech leadership

Technology should work for the public good, not against it.

By Matt Cagle and Brian Hofer

Technology should work for the public good, not against it. Yet San Francisco’s city departments are currently permitted to use invasive, high-tech surveillance systems without consulting residents or establishing basic rules to keep us safe. The harms that technologies like drones, automated license plate readers, and face recognition can inflict are real, and they will fall hardest on our already-marginalized community members.

Next week, San Francisco’s Board of Supervisors will vote on a law, authored by Supervisor Aaron Peskin, that ensures surveillance technology is considered and used responsibly by requiring public debate, clear use policies and a final Board vote. The ordinance also specifically prevents the city from deploying face surveillance technology.

The legislation is supported by the ACLU and a broad coalition representing immigrants, people of color, the homeless, the LGBTQ community, and others who are most subject to abusive surveillance. San Franciscans should ask their supervisors to pass this law.

Opponents of the legislation say that democratic oversight is impractical and would stop residents from sharing information with the city. But this process works – six other Northern California localities have adopted similar laws. And the ordinance explicitly allows city departments to accept and use tips from the community.

San Franciscans have experienced the danger of hastily deployed surveillance firsthand.

For instance, SFPD pulled over Denise Green, a Black woman, when a patrol car’s automated license plate reader mistakenly indicated that her car was stolen. License plate readers are known to have a 10 percent error rate, but there were no policies requiring officers to verify automated readings. The police forced Ms. Green out of her car, to her knees, and held her at gunpoint.

Ms. Green’s story demonstrates that unaccountable surveillance makes us less safe and less free. We know that surveillance technology is used most often against people of color and immigrants, who are, in turn, most in danger of racially biased violence.

This ordinance also recognizes the unprecedented dangers of face surveillance, a new technology that, as a New York Times experiment showed, exploits public camera feeds to secretly track people by scanning their faces against photo databases.

Experts have warned that face recognition is inaccurate for people of color and women. But even if it were completely accurate, the city should still reject it.

Face surveillance is incompatible with a healthy democracy. In China, it’s already being used to profile and control a largely Muslim ethnic minority. In one Chinese city, a once-bustling public square became desolate after this technology was installed.

If unleashed, face surveillance would suppress civic engagement, compound discriminatory policing, and fundamentally change how we exist in public spaces.

A young adult should have confidence that the city isn’t logging their first visit to a gay bar. A Muslim resident should not worry their visit to a mosque will place them on a watchlist. And an immigrant should be able to show their face in public without fear of deportation.

Modern technology gives the government unprecedented surveillance powers. To put things in perspective: in 1973, the SFPD possessed intelligence files on over 100,000 people, including civil rights demonstrators, union members, and anti-war activists. These records took decades to amass.

Today, city police can stockpile information on 100,000 residents in a few hours.

The legislation before the Board brings these systems out of the shadows with a simple process of public accountability that also ensures San Francisco lives up to its sanctuary promise. Indeed, a recent ACLU report found that the Trump administration is trying to use data from local surveillance systems to locate and deport immigrants.

An overwhelming majority of Bay Area voters support laws requiring oversight and transparency of government surveillance and oppose the government’s use of face recognition.

San Francisco sits at the center of innovation; by passing this law, the Board of Supervisors can redefine what tech leadership means.

SPEAK UP: Ask your supervisor to support the “Stop Secret Surveillance” ordinance by emailing Board.of.Supervisors@sfgov.org

Matt Cagle is a Technology and Civil Liberties Attorney at the ACLU of Northern California. Brian Hofer is the Executive Director of Secure Justice.
