Legislators in California city vote to outlaw surveillance tech
San Francisco is set to ban the use of facial-recognition technology by government entities throughout the city.
The city’s Board of Supervisors this week voted eight to one in favour of the Stop Secret Surveillance Ordinance. Two of the city’s 11 supervisors did not vote.
The proposed legislation effectively bans any local public-sector entity in San Francisco from using facial-recognition technology. It also introduces an approval process for the use of other types of surveillance kit.
The ordinance will pass into law on 1 July – subject to the supervisors voting the same way in a second, confirmatory poll to take place shortly. The legislation is based on the Stop Surveillance Act authored by California senator Jerry Hill.
The San Francisco Board of Supervisors indicated broad support for the act in a resolution published last year.
That resolution said: “Surveillance technologies have been used recently to partner with private security companies to surveil environmental activists, indigenous leaders and community members to control protests of the Dakota Access Pipeline and Keystone XL Pipeline; to allow law-enforcement provocateurs to infiltrate those same protests; to successfully implement no-fly zones to black out media coverage during heightened law-enforcement crackdowns; to profile communities for the purposes of creating false associations and characterisations of peaceful protesters as domestic terrorists; to scrutinise and surveil Black Lives Matter activists and label them ‘Black Identity Extremists’; and to otherwise surveil individuals and groups over extensive periods of time, raising extensive civil liberties concerns.”
In the UK there is currently no dedicated regulation of facial recognition and other biometric technologies – which include fingerprint and voice recognition – despite calls for such legislation from Paul Wiles, the biometrics commissioner.
Responding to a recent written parliamentary question, immigration minister Caroline Nokes said the government is considering how best to regulate the biometrics sector and will assess the available options before proposing any new laws.
“We are currently considering options for review,” she said. “The review will also look at other measures that can be taken to improve governance and use of biometrics in advance of possible legislation.”
Several UK police forces have trialled facial-recognition software at large public events, such as London’s Notting Hill Carnival and the 2017 Champions League football final in Cardiff.
Privacy advocacy groups have lobbied vigorously against the use of the technology, and information commissioner Elizabeth Denham has also expressed strong concerns.
Last year she said: “How facial-recognition technology is used in public spaces can be particularly intrusive. It’s a real step change in the way law-abiding people are monitored as they go about their daily lives. There is a lack of transparency about its use, and there is a real risk that the public-safety benefits derived from the use of FRT will not be gained if public trust is not addressed. A robust response to the many unanswered questions around FRT is vital to gain this trust.”
But police forces have defended deployments to date. Both South Wales Police and London’s Metropolitan Police Service have stressed that all uses of the technology thus far have been solely on a trial basis.
In a statement issued in light of Denham’s comments a year ago, the Met said: “All our deployments during the trial have been and will be overt, with information disseminated to the public, and will be subject to full evaluation at the conclusion of the trial.”