San Francisco – a tech-forward metropolis that nevertheless finds pervasive face recognition (FR) "psychologically unhealthy" – on Tuesday became the first major American city to ban police use of the technology.
Aaron Peskin, the city supervisor who sponsored the bill, told the New York Times that the Board of Supervisors' 8-to-1 vote sends a powerful message to the nation, coming from a city whose DNA has been rewritten by technology.
Many of these technologies are born here, and their parent companies live here, he said, so it falls in part to San Francisco to rein them in when they run amok:
I think part of San Francisco being the real and perceived headquarters for all things tech also comes with a responsibility for its local legislators. We have an outsize responsibility to regulate the excesses of technology precisely because they are headquartered here.
Peskin pointed out that FR's shortcomings lead to frequent misidentifications. One example: the American Civil Liberties Union (ACLU) tested the facial recognition technology used by police in Orlando, Florida, and found that it falsely matched 28 members of Congress with mugshots.
There are plenty of other examples of this error-prone technology at work. Here's one: after two years of dismal failure rates using it at the Notting Hill Carnival, London's Metropolitan Police finally threw in the towel in 2018. In 2017, the "top-of-the-line" automated facial recognition (AFR) system they had been trialling for two years couldn't even tell the difference between a young woman and a balding man.
The new San Francisco ordinance says the city doesn't think FR is worth it:
The propensity for facial recognition technology to endanger civil rights and civil liberties substantially outweighs its purported benefits, and the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring.
The reference to racial injustice points to several reports, including a cited study from the Center on Privacy and Technology at Georgetown Law, which found that automated face recognition (AFR) disproportionately affects Black people: Black faces are overrepresented in face databases to begin with, and FR algorithms themselves have proven less accurate at identifying Black faces.
In another study, published by the MIT Media Lab earlier this year, researchers confirmed that the popular FR technologies they tested exhibit gender and racial bias.
Besides, ubiquitous surveillance is simply corrosive, Peskin said:
It is psychologically unhealthy when people know they are being watched in every aspect of the public domain. On the street, in parks … that's not the kind of city I want to live in.
The ordinance prohibits the use of FR by police and city agencies, requires city departments to disclose all surveillance technologies they currently use or intend to use, and requires them to submit policies governing those technologies for approval by the Board of Supervisors.
It does not affect the use of face recognition technology by individuals, businesses, or the federal government. This means that the use of FR at San Francisco International Airport and the Port of San Francisco, both under federal control, will not be affected.
The ordinance will not become law until the Board of Supervisors ratifies the vote next week, but the second vote is seen as a formality.
Critics say that an outright ban goes too far and ignores the technology's beneficial uses. NPR quoted Daniel Castro, vice president of the industry-backed Information Technology and Innovation Foundation, who says other American cities should not follow San Francisco's lead:
They're saying, let's basically ban the technology across the board, and that seems extreme, because there are many uses of the technology that are perfectly appropriate.
We want to use the technology to find missing older adults. We want to use it to fight sex trafficking. We want to use it to quickly identify a suspect in the event of a terrorist attack. These are very reasonable uses of the technology, so a wholesale ban is a very extreme response to a technology that many people are only just beginning to understand.
Similar legislation is under consideration in the nearby city of Oakland, and the Massachusetts Senate is considering a bill that would impose a moratorium on FR software in the state until the technology improves.
Despite the error rates, many police forces are still gung-ho about using or expanding the technology. One such force is London's Metropolitan Police: despite the Notting Hill Carnival flop and other high-profile failures, the force says FR helps it catch violent criminals and that the technology keeps improving.