On May 14, San Francisco’s Board of Supervisors voted to outlaw the use of facial recognition technology by city law enforcement. The group made the decision to prevent future abuse of the controversial biometric technology. The Northern California metropolis is now the first major American city to ban police face scanning.
City supervisor Aaron Peskin said it was important for the local government to take the lead in crafting biometric surveillance legislation. “We have an outsize responsibility to regulate the excesses of technology precisely because [tech companies] are headquartered here.”
The City by the Bay’s move to ban artificial intelligence (AI)-enhanced face scanning was not without controversy. The San Francisco Police Officers Association argued that facial recognition could provide leads in ongoing criminal investigations.
What the Ban Does and Does Not Cover
San Francisco’s new prohibition on biometric surveillance is not a blanket measure. It forbids the local police force and other city agencies from using facial recognition in their operations. However, the ordinance contains no provisions covering private use of face-scanning technology. That means Taylor Swift could still bring her facial recognition system to her next Bay Area concert.
The city’s new policy will also have no impact on areas that fall under federal jurisdiction, like San Francisco International Airport. U.S. Customs and Border Protection (CBP) is currently installing face scanners in all of America’s major airports. As the CBP’s mandate allows it to install biometric security at all U.S. entry points, the agency may also install a facial recognition system at the Port of San Francisco.
It’s worth noting that the new ordinance will not affect any ongoing San Francisco Police Department investigations; the S.F.P.D. did not have any biometric surveillance tools in use before the ban was approved.
Facial Recognition Across the Nation
Though the San Francisco government hopes to set a precedent with its ban, the ordinance’s potential impact remains uncertain.
City councils in Somerville, Massachusetts, and Oakland, California, are currently considering local face scanning prohibitions. In April, Sen. Roy Blunt introduced a bill that would outlaw the commercial use of facial recognition tech. However, Blunt’s bill doesn’t address government use of biometric security systems, which has become increasingly common in recent years.
Police in Maryland employed facial recognition tools to track down the Capital Gazette mass shooter. Oregon sheriffs used Amazon’s face-scanning technology to apprehend a petty thief. The Electronic Frontier Foundation reports authorities in Boston, Detroit, Las Vegas, New York City, and San Diego also utilize biometric programs to pursue criminals.
The Perils of Innovation
Privacy advocates have criticized federal and local law enforcement’s use of biometric scanners. The American Civil Liberties Union (ACLU) has argued that face-scanning tech gives the government too much power to monitor innocent citizens. Last month, a group of 26 researchers signed an open letter alleging that Amazon’s Rekognition program has a dangerously high rate of false positives.
Furthermore, the ACLU performed a test in 2018 that found Rekognition falsely identified African-American and female members of Congress as criminals.
The implementation of biometric scanning has also been controversial in Britain. An English privacy group called Big Brother Watch is challenging the use of face scanning by U.K. police. The organization has also publicized documentation showing that one English biometric security program had a 96 percent failure rate.
Western authorities have used biometric security programs only sparingly, but the technology has been widely deployed in Asia. In China, facial recognition security programs have been brought online in several major cities. Beijing is also working to create a national biometric database containing information on 1.3 billion people. However, Chinese biometric surveillance technology has not been without its flaws.
Last year, a facial recognition program used by police in the city of Ningbo made a high-profile mistake: the system incorrectly identified appliance company CEO Dong Mingzhu as a jaywalker. Local police acknowledged the error and announced that the city’s deep learning surveillance program would be overhauled.
Although there is little consensus between face scanning’s opponents and proponents, one thing is clear: if police departments and federal law enforcement agencies are going to use the technology, regulations should govern its deployment. Nothing good comes of implementing new technological innovations without accountability.