Law enforcement officials in London are taking a big step forward with the use of live facial recognition (LFR). On Friday, the city’s Metropolitan Police announced via a press release that the technology will be employed to help track down criminal suspects.
Of course, not everyone is on board with the idea. Privacy remains a major concern in any discussion of facial recognition. Despite the agency’s reassurances that the public’s privacy will be protected, many remain wary.
Pushing the Boundaries
The decision to start using live facial recognition didn’t happen overnight. Rather, the Metropolitan Police have been testing the technology to determine its effectiveness for some time.
The agency plans to deploy a network of cameras in popular locations around the city. They will continuously scan the faces of individuals in the crowds for five- to six-hour intervals.
Nick Ephgrave, the Metropolitan Police assistant commissioner, says, “We are using a tried-and-tested technology, and have taken a considered and transparent approach in order to arrive at this point.”
To that point, it would be interesting to see the data behind these claims. In May 2019, The Burn-In reported on a statement from the Metropolitan Police indicating that facial recognition systems used by law enforcement agencies are inaccurate up to 96 percent of the time.
Whether those errors have since been fixed is unclear. Though the figure hasn’t been confirmed by a third party, the agency says its algorithms now generate only one false alert for every 1,000 cases.
Ephgrave goes on to add, “Every day, our police officers are briefed about suspects they should look out for; LFR improves the effectiveness of this tactic.”
The surveillance system will attempt to match faces on the street to those of individuals on the agency’s special watch lists of suspects “wanted for serious and violent offenses.”
It’s worth noting that a positive facial recognition match won’t be enough to take someone into custody. Rather, if a match is found, officers will approach the individual in person and ask them to confirm their identity. Only if they turn out to be the suspect in question will they be arrested. Still, not relying on facial recognition alone is a good thing.
The agency says in a press release, “This is a system which simply gives police officers a ‘prompt,’ suggesting ‘that person over there may be the person you’re looking for.’”
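The workflow described above amounts to a simple decision flow: unmatched images are deleted, and even a positive match only generates a prompt for an officer, never an automatic arrest. A minimal sketch of that flow, where all names, scores, and the threshold are illustrative assumptions rather than details of the Met’s actual system:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    """One face scanned by a camera. Fields are hypothetical."""
    face_id: str
    watchlist_match: Optional[str]  # name on the watch list, or None
    confidence: float               # similarity score from the matcher

def handle_detection(d: Detection, threshold: float = 0.95) -> str:
    # No match, or a weak one: the image is deleted, per the agency's
    # stated policy for people not matched to a suspect.
    if d.watchlist_match is None or d.confidence < threshold:
        return "delete image"
    # A strong match only prompts an officer to confirm identity in
    # person; the system itself never authorizes an arrest.
    return (f"prompt officer: '{d.watchlist_match}' may be nearby; "
            "confirm identity in person")

print(handle_detection(Detection("f1", None, 0.0)))
print(handle_detection(Detection("f2", "J. Doe", 0.97)))
```

The key design point, as the agency frames it, is that the algorithm’s output is advisory: a human officer sits between the match and any enforcement action.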
In the meantime, the public is on edge about the new development. After all, having surveillance cameras armed with facial recognition technology in use in public spaces is somewhat unnerving. Even for those with nothing to hide, the concept feels a bit dystopian.
To counter this, the Metropolitan Police say that photos of people who aren’t matched to a suspect will be deleted. Nonetheless, people are rallying against the new form of crime detection. Silkie Carlo, director of Big Brother Watch, says, “This decision represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the U.K.”
Yet, if the program proves effective at tracking down criminals, it likely won’t take long for other law enforcement agencies, including those in the U.S., to follow suit.