Critics of facial recognition took a big hit today, as London's Metropolitan Police Service announced it will begin using Live Facial Recognition (LFR) technology on citizens across the massive European city.
As perhaps one of the most controversial technologies in recent memory, debate about the use of facial recognition to police citizens has been tumultuous to say the least. From citywide condemnations to proposed EU bans, lawmakers are torn between the functionality of new tech and a commitment to privacy.
Now, with the Met announcing its decision to fully deploy LFR throughout the city, it's safe to say the mainstream age of facial recognition technology is upon us.
Met Police Announcement
In a statement released today, London's Metropolitan Police announced it will “begin the operational use of Live Facial Recognition technology,” specifically to help “tackle serious crime, including serious violence, gun and knife crime, and child sexual exploitation.” And of course, they had to mention how safe they're going to keep everyone and how much everyone supports this decision.
“This is an important development for the Met and one which is vital in assisting us in bearing down on violence,” said Nick Ephgrave, Assistant Commissioner at the Met, in the statement. “As a modern police force, I believe that we have a duty to use new technologies to keep people safe in London. Independent research has shown that the public support us in this regard.”
The Met claims that the LFR system — which will be fully operational, rather than on a trial basis — is 70% effective at identifying subjects, and hardly ever falsely identifies anyone. Honestly, those numbers don't sound great to begin with, and they sound even worse when you consider that the researcher who conducted the only independent review of the system says the figure was a lot closer to 19%.
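How can the same system be 70% accurate by one count and 19% by another? It usually comes down to the denominator. Here's a minimal sketch using hypothetical counts (chosen only so the first figure lands on the reported 19% — these are not the Met's or the reviewer's actual data) showing how the choice of denominator swings the headline number:

```python
# Hypothetical counts, for illustration only — not real deployment data.
total_alerts = 42        # faces the system flagged as watchlist matches
confirmed_matches = 8    # alerts later verified as the right person

# "Accuracy" as the share of ALL alerts that were correct (precision):
precision = confirmed_matches / total_alerts
print(f"precision: {precision:.0%}")  # -> precision: 19%

# A force could instead count only the alerts officers chose to act on,
# quietly dropping ones dismissed at a glance (hypothetical number):
engaged_alerts = 10      # alerts officers actually followed up
engagement_rate = confirmed_matches / engaged_alerts
print(f"engagement accuracy: {engagement_rate:.0%}")  # -> engagement accuracy: 80%
```

Same eight correct identifications, wildly different percentages — which is why independent reviewers and police forces can both "stand by" contradictory figures.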
“I stand by our findings,” Peter Fussey, a surveillance expert at Oxford University, told the Guardian. “I don't know how they get to 70%.”
While London is certainly one of the larger cities to implement this kind of technology in its police system, it is not the first. South Wales has been using it for less than a year, after a number of court cases opened the doors for its use. Even so, that legal green light didn't stop people from responding to the news with a lot of reasonable negativity.
The Swift and Brutal Response
As you'd expect, the response to the Met's decision was as passionate as it was pointed. Critics called the use of facial recognition by the London police service everything from “a huge threat to human rights” to “a dangerous, oppressive and completely unjustified move.” And they certainly didn't stop at name-calling.
“This technology puts many human rights at risk, including the rights to privacy, non-discrimination, freedom of expression, association and peaceful assembly,” said Allan Hogarth, from Amnesty International UK. “This is no time to experiment with this powerful technology that is being used without adequate transparency, oversight and accountability.”
Despite the notable backlash, the Met is moving forward with its decision. And while the citywide support it claims to have might be suspect, having the mayor on your side certainly doesn't hurt your cause. That is, as long as the technology is deployed correctly and morally.
“New technology has a role in keeping Londoners safe, but it’s equally important that the Met are proportionate in the way it is deployed and are transparent about where and when it is used in order to retain the trust of all Londoners,” said Sadiq Khan, the Mayor of London. “City Hall and the Ethics Panel will continue to monitor the use of facial recognition technology as part of their role in holding the Met to account.”
Criticisms of the accuracy and morality of using facial recognition technology to monitor citizens are more than justified. The reality, though, is that if you've got the mayor on your side, all the backlash in the world isn't going to stop you — even when that backlash is exactly what you're facing.
Global Facial Recognition Backlash
London might be on board with facial recognition technology, but it doesn't have a lot of company. While the city has been more prone to surveillance than others in recent years, the global backlash against the controversial tech has been substantial, and it has taken a lot of forms.
San Francisco has issued a citywide ban on the technology. Amazon shareholders have pushed to stop the sale of the company's Rekognition technology to law enforcement agencies. Even the EU is considering a temporary ban on all facial recognition software while regulation catches up to the fast-evolving tech — a move which, in light of this announcement, could make some serious waves given Brexit.
Even if support for facial recognition technology in law enforcement is high — which, in the US, it surprisingly is — it's hard to argue the tech is accurate enough to be used outside of a trial basis. Studies have shown it to be racist, sexist, and generally wrong to an embarrassing degree for anything being used to decide who goes to jail and who doesn't. And if you disagree, just wait until you're one of the people being falsely identified.