London’s Metropolitan police have recently been trialling facial recognition technology in some of the busiest areas across the UK capital.
The trial took place on December 17 and 18 in Leicester Square, Piccadilly Circus and Soho. But, in an unusual twist, individuals could apparently ‘opt out’ of the trial, with the understanding that they wouldn’t be treated with suspicion if they did so.
This isn’t the first facial recognition trial in London. Similar operations have taken place at consecutive Notting Hill Carnivals, Remembrance Day services in 2017 and near Stratford in June and July of this year.
So, can we expect to see more facial recognition policing tech in Britain? How does the opt-out system work? And will this keep us safer, or has the long arm of the law grown too long?
What is London Trialling with Facial Recognition?
Facial recognition tech isn’t that new, and neither is its application in policing. In fact, it’s already being used across the US and in other areas around Britain. As with most emerging tech, it relies heavily on artificial intelligence (AI) and machine learning to be effective.
AI and machine-learning algorithms are trained to detect faces from live camera feeds. These faces are then, in the case of the Met’s trial, stored temporarily and checked against a database that contains a “watch list” of “offenders wanted by the police and courts for various offences.” If the computer detects a match, it should alert nearby officers and enable them to make a swift arrest.
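In outline, the matching step works by comparing a numerical representation of each detected face against stored representations of people on the watch list. The sketch below illustrates that idea only; the function names, the toy vectors and the similarity threshold are all assumptions for illustration, not details of the Met's actual system, which uses learned face embeddings produced by a neural network.

```python
# Illustrative sketch of watch-list matching. All names, vectors and the
# threshold are assumptions, not the Met's real system.
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(face_embedding, watchlist, threshold=0.9):
    """Return the best watch-list match scoring above `threshold`, else None.

    `watchlist` maps a person identifier to a stored embedding. A hit
    stands in for an alert to nearby officers; because of false
    positives, a human double-check is still needed before any arrest.
    """
    best_id, best_score = None, threshold
    for person_id, stored in watchlist.items():
        score = cosine_similarity(face_embedding, stored)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy 3-dimensional "embeddings" for illustration:
watchlist = {"suspect-001": [0.9, 0.1, 0.2]}
print(check_against_watchlist([0.88, 0.12, 0.21], watchlist))  # close match
print(check_against_watchlist([0.1, 0.9, 0.3], watchlist))     # no match
```

The threshold is the key tuning knob: set it too low and the system floods officers with false alerts; set it too high and it misses genuine matches.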
However, speaking to officers in Leicester Square, it became clear that the software being used is far from perfect. One of the officers noted how the software was not 100% accurate, and they would need to apprehend suspected matches and “double-check” before making an arrest.
This certainly corroborates findings by UK-based privacy group Big Brother Watch, whose May 2018 report, “Face Off”, noted that previous Met trials had a 98.1% false positive rate — meaning that less than two percent of the matches found by the software were actually matches.
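To make that figure concrete, the quick calculation below uses alert counts chosen to be consistent with the reported 98.1% rate; they are illustrative numbers, not figures taken from the report itself.

```python
# Quick arithmetic behind a "98.1% false positive rate". The counts are
# illustrative values consistent with that rate, not the report's own data.
total_alerts = 104   # alerts raised by the system
false_alerts = 102   # alerts that turned out to be wrong

false_positive_rate = false_alerts / total_alerts
true_match_rate = 1 - false_positive_rate

print(f"False positive rate: {false_positive_rate:.1%}")  # ~98.1%
print(f"Genuine matches:     {true_match_rate:.1%}")      # ~1.9%
```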
The actual operation consisted of a nondescript van parked in Leicester Square outside the Hippodrome Casino, with some small cameras perched on top. There were two small signs advertising the trial, one tied to a lamppost on Charing Cross Road and one next to the van. A handful of police officers were also in Leicester Square, but this itself is far from uncommon.
Most people, whether they were going to work, checking out the wonders of the M&Ms World, or finishing off their Christmas shopping, seemed completely unaware of the groundbreaking technology being trialled alongside them.
Opting Out of Facial Recognition
One of the interesting twists in this trial was that anyone could ‘opt out’ and decline to be scanned.
Anyone opting out would also “not be viewed as suspicious by police officers,” according to a Met press release, as “there must be additional information available to support such a view.”
On the ground, however, there was no mention of opting out. It was not explained on either sign that we saw, nor by the police officers we spoke to. Even the informational flyers distributed by police (which the officers had “run out of” when we spoke to them) made no mention of opting out.
However, if you visited the web page mentioned on the flyers and signs by typing “www.met.police.uk/advice/advice-and-information/facial-recognition/live-facial-recognition-trial/” into your browser, you would find a mention of the opt-out policy in the seventh paragraph.
There was no description of how to actually opt out, and you cannot navigate to the page directly from the Met Police’s homepage. Needless to say, this hardly constitutes a user-friendly path for understanding your right to opt out, nor for exercising it.
The Future of Facial Recognition
CCTV cameras are a common sight across the UK, with an estimated 500,000 cameras operating in London alone. Mass surveillance itself is nothing new to British society.
Facial recognition technology, however, has the potential to significantly augment the power of the existing surveillance apparatus in the UK. It would make it easier to spot wanted criminals, but would inevitably also keep an eye on ordinary people with no previous run-ins with the police.
However, if its current inaccuracy is at the level Big Brother Watch claims, it is hard to see the utility in operating costly and time-consuming trials. Innocent people have already been affected by this tech: as of May this year, police had staged interventions on more than 30 people on the basis of false positive matches in previous deployments.
Also concerning is whether ordinary people, with no criminal convictions or history with the police, will actually be able to opt out if this technology becomes widespread. In this trial, the Met stated that:
“The system will only keep faces matching the watch list, these are kept for 30 days, all others are deleted immediately. We delete all other data on the watch list and the footage we record.”
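The stated policy amounts to a simple retention rule: captures that matched the watch list are kept for 30 days, everything else is deleted immediately. The sketch below illustrates that rule only; the function and data structures are assumptions for illustration, not the Met's implementation.

```python
# Sketch of the retention rule the Met describes: watch-list matches are
# kept for 30 days, non-matches are deleted immediately. The names and
# record format here are illustrative assumptions.
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)

def process_capture(face_record, matched_watchlist, now=None):
    """Return the record stamped with an expiry date if it matched the
    watch list; returning None stands in for immediate deletion."""
    now = now or datetime.now()
    if not matched_watchlist:
        return None  # non-matches are deleted straight away
    face_record["delete_after"] = now + RETENTION
    return face_record

kept = process_capture({"id": "capture-42"}, matched_watchlist=True)
dropped = process_capture({"id": "capture-43"}, matched_watchlist=False)
print(kept, dropped)
```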
But with the average Londoner caught on camera 300 times per day, even this policy would amount to an almost constantly updating database of innocent people.
If used correctly, this technology has fantastic potential to ensure public safety. But, in a time of growing uncertainty over state surveillance, and given the current inaccuracy of the tech itself, an open debate about the rollout of police-led facial recognition campaigns feels essential.
Read more about facial recognition on Tech.co