By Ceren Sagir
A Metropolitan Police scheme to snoop on Londoners using live facial-recognition (LFR) technology is “dangerous” and a “threat to human rights,” privacy campaigners warned today.
Warnings flooded in after the Met announced it would begin deploying the technology, which has failed multiple trials, across the capital’s streets and claimed the measure would help fight serious crime.
LFR technology uses special cameras to scan the structure of faces in a crowd. It then creates a digital image and compares the result against a “watch list.” If the cameras flag a person of interest, officers will approach them.
Big Brother Watch (BBW) executive director Silkie Carlo said the move represents “an enormous expansion of the surveillance state and a serious threat to civil liberties.”
Ms Carlo said: “It flies in the face of the independent review showing the Met’s use of facial recognition was likely unlawful, risked harming public rights and was 81 per cent inaccurate.
“This is a breathtaking assault on our rights and we will challenge it, including by urgently considering next steps in our ongoing legal claim against the Met and the Home Secretary.
“This move instantly stains the new government’s human-rights record and we urge an immediate reconsideration.”
Liberty’s Clare Collier called the decision a “dangerous, oppressive and completely unjustified move.”
She said: “Facial-recognition technology gives the state unprecedented power to track and monitor any one of us, destroying our privacy and our free expression.
“Rolling out an oppressive mass-surveillance tool that has been rejected by democracies and embraced by oppressive regimes is a dangerous and sinister step, pushing us towards a surveillance state in which our freedom to live our lives free from state interference no longer exists.”
Police in London will start using the LFR cameras “within a month,” the Met said.
Green Party co-leader Sian Berry called the deployment a “shoddy move” by the Met, coming after failed trials and without any change to the legal or ethical framework.
And Amnesty International warned that the tech “poses a huge threat to human rights,” including the rights to privacy, non-discrimination, freedom of expression, association and peaceful assembly.
The Runnymede Trust also highlighted research showing that the technology frequently misidentifies black people and women.
The Information Commissioner’s Office said the tech has “potentially significant privacy implications” and reiterated its call on the government to introduce a statutory and binding code of practice “as a matter of priority.”
The Met has used the technology multiple times, including at Notting Hill Carnival in 2016 and 2017, Remembrance Day in 2017, and Port of Hull docks, assisting Humberside Police, in 2018.
Trials also took place in locations including Westfield shopping centre in Stratford and the West End.
The force claims the technology has a very low failure rate, with the system generating a false alert in only one of every 1,000 cases.
However, last year, University of Essex researchers, using a different metric, found that only eight of the 42 matches made across the six trials they evaluated were correct.
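The two figures measure different things: the Met’s rate counts false alerts per face scanned, while the Essex researchers counted what share of the alerts generated were correct. A minimal sketch with hypothetical numbers (the scan volume is invented purely for illustration) shows how both can hold at once:

```python
# Illustrative only: a low per-scan false-alert rate and a low share of
# correct alerts can both be true when genuine matches are rare.

def alert_stats(faces_scanned, false_alert_rate, true_matches_found):
    """Return (false alerts, total alerts, share of alerts that are correct)."""
    false_alerts = faces_scanned * false_alert_rate
    total_alerts = false_alerts + true_matches_found
    precision = true_matches_found / total_alerts
    return false_alerts, total_alerts, precision

# Hypothetical deployment: 42,000 faces scanned, a 1-in-1,000 false-alert
# rate, and 8 genuine watch-list matches (figures chosen for illustration).
false_alerts, total_alerts, precision = alert_stats(42_000, 1 / 1_000, 8)
print(false_alerts)            # 42.0 wrong alerts
print(round(precision * 100))  # 16 — only 16% of alerts were correct
```

Under these assumed numbers the system is wrong on just 0.1 per cent of faces scanned, yet most of the alerts it raises still point at the wrong person.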
Scotland Yard said the public will be made aware that they are under surveillance, with officers handing out leaflets.
Assistant Commissioner Nick Ephgrave said the force is “in the business of policing by consent” and believes it is effectively balancing individuals’ right to privacy with crime prevention.
In September, the High Court ruled that the use of LFR by South Wales Police was lawful, after an activist argued that having his face scanned caused him “distress” and violated his privacy and data-protection rights.
Ed Bridges, from Cardiff, brought the challenge after claiming his face was scanned while he was doing his Christmas shopping in 2017 and at a peaceful anti-arms protest in 2018.
Mr Bridges said he would appeal against the decision, and the appeal is due to be heard in June. (IPA Service)
Courtesy: Morning Star