Baroness Jones and the campaign group Big Brother Watch have issued proceedings in the High Court after claiming that the technology breaches human rights and signals a “slippery slope towards an Orwellian society”.
They are seeking permission for a judicial review of Scotland Yard’s use of automatic facial recognition software, which was found to be returning “false positives” in 98 per cent of alerts earlier this year.
Britain’s largest force said images that do not generate a potential match with police databases are deleted immediately and confirmed that anyone refusing to be scanned “will not be viewed as suspicious” ahead of a fresh trial in Stratford on Thursday.
The legal challenge, officially lodged against the home secretary, Sajid Javid, and the Metropolitan Police commissioner, Cressida Dick, aims to stop police using the software.
It argues that the use of automatic facial recognition violates articles 8, 10 and 11 of the European Convention on Human Rights – guaranteeing the rights to private life, freedom of expression, assembly and association – and is neither proportionate nor necessary.
Silkie Carlo, director of Big Brother Watch, said the technology was subjecting “thousands of people in the area to highly sensitive identity checks without consent”.
“Facial recognition cameras are not only authoritarian, they’re dangerously inaccurate,” she added. “The use of this technology by the police risks taking us down a slippery slope towards an Orwellian society.”
Baroness Jones said: “This new form of surveillance lacks a legal basis, tramples over civil liberties, and it hasn’t been properly debated in parliament.
“The idea that citizens should all become walking ID cards is really the antithesis to democratic freedom.”
The information commissioner threatened legal action over facial recognition in May, calling it “intrusive” and demanding answers to questions over transparency, accuracy, bias, effectiveness and a lack of national coordination.
Liberty is backing a separate attempted challenge against South Wales Police by a Cardiff resident who believes his face was scanned at a peaceful anti-arms protest and while doing his Christmas shopping.
On Thursday cameras are to film the public for eight hours in Stratford, east London, in one of 10 trials planned over the coming months.
A test carried out next to the Westfield Stratford shopping centre last month resulted in no arrests, and The Independent found that police officers were not giving out leaflets promised by commanders to alert passersby.
The vast majority of people walking past the cameras appeared not to see posters saying facial recognition technology was being used in the area.
Scotland Yard has again vowed that the “technology will be used overtly and information leaflets will be disseminated to the public” and said posters will be displayed for the new trial, which will run for an extended period.
Police will be hunting for wanted violent criminals as they attempt to crack down on rising violence, with officers reviewing potential matches before deciding whether to move in.
Detective superintendent Bernie Galopin said the outcome of trials would be subject to an independent evaluation at the end of the year.
“The deployment of these cameras and targeting of individuals will be intelligence-led and temporary,” he added.
“Only images that come up as a match to a targeted individual will be retained for a limited period. The use of live facial recognition technology aims to support standard policing activity to ensure everyone’s safety.”
The London Policing Ethics Panel, which commissioned a public opinion survey on the issue, has found that the Metropolitan Police retains recordings of potential matches for 30 days for “technical assessment” before they are deleted.
A report published last week concluded that there was a “lack of clarity about the legal basis for the use of the technology and its regulation” and called for Scotland Yard to publish its view on its legality before any further trials.
The panel also said information should be found “quickly and easily” on the force’s website and told police to inform Londoners of the reason for trials, and consult them on where they will be carried out.
Dr Suzanne Shale, who chairs the panel, called for its recommendations to be implemented before any further trials were launched, adding: “The technology is potentially of value for policing, but these trials have raised important questions about how citizens are involved in testing powerful new digital technologies, and in subsequent decisions whether to adopt them.”
The Independent could not find any evidence of the recommendations being carried out.
Commander Ivan Balhatchet said a “comprehensive legal framework” would be published on the force’s website within the next fortnight.
“The new webpage will also provide information about why the Met is trialling the technology, where and when it has been used and how we will engage with Londoners during the deployments,” he added.
The locations of the trials change to test the technology in a variety of different scenarios and deployments do not focus on particular communities, police said.
Representatives from the Mayor’s Office for Policing and Crime (MOPAC) and the London Ethics Panel have been invited to observe the next deployment.
Senior officers say facial recognition will be “an extremely valuable tool to help keep London and its citizens safe” and have welcomed the prospect of dedicated legal and ethical frameworks.
Opponents argue that the software currently being used by British police forces is “staggeringly inaccurate” and has a chilling effect on society, while supporters see it as a powerful public protection tool with the ability to help track terrorists, wanted criminals and vulnerable people.
The Metropolitan Police has previously used facial recognition at Notting Hill Carnival, Remembrance Day services and at the Port of Hull, while South Wales Police used it at events including the Champions League final.
The government has announced the creation of a new oversight and advisory board for facial recognition in law enforcement, which could be expanded to ports, airports, custody suites and on police mobile devices.
The Home Office’s biometrics strategy, which also covers fingerprints and DNA, said the board would make recommendations on policy changes and oversight arrangements for the technology, which is currently being purchased ad hoc by individual police forces.
The Independent