London police will begin using facial recognition cameras to pick suspects out of street crowds in real time, a major deployment of the controversial technology that raises concerns about automated surveillance and the erosion of privacy rights.
The Metropolitan Police Service said Friday that, after a series of trials, the cameras will be put to work within a month in operational deployments lasting around five to six hours at likely crime hot spots. The locations will be chosen based on intelligence, but police did not say where, at how many sites or with how many cameras.
Real-time crowd surveillance by British police is among the most aggressive uses of facial recognition in wealthy democracies and raises questions about how the technology will enter people's daily lives. Authorities and private companies are eager to adopt facial recognition, but human rights groups say it threatens civil liberties and represents an expansion of surveillance.
London's decision to deploy the technology defies warnings from rights groups, legislators and independent experts, said Amnesty International researcher Anna Bacciarelli.
"Facial recognition technology poses a major threat to human rights, including rights to privacy, non-discrimination, freedom of expression, association and peaceful assembly," said Bacciarelli.
London police said the facial recognition system, built on technology from Japan's NEC, scans crowds for faces to see if they match any on "watch lists" of up to 2,500 people wanted for serious and violent offenses, including gun and knife crime and sexual exploitation of minors.
"As a modern police force, I think we have a duty to use new technologies to keep people safe in London," assistant commissioner Nick Ephgrave said in a statement.
Britons have long been accustomed to video surveillance, with cameras used in public spaces for decades by security forces fighting terrorist threats. Real-time monitoring will test that tolerance.
London is the sixth most monitored city in the world, with almost 628,000 surveillance cameras, according to a Comparitech report.
The London move comes after a British High Court ruling last year authorized a similar deployment by South Wales police, which has used the technology since 2017 to monitor major events such as soccer matches, royal visits and air shows. That system automatically deleted people's biometric data after scanning.
Britain's privacy commissioner, Elizabeth Denham, who had warned police against treating that ruling as blanket approval, responded cautiously on Friday.
She said that while London police have stated they are implementing safeguards and transparency measures to protect privacy and human rights, "it is difficult to comment further on this until we have a real deployment and can analyze the details of it."
Signs will warn passersby about the cameras, and officers will hand out leaflets with more information, police said, adding that the system is not linked to any other surveillance network.
London police previously conducted a series of test deployments that, they say, identified seven of every 10 wanted suspects who walked past the camera, while incorrectly flagging only one in 1,000 people. But an independent review conducted last year by professors at the University of Essex disputed that, saying the trials raised concerns about their legal basis and the equipment's accuracy, with only eight of 42 matches verified as correct.
Pete Fussey, a professor at the University of Essex who co-authored the report, said NEC has since updated its algorithm, but there is evidence the technology is not fully accurate, pointing to a recent U.S. government laboratory test of nearly 200 algorithms that found most exhibit ethnic bias.
"If you are using the algorithm, you should consider its shortcomings," he said. "It is very unlikely that the NEC algorithm will be equally effective across all ethnic categories."