Data privacy campaigners are urging the UK government to clamp down on facial recognition and pass new legislation to regulate the widespread technology used by both police forces and the private sector.
A new report published on Thursday by the Ada Lovelace Institute, an independent research body focused on the ethics of data and artificial intelligence, warned of “significant gaps and fragmentation across biometrics governance”.
The group called on Sir Keir Starmer’s government to provide “clarity on the limits, lawfulness and proportionality of biometric systems” and create a new regulator to enforce stricter rules.
Privacy campaigners argue that the absence of clear rules around “live” face scanning systems — capable of immediately matching images to a database of individuals — has made the UK a “wild west”. At the same time, new developments in artificial intelligence threaten to make the technology even more powerful, affordable and widespread.
Almost 5mn faces were scanned by police forces in the UK last year, leading to more than 600 arrests, according to police records compiled by Liberty Investigates, the investigative arm of the human rights group Liberty. The technology is also being installed in shops and sports stadiums.
However, the legality of such deployments is in “serious question”, according to Ada Lovelace researchers. The Court of Appeal ruled in 2020 that South Wales Police’s use of facial recognition technology broke privacy and data protection laws.
The body also warned of “fundamental deficiencies” in the UK’s legal framework.
“The highly fragmented nature of biometrics governance makes it very difficult to know if police use of [facial recognition technology] is lawful,” Thursday’s report said.
It added that a “new generation of technologies claims to infer a person’s emotions, intentions, attention, truthfulness and other internal human states. The ability to appropriately manage the risks of these technologies has not matured alongside this growing appetite for use.”
Sarah Simms, senior policy officer at Privacy International, said this “legislative void” had made the UK an “outlier” in oversight of the technology.
While arguing that the technology is already governed by human rights and data protection laws, UK policing minister Dame Diana Johnson said this month that she “fully accept[s] . . . that there is a need to consider whether a bespoke legislative framework governing the use of live facial recognition technology for law enforcement purposes is needed”.
Johnson added she would outline the government’s plans “in the coming months”.
During a parliamentary debate in November last year, she acknowledged “very legitimate concerns”, while also hailing the technology as “transformational for policing”.
Retail stores have also increased their take-up of facial recognition cameras following a sharp rise in shoplifting and assaults on shop staff.
Southern Co-op, Budgens and Sports Direct all employ the technology at various stores throughout the UK. This spring the supermarket chain Asda launched a trial of the cameras at five locations in Greater Manchester.
Most retailers insist that facial recognition capabilities are limited to identifying criminals whose biometric data is already available. A number of businesses also share CCTV footage with police nationally through a data-sharing arrangement known as Project Pegasus, which allows police to cross-reference images against a facial recognition database.
Critics said the technology’s use in public spaces interfered with people’s right to protest and had already resulted in innocent people being misidentified as shoplifters.
“Live facial recognition is extremely invasive,” said Simms. “It does require specific safeguards due to the nature of how the technology operates and the implications for people’s human rights.”
Charlie Welton, a policy and campaigns officer at Liberty, said: “The UK is massively behind in regulating facial recognition technology, especially compared to Europe and the US where limits have already been put in place.”
The EU’s AI Act and laws in several US states have banned many applications of live facial recognition systems.
“We’re in a situation where we’ve got analogue laws in a digital age,” Welton added.
The Home Office said: “Facial recognition is an important tool in modern policing that can identify offenders more quickly and accurately.”