Britain’s biometrics watchdogs have warned that national oversight of AI-powered face scanning to catch criminals is lagging far behind the technology’s rapid growth.
With the Metropolitan police almost doubling the number of faces they scan in London over the past 12 months, and retailers across the UK increasingly adopting the technology, Prof William Webster, the biometrics commissioner for England and Wales, said the “slow pace of legislation was trying to catch up with the real world” and “the horse had gone before the cart”.
Dr Brian Plastow, who holds the same role in Scotland, warned the technology was “nowhere near as effective as the police claim it is” and said there was a “patchwork legal framework” throughout the UK. He said in England and Wales, police were “really just marking their own homework”.
The watchdogs said new laws were needed to govern when and how police forces used live facial recognition technology, with a new regulator to clamp down on misuse.
Several bodies have oversight of the technology, including the Information Commissioner’s Office (ICO) and the Equality and Human Rights Commission.
The Home Office is considering a new legal framework for the technology as it also plans to introduce nationally what it calls “the biggest breakthrough for catching criminals since DNA matching”.
Members of the public wrongly labelled as suspected criminals by shops using AI cameras said there was no accountability or recourse to complain. They said the system had left them feeling “guilty until proven innocent”.
They described the ICO, which is responsible for monitoring facial recognition tech and the biometric data it uses, as “toothless” and unresponsive.
British police forces and high street retailers claim the technology makes streets safer, but others criticise it as Big Brother-style mass surveillance, with risks for civil liberties and data privacy.
So far this year the Met has scanned more than 1.7 million faces in London hunting for suspects on watchlists, up 87% on the same period in 2025.
It has also emerged:
- An independent audit of the Met’s use of facial recognition technology (FRT) has been indefinitely postponed after the police requested delays.
- Polling shows 57% of people believe the systems are “another step towards turning the UK into a surveillance society”.
- A whistleblower claimed shop-based face-scanning systems had sometimes been misused by shop or security staff “maliciously” adding members of the public to watchlists.
Webster said: “We could be talking three years, at a minimum, before regulation is in place and active. And we already have a rollout of live face recognition in a dozen different police forces.
“The technology is becoming cheaper and cheaper, and in time we will see it everywhere, including in the static surveillance camera network.”
In February, The Guardian revealed how police arrested a man for a burglary in a city he had never visited after face-scanning software deployed across the UK confused him with another person of south Asian heritage.
Several other people have told The Guardian about the impact of being misidentified by face-scanning software increasingly used by retailers to fight shoplifting.
Further concern about limited scrutiny of the fast-developing technology has been caused by the postponement of the ICO’s planned audit of the Met’s use of AI-powered face scanning to find wanted criminals.
The ICO, which is the UK’s data regulator, had scheduled the investigation for October last year. But the Met asked for it to be pushed back and it is no longer certain it will go ahead, according to emails obtained by The Guardian under the Freedom of Information Act.
They show the Met cited as reasons for delay its need to handle a legal challenge to its face-scanning policy, about which a court ruled in its favour last week, officers taking Christmas leave and the burden of policing new year festivities.
The ICO accepted the request, prompting claims that the regulator is being “insufficiently aggressive”.
David Davis MP, the former shadow home secretary and a civil liberties campaigner, said: “[FRT] is a massive development with all sorts of implications. The ICO should be the defender of the ordinary citizen and should be far more aggressive in what it does.”
The ICO and the Met said the timing of the judicial review meant it was appropriate to postpone the proposed audit.
The Met said: “We have always been transparent about our use of facial recognition technology and welcome independent scrutiny.” The ICO said it was reviewing whether the audit would be rescheduled.
Polling of 2,000 adults last month by Opinium found that nearly a third opposed the use of facial recognition by retailers. In addition, 62% worried about the technology getting people into trouble for things they had not done, according to the poll, commissioned by Face Int, a biometric security company.
Face-scanning software is being increasingly used by retail chains to target shoplifters and antisocial and violent behaviour in stores. Sainsbury’s, Budgens and Sports Direct are among the chains using Facewatch in some shops.
The technology analyses CCTV footage and compares faces against a private database of known offenders, alerting staff when a match is made.
Big Brother Watch, a civil liberties campaign group, said it had been contacted by 21 people during the past year who believed they had been wrongly placed on watchlists or misidentified.
Ian Clayton, a retired health and safety professional from Chester, was asked to leave Home Bargains in February after being told he had been flagged on a facial recognition system as a thief. He later found out he had been wrongly associated with a shoplifter he had happened to stand next to on a previous visit.
“It feels very Orwellian,” he said. “We’re constantly being recorded and put on these systems but should we be there? It feels like spying without cause. It left me feeling vulnerable, exposed and a little bit helpless. I’m hyper-aware of cameras now.”
The same thing happened to Warren Rajah, a data strategist in south London, on a visit to Sainsbury’s. “This is a civil rights issue that we are slow-waltzing into,” he said. “We know cameras cannot pick up features of people that have darker features with as much accuracy.”
Meanwhile, a whistleblower has claimed the systems have sometimes been misused by shop or security staff “maliciously” adding members of the public to watchlists even though they have not been caught doing anything wrong.
Paul Fyfe, a former security guard who worked using Facewatch cameras in Stockton-on-Tees until last September, said in some cases staff had tagged members of the public on watchlists even when they had not been caught shoplifting or committing violence.
“If you’ve got someone there that you’re pissed off with, that you can’t catch or you’re getting chew off [being hassled] or they are threatening you, the easiest way to harm them is to upload them on the system,” he said. “[On] 10 to 15 occasions, I know people have been tagged for malicious reasons.”
The result was that security guards at other stores using the same software would be alerted whenever those people entered.
Facewatch’s CEO, Nick Fisher, said: “We do not recognise the claims that the incident reporting system is being misused, including the serious allegation that individuals are being added maliciously.
“The system has been purposely designed not to allow misuse, and we have strict rules governing how the system can be used, with safeguards and controls built in. Retailers must meet clear evidential standards before submitting a record, and every submission is subject to human review before any individual is added to the database. If a submission does not meet the required standard, it is rejected and returned to the retailer.”
And Jessica Murray also writes:
When Ian Clayton, a retired health and safety professional from Chester, popped into Home Bargains one February lunchtime, he was suddenly approached by a stern-looking member of staff.
“Excuse me, can you please put everything down and leave the shop now?” she said. Clayton recalled how he was stunned, and it was only as he was briskly walked past the tills towards the exit that he stopped to ask what he had done.
“You’ve come up on our system called Facewatch as a shoplifter,” came the reply. “There’s a poster in the window.” With that, he was left outside the shop alone, with a QR code to scan and no idea what had happened.
He is one of a number of people who have spoken to The Guardian after being falsely identified as a thief by shops using Facewatch, a live facial recognition system being rolled out across the UK to clamp down on retail crime.
The company’s website claims that its system has a 99.98% accuracy rate and that last month it sent 50,288 alerts of “known offenders” to shops including B&M, Home Bargains, Sports Direct, Farmfoods and Spar, which all now use the software.
But those who have been wrongly identified and forced to leave shops, either via the technology itself or human error, say they were given no support, and did not know how to complain about their treatment or prove their innocence.
Clayton, 67, said that after he was ejected from Home Bargains he tried calling a phone number on a Facewatch poster, and was sent through to a message saying the company did not take calls and he had to send an email instead.
He was only able to get answers after submitting a subject access request – a formal request under data protection laws for personal information – that revealed he had been incorrectly associated with a shoplifting incident on a previous visit to the shop.
“It was like I was guilty until proven innocent. It’s an awful feeling. It leaves a pit in your stomach and when I look back now I can feel it again,” he said.
“It feels very Orwellian. We’re constantly being recorded and put on these systems but should we be there? It feels like spying without cause. I’m hyper-aware of cameras everywhere now, I’m so aware of them.”
Home Bargains eventually issued him an apology and a £100 voucher as a “gesture of goodwill without admission”, on the condition that the details of the incident remain confidential. Clayton declined: “I just thought: ‘Really, you’re trying to buy my silence?’”
Last year, the Home Office admitted facial recognition cameras were more likely to incorrectly identify black and Asian people than their white counterparts, and women more than men, and there have been conflicting studies on their overall accuracy.
“For me, this is a civil rights issue that we are slow-waltzing into because if you are just removed without question, your civil rights are being impacted,” Rajah said.
“We already live in a country that has issues with racism, it’s an unavoidable issue. And we know cameras cannot pick up features of people that have darker features with as much accuracy. And this could be happening to people who are much more vulnerable than me.”
He said he had major concerns about this technology being rolled out in police forces, as well as in the retail sector.
“Who is regulating these companies and can they be trusted with our information? And more importantly, no one has actually defined what your recourse is when something goes wrong,” he said.
After countless emails, he eventually found out he was not on the Facewatch database system and staff members had misidentified him. He was offered a £75 voucher as an apology – when he said he did not feel comfortable returning to the store, he was told to use it online.
Jennie Sanders, 48, from Birmingham, was browsing in B&M on a Saturday afternoon last year when a security guard told her she had been flagged up on the Facewatch system and he had to escort her around the store to check she was not stealing.
“I was really upset. It was in front of loads of people, and I was really embarrassed. I said I wanted to leave and he escorted me out of the shop,” she said.
“It was scary but what was more scary was when I got home and started looking into Facewatch, I saw they share the information between loads of retailers.
“I thought: ‘I’m going to be treated like a shoplifter in every store. I’m not going to be able to do any shopping in person ever again.’”
She was told she had to send a copy of her passport to Facewatch to prove her identity before she could find out that she was on the system for stealing a bottle of wine from B&M, which she said never happened.
B&M told her it no longer had any evidence, including CCTV footage from the day, so she was taken off the system and offered a £25 voucher.
“I took a couple of days off work, I was absolutely beside myself. Why was I on a database of criminals without my knowledge?” she said. “I’m never going into B&M again. I try to stay away from places with cameras at all – it has really affected me.”

Sanders said she complained to the Information Commissioner’s Office (ICO), the formal watchdog monitoring how personal information is being used in facial recognition technology, but seven months later she had yet to hear back.
She added: “We’re told to raise complaints and send all correspondence to the information commissioner, but they don’t get back to you. What the hell is happening with any sort of response to the victims of this?”
Rajah had also considered complaining to the ICO, but could find no information on how to do so.
“They are so toothless,” he said. “And this issue has been well reported, and they haven’t publicised a formal complaints process. Where’s that information? How can you complain when there are no avenues to follow?”
A Sainsbury’s spokesperson said: “We have sincerely apologised to Mr Rajah for his experience in our Elephant and Castle store. This was not an issue with the facial recognition technology in use but a case of the wrong person being approached in store.
“The Facewatch system has a 99.98% accuracy rate and all matches are reviewed by trained managers, with additional training provided after this incident to ensure our safeguards are consistently followed.”
Nick Fisher, the chief executive of Facewatch, said: “We are aware of the matters referenced and in each case, we acted promptly once they contacted the Facewatch data protection team.
“These cases relate to human error in the way processes were carried out in-store, rather than any failure of Facewatch’s technology. We are sorry these individuals experienced being challenged while shopping and understand why this would have been upsetting.
“These three errors are extremely rare cases when viewed in the context of the more than 500,000 alerts we send to retailers each year, but we recognise that any mistake is upsetting for the individual concerned. The system is designed to support, not replace, human decision-making.”
A spokesperson for the ICO said: “We recognise the harm and upset that can be caused by misidentification. For this reason, use of facial recognition technology must strictly comply with data protection law and be handled with care and transparency.
“If someone has concerns about how their data has been collected, used, or shared, and those concerns cannot be resolved with the retailer directly, they have the right to raise a complaint with us.
“We also continue to actively regulate in this area and will be publishing further retail‑focused guidance to support retailers in understanding and meeting their data protection obligations, while ensuring the public is properly protected.”
Home Bargains and B&M declined to comment.