
I completely understand why people feel strongly about the use of facial recognition software in public places. As digital technology becomes increasingly prevalent in our day-to-day lives, it is critical that it is not misused and that our privacy is protected.
The use of biometric data (including facial images) by private companies to identify individuals is already regulated by the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018. Under this legislation, data processing must be fair, lawful and transparent. In addition, individuals who believe their data has been misused can make a complaint to the Information Commissioner’s Office (ICO), the independent regulator responsible for enforcing this legislation.
In June 2021, the ICO published its Opinion on the use of live facial recognition technology in public places, which made clear that controllers deploying such technology must comply with the relevant UK GDPR requirements. This includes identifying a specified, explicit and legitimate purpose for using the technology in a public place, ensuring a valid lawful basis, and considering alternative measures to demonstrate that the purpose cannot reasonably be achieved by less intrusive means.
The ICO continues to keep a close watch on companies that fail to observe relevant data protection law when it comes to facial recognition. For example, in May 2022, the ICO fined Clearview AI over £7.5 million for using images of people in the UK, and elsewhere, collected from the web and social media to create a global online database that could be used for facial recognition.
Please be assured I will be following developments in this area closely.
Regarding police use of facial recognition, the Government should be committed to ensuring the police have the tools and technology they need to solve and prevent crimes, bring offenders to justice, and keep people safe. Technology such as facial recognition can help the police quickly and accurately identify those wanted for serious crimes, as well as locate missing or vulnerable individuals. This technology has already assisted in catching criminals, including murderers and rapists, more swiftly and accurately. In the debate held on 13 November 2024, Chris Philp MP, the Shadow Home Secretary, explained,
‘live facial recognition starts with a watchlist of people who are wanted by the police. It is not the case that anyone can get on that watchlist, which generally comprises people who are wanted for criminal offences—often very serious offences—people who have failed to attend court, and people who are registered sex offenders, where the police want to check that they are complying with their conditions…The vast majority of people are not on the watchlist, as we would expect, and their image is immediately and automatically deleted. Where a person is on the watchlist, the police will stop them and ask if they have any form of identification.’
Dame Diana Johnson, the Minister for Policing, Fire and Crime Prevention, stated that between January and November 2024, the Metropolitan police,
‘made over 460 arrests as a result of live facial recognition deployments, including for offences such as rape, domestic abuse, knife crime and violent robbery. In addition, over 45 registered sex offenders have been arrested for breaching their conditions. South Wales police tell me that between January and November, they deployed live facial recognition locally on 20 occasions, resulting in 12 arrests. They also located a high-risk missing young girl, who they were able to safeguard from child sexual exploitation and criminal exploitation.’
This technology also frees up police time and resources, allowing more officers to be out on the beat, engaging with communities, and carrying out complex investigations.
As I understand it, currently only the Metropolitan Police Service and South Wales Police have a permanent live facial recognition (LFR) capability. Where a live facial recognition system does not generate an alert, the person's biometric data is immediately and automatically deleted by the LFR software. The LFR watchlist is deleted as soon as reasonably practicable, and in any case within 24 hours of the conclusion of the deployment. I was particularly interested to learn about the high level of accuracy this technology can now provide across different demographic groups, and that it is
‘4,500 times less likely to result in someone being inappropriately stopped than a regular stop and search’
I was also reassured to learn about the safeguards that are already in place. For example,
‘facial recognition is covered by data protection, equality and human rights law as well as common law powers and detailed guidance from the College of Policing.’
It was also noted in the debate that ‘the police do not keep the biometric data of people filmed during live facial recognition deployments, that watchlists are bespoke and that the police deploy the technology only when there is an intelligence case for doing that. I have also been assured that there will always be a human being in the loop to decide whether to apprehend someone.’
However, I understand and share concerns about its use going forward; it is right to ensure that any new technological developments are rigorously tested and not used beyond what is necessary. I am aware that the Court of Appeal in the Bridges case in 2020 found that there is a legal framework for police to use LFR, but that South Wales Police did not fully comply with privacy, data protection and equality laws during two of their LFR pilots. Since then, the police have said that they have addressed the Court’s findings.
Dame Diana Johnson MP, speaking for the Labour Government, finished by saying that, although the roll-out of live facial recognition vans equipped to carry out this work is ongoing,
‘facial recognition technology is a powerful tool. In considering its current and future use, we must balance privacy concerns with the expectation that we place on the police to keep our streets safe. We particularly need to consider how much support the police may require from Government and Parliament to set and manage the rules for using technologies such as facial recognition. We must think about how we protect the public from potential misuse of those technologies, and we need to consider how the application of the rules and regulations is scrutinised.
I am therefore committed to a programme of engagement in the coming months to inform that thinking.’
I will continue to press the Government to ensure that the police use this technology appropriately and that there are sufficient safeguards in place, and I look forward to the publication of any reports or recommendations from the Minister.
I have provided a link to the transcript below, as you may be interested to read the full debate: