Deployment of live facial recognition not a 'fishing exercise'
Lindsey Chiswick explained how the Met is deploying LFR during an evidence session with the Science, Innovation and Technology (SIT) committee.
Increased arrests are not the only benefit of live facial recognition (LFR), according to the Met's Director of Intelligence, who says its recent deployment at the Coronation is "one example" of the technology also being an effective deterrent.
Lindsey Chiswick gave evidence to the SIT committee on Wednesday alongside Dr Tony Mansfield, the Principal Research Scientist from the National Physical Laboratory (NPL) whose study - published in April - found there to be a 'substantial improvement' in the accuracy of LFR.
Both the Met and South Wales Police (SWP) deploy the technology.
The latter resumed its use after the NPL study was published, having previously stopped after the Court of Appeal ruled in 2020 that its use was unlawful.
Reservations remain despite this improvement, something Ms Chiswick acknowledged to the committee.
"I completely understand public concern around live facial recognition technology and AI...what I've tried to do with the introduction of facial recognition technology in the Met is do so in a careful, proportionate and as transparent a way as we possibly can."
She stressed that the technology is not used as part of any "fishing exercise", but to target areas where there's "public concern about high levels of crime".
The Met has used LFR six times this year so far; three vans were deployed during the Coronation, while there have been three deployments in Camden and Islington.
Deployment decisions are made based on the "intelligence case" justifying the use of LFR, which in each case sees those who walk past the cameras immediately assessed against a watch list.
Anyone who triggers a positive match may "potentially" be spoken to, or arrested, depending on the circumstances.
Ms Chiswick explained: "In live facial recognition, the watch list is bespoke and specific to deployment, and it's generated by looking at the intelligence case, so essentially the reason why we're using the technology in the first place.
"The watch list is then compiled based on crime types, based on people likely to be in the area at the time who are wanted, whether they're wanted for serious offences, wanted on court warrants, and then after that deployment is complete that watch list is deleted."
The only exception to this is when a positive match is recorded of an individual on the watch list. Such matches can be retained for 31 days, or longer if needed "for judicial purposes".
Ms Chiswick says that the images of individuals not on the watch list are pixelated on screen and then deleted, as part of what's known as a "privacy by design" function.
"So if you're an officer sitting in a van, looking at that succession of people coming past the cameras, you can't even see the individual faces - they're all pixelated."
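The lifecycle Ms Chiswick describes — a bespoke watch list built per deployment, positive matches retained for up to 31 days (longer only "for judicial purposes"), non-matches pixelated and discarded, and the watch list deleted when the deployment ends — can be sketched in Python. This is purely illustrative; all names and structures here are assumptions, not a description of the Met's actual system.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

RETENTION_DAYS = 31  # retention window described for positive matches


@dataclass
class Deployment:
    """Hypothetical model of the watch-list lifecycle described in the article."""
    watch_list: set[str]                              # bespoke, built from the intelligence case
    matches: dict[str, date] = field(default_factory=dict)

    def process_face(self, face_id: str, today: date) -> str:
        if face_id in self.watch_list:
            self.matches[face_id] = today             # positive match: image retained
            return "alert"                            # officer then decides what action to take
        return "pixelated-and-discarded"              # non-matches are not kept

    def close(self) -> None:
        self.watch_list.clear()                       # watch list deleted after deployment

    def purge_expired(self, today: date, judicial_hold: set[str] = frozenset()) -> None:
        # matches kept 31 days, longer only for judicial purposes
        self.matches = {
            fid: d for fid, d in self.matches.items()
            if fid in judicial_hold or (today - d) <= timedelta(days=RETENTION_DAYS)
        }


d = Deployment(watch_list={"suspect-1"})
print(d.process_face("suspect-1", date(2023, 6, 1)))   # alert
print(d.process_face("passer-by", date(2023, 6, 1)))   # pixelated-and-discarded
d.close()
d.purge_expired(date(2023, 7, 10))                     # 39 days later: match purged
print(d.matches)                                       # {}
```

The key design point in the described process is that retention is the exception: only watch-list matches persist at all, and even those expire unless a judicial hold applies.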
In recognition of the lingering concerns over LFR, any officer slated for involvement in its use has to undertake specific training.
"That bespoke training session is led by one of my officers who is an expert in facial recognition technology, and they run through the kit and how it works; they run through police powers and remind them of the need to make their own decisions at the point of getting that match..."
According to Ms Chiswick, officers are also reminded to be aware of the potential impact of demographic differentials.
In terms of what those six deployments achieved, the Director of Intelligence said there were four true alerts and zero false alerts, alongside two arrests.
"The others were correct identifications, but it was decided that arrest was not a necessary action," she added.
Though increased arrests are one benefit, Ms Chiswick is clear that this is not what defines this technology.
Describing LFR as a "precision-based community crime-fighting tool", she believes it acted as a deterrent during the Coronation because its intended use was widely publicised beforehand.
Acknowledging that the Met is still at the early stages of a "careful" journey with this technology, Ms Chiswick's view is that its use is ultimately a debate about "privacy vs security".
She said: "Personally, I am relaxed about my image, my biometrics being momentarily taken, compared against a watch list, pixelated in the back of the van so people can't see my face...and if that helps prevent crime and stops a wanted offender being on the streets, I'm fine with that."