Police tweak ‘biased’ facial recognition software
A police force has paused the use of live facial recognition (LFR) cameras after a study found the technology was statistically more likely to correctly identify black people than people from other ethnic groups.
Essex Police has used the technology since summer 2024, but the study identified “a potential bias in the positive identification rate” of black people over white people on its watchlist.
The force said that following updates to its algorithm and software, it was confident that LFR cameras could be deployed again.
But campaign group Big Brother Watch said the technology was “authoritarian, inaccurate and ineffective in equal measure”.
Essex Police said it commissioned two independent studies into its use of LFR – carried out by the University of Cambridge – with one of them indicating the potential bias.
In that study, 188 volunteers acted as members of the public in a controlled field experiment during a real police deployment, with the system correctly identifying about half the people on the watchlist who passed the cameras.
But the report said the system was more likely to correctly identify men than women, and “it was statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”.
The second study – which analysed more than 40 deployments of LFR technology between August 2024 and February 2025 – found it had scanned approximately 1.3 million faces in public spaces.
It said that officers intervened 123 times to speak to people and carried out 48 arrests – roughly one for every 27,000 faces scanned – and there was one confirmed mistaken intervention.
Source: BBC
