UK Police Lobbied to Use Biased Facial Recognition Technology

December 10, 2025 06:31 AM
Documents reveal police have known for more than a year that the system was biased. Photograph: Leon Neal/Getty Images

Police forces successfully pushed for the use of a facial recognition system known to show bias against women, young people, and people from ethnic minority backgrounds—after complaining that a more accurate setting produced fewer possible suspects.

UK forces rely on the Police National Database (PND) for retrospective facial recognition searches, where a “probe image” of a suspect is matched against more than 19 million custody photos.

The Home Office acknowledged last week that the system is biased. A National Physical Laboratory (NPL) review found it was far more likely to return false matches for Black and Asian individuals and for women than for white men. Officials said they had responded to the findings.

However, documents obtained by the Guardian and Liberty Investigates show that police leadership had known about this bias for over a year—and had lobbied to overturn an initial decision intended to reduce it.

Police chiefs were informed in September 2024 that the algorithm was skewed, after an NPL review found it was more likely to suggest false matches for women, Black people and anyone aged under 40.

The National Police Chiefs’ Council (NPCC) initially raised the system’s confidence threshold to reduce that bias significantly.

But this decision was reversed the following month after police forces complained that the higher threshold resulted in fewer “investigative leads”. NPCC records reveal that the share of searches returning potential matches fell from 56% to 14% under the more cautious setting.
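The trade-off described in the NPCC records can be illustrated with a toy sketch. All names, scores, and threshold values below are hypothetical and do not represent the PND's actual algorithm or settings; the point is only that raising a similarity cut-off shrinks the candidate pool, trading investigative leads for fewer false positives.

```python
# Illustrative sketch only: hypothetical similarity scores, not the PND system.
def candidate_matches(scores, threshold):
    """Return gallery entries whose similarity score clears the threshold."""
    return [name for name, score in scores if score >= threshold]

# Hypothetical probe-vs-gallery similarity scores in [0, 1].
scores = [
    ("photo_a", 0.93),
    ("photo_b", 0.71),
    ("photo_c", 0.65),
    ("photo_d", 0.58),
]

low = candidate_matches(scores, 0.60)   # permissive setting: more leads, more false positives
high = candidate_matches(scores, 0.90)  # cautious setting: fewer leads, fewer false positives
```

In this toy example the permissive threshold returns three candidates while the cautious one returns a single candidate, mirroring how a drop in returned matches can arise purely from where the cut-off sits, without any change to the underlying algorithm.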

Although the Home Office and NPCC declined to disclose the current threshold, the latest NPL findings show the system can generate false positives for Black women nearly 100 times more often than for white women at certain configurations.

When releasing the findings, the Home Office said the tests showed that “in a limited set of circumstances, the algorithm is more likely to incorrectly include some demographic groups in search results.”

NPCC documents note that while the raised threshold “significantly reduces the impact of bias across race, age, and gender”, it also “had a significant negative impact on operational effectiveness”, with forces saying that “a once effective tactic returned results of limited benefit”.

The government has now launched a ten-week consultation on expanding facial recognition use.
Policing minister Sarah Jones called it “the biggest breakthrough since DNA matching”.

But experts raised concerns.
Prof Pete Fussey, former independent reviewer of the Met’s facial recognition system, questioned police priorities, saying: “Convenience is a weak justification for overriding fundamental rights.”

Abimbola Johnson, chair of the independent oversight board for the police race action plan, criticised the lack of discussion around facial recognition despite “clear links” to racial equality concerns. She said policing’s repeated commitments to anti-racism are not being reflected in policy decisions.

A Home Office spokesperson responded: “We take these findings seriously and have already taken action. A new algorithm with no statistically significant bias has been independently tested and procured. It will be trialled early next year with full evaluation. Public protection remains our priority. Every stage of the process involves human oversight.”