The Metropolitan Police Service has announced plans to explore the use of artificial intelligence (AI) to improve how officers identify victims of online child sexual abuse and assess harmful content. The move comes as cases continue to rise, placing increasing pressure on investigators and safeguarding teams.
According to the Met, more than 5,400 child sexual abuse offences were investigated over the past year, with over 1,300 children requiring protection. Officials believe AI technology could help detect previously unidentified victims faster and significantly reduce the time it takes to intervene.
Deputy Commissioner Matt Jukes said the adoption of AI could also limit officers’ exposure to highly distressing material. He stressed, however, that human judgement, strict oversight, and victim-focused care will remain central to all investigations.
Faster Identification, Reduced Harm
Currently, police officers must manually review large volumes of abusive material to identify victims and link cases. This content is classified by severity into categories A, B, and C, with Category A the most serious. The Met is now working with technology companies to develop AI tools that can assist in this process while minimizing officers' direct exposure to harmful imagery.
In addition, the force is considering a separate system capable of reviewing and risk-assessing up to 641,000 messages in just 35 minutes—potentially transforming the speed and efficiency of investigations.
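For scale, the quoted figure implies a throughput of roughly 300 messages per second. A minimal back-of-envelope sketch (assuming sustained rather than peak processing):

```python
# Implied throughput of a system that risk-assesses
# 641,000 messages in 35 minutes (figures from the Met's announcement).
messages = 641_000
minutes = 35

per_minute = messages / minutes   # ~18,314 messages per minute
per_second = per_minute / 60      # ~305 messages per second

print(f"{per_minute:,.0f} messages/minute")
print(f"{per_second:,.0f} messages/second")
```

By comparison, a human reviewer working through the same volume at even one message per second would need well over a week of continuous reading, which is the gap the force says such a tool could close.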
Despite the promise of these technologies, the Met emphasized that any AI deployment would operate within strict legal, ethical, and safeguarding frameworks. Specialist officers will continue to make all final decisions.
Ongoing Concerns Over Privacy
The use of AI in policing has sparked debate in recent years, particularly regarding live facial recognition technology. The Met is currently facing a High Court challenge from campaign groups who argue that such systems may violate privacy rights and risk discriminatory use.
While the police maintain that AI tools are essential for tackling modern crime, critics continue to call for clearer regulations and stronger safeguards.
Investment in Victim Support
Alongside its AI plans, the Met has announced a £10 million investment in new Visually Recorded Interview (VRI) suites designed to improve the experience of victims, especially children.
These upgraded facilities aim to provide a calmer, more supportive environment for interviews, offering an alternative to traditional police stations. Plumstead Police Station has been chosen as the pilot location for the initiative, and a total of 23 sites, including Brixton, Holborn, and Bethnal Green, have been selected for refurbishment, six of which are already complete.
London’s Victims’ Commissioner, Andrea Simon, welcomed the development, stating that while improved facilities are a positive step, consistent care and support throughout the justice process remain crucial. She highlighted that many victims withdraw before cases reach a charging decision, underscoring the need for a more compassionate and supportive system.
A Step Toward Modern Policing
As digital crimes evolve, the Met’s exploration of AI reflects a broader effort to modernize policing methods while prioritizing victim protection. However, balancing technological innovation with ethical responsibility and public trust will remain a key challenge in the years ahead.