Met complaints over image-based sexual abuse more than doubled in the last five years

Reports to the Metropolitan Police of image-based sexual abuse and apparent deepfakes have more than doubled in the past five years, new figures show, as the Home Office promises a crackdown on offenders.
The Metropolitan Police has seen a 120 per cent increase in complaints about the non-consensual misuse of intimate images (NCII), according to freedom of information data obtained by online safety provider Verifymy.
The rapid rise of AI-powered “nudification” tools used to remove people’s clothing in photos is making it easier to create and share realistic, sexually explicit images without consent, the security provider has warned.
According to the force's complaints data, around 1,766 NCII complaints were made in Greater London last year, more than double the 805 recorded in 2020 and a rise of around 16 per cent on the 1,523 logged the previous year.
The findings come months after Ofcom launched an investigation into how the Grok AI chatbot on X was used to create and share sexual deepfakes of real people, including children.
AI-generated child sexual abuse videos rose more than 260-fold in 2025 compared with the previous year, the Internet Watch Foundation said last week.
The watchdog warned that video models, nudification apps, subscription platforms and other artificial intelligence systems enable criminals to produce and distribute illegal content, allowing them to manipulate images of real children and simulate explicit conversations with child characters.
Under the new Crime and Policing Bill, which is currently in the final stages of the legislative process, social media platforms will be required to remove reported non-consensual intimate images within 48 hours. Those that fail to do so risk hefty fines or having their services blocked in the UK. Nudification tools used to create AI deepfakes will also be banned under the new rules.

The time limit for victims to report NCII offences will also be extended from the current six months to three years.
While the Crime and Policing Bill looks set to clamp down on NCII abuse, experts warn that platforms need to do more to combat the growing harm.
Emma Robert-Tissot, head of partnerships at Verifymy, said: “In the age of hyper-realistic rendering, everyone should have control over how their identity is used and represented online. The consent management that underpins this is no longer a technicality, but a fundamental right.
“While content moderation plays an important role, it cannot identify all forms of non-consensual intimate image exploitation, especially as synthetic content becomes more advanced. Platforms therefore need to take a more holistic approach, combining prevention, consent and detection, to effectively combat this growing harm.”

Commenting on the FOI findings, a Met Police spokesperson said tech firms should design their platforms to prevent NCII abuse.
“Non-consensual intimate image (NCII) exploitation can have a devastating and lasting impact on victims. The online world is changing rapidly and reporting of such crimes has increased significantly over the past five years,” they said.
“We continue to strengthen our response to technology-enabled abuse by supporting specialist teams and investing in new technology. This includes technology that allows officers to review large volumes of messages and an NCII toolkit that provides vital information about what this abuse looks like and its impact on victims to enable police to improve their response.
“As we use technology to support victims and work with our safeguarding partners, we continue to call on technology companies to design out these abuse methods.”
A government spokesperson said: “Sharing or creating intimate images without consent is a despicable crime, and we are taking immediate action to tackle this growing problem.
“We have made the creation of intimate images without consent an offence punishable by up to six months in prison, and we are banning AI tools that produce deepfake sexual images of people without consent; developers and suppliers will face up to three years in prison.”
