Children being ‘failed’ by tech companies as sex abuse image crimes rise by almost 10%

Child sexual abuse image crimes recorded by UK police forces rose by almost 10 per cent last year, sparking fresh calls for tech companies to take decisive action to prevent nude images being taken and shared on children’s devices.
Young people continue to be exposed to serious risks of grooming, extortion, online harassment and non-consensual sharing of intimate images, the NSPCC has warned.
The charity’s latest research underlines the persistent threat.
Between 1 April 2024 and 31 March 2025, a total of 36,829 offences involving indecent and prohibited images of children were recorded in the UK.
This alarming figure, obtained from responses to a Freedom of Information request from 42 of 45 UK police forces, represents a significant increase from the 33,886 crimes documented in the previous year.
The government’s strategy to tackle Violence Against Women and Girls (VAWG), published in December, stated the aim was to “make it impossible for children in the UK to take, share or view a nude photo”, and that the Government was “working constructively with companies to make this a reality”.
But the NSPCC said this commitment should be made mandatory, and called on the Government to take action against tech companies if they fail to embed existing technology into children’s phones which prevents nude images being created, shared or viewed.
The charity said these “device-level protections” should be switched on by default, meaning children would automatically be protected while adult users could go through a process to disable the option.
Such technology can intercept a nude image taken, sent or received on a device, and the NSPCC said that because the image was never created or sent in the first place there was nothing to encrypt and this method could stop abuse at the source.
The NSPCC said that of the 10,811 crimes where police forces recorded the platform used by perpetrators, 43 per cent, or 4,615 in total, took place on Snapchat.
Overall, Meta platforms still account for almost a quarter (24 per cent) of all crimes, with 8 per cent occurring on Instagram, 7 per cent on WhatsApp, 5 per cent on Facebook and 4 per cent on Messenger, the charity said.
But the NSPCC said the true extent of abuse children face online remains “hidden” due to end-to-end encryption.
NSPCC chief executive Chris Sherwood said: “Children in the UK are being completely failed by the technology companies that are supposed to protect them online. We cannot continue to bail them out of this situation when we could be doing more to prevent this from happening in the first place.”
He added: “There is already technology available today to stop children from taking, sharing or receiving nude photos. So the question is: What’s stopping them? If they continue to drag their feet, the government must step in and show its power by forcing them to take action.”
Kerry Smith, chief executive of the Internet Watch Foundation, said the data “should be a new wake-up call”, adding: “Compulsory enforcement of on-device protections will protect children from unwanted nude images and being forced to send sexually explicit material.
“We need to see these measures implemented across the board.”
Safeguarding Minister Jess Phillips said the data revealed by the NSPCC was “deeply shocking”.
She added: “This cannot go on without predators being stopped and held to account. We plan to stop them.”
“We are committed to making it impossible for children in the UK to receive, share or view nude images, and have already announced a ban on so-called ‘nudification’ tools to stop the creation and dissemination of abusive images in the first place.
“We will not hesitate to go further until our children are protected from online sexual abuse.”
Earlier this year, it was announced that so-called ‘nudification’ tools would be criminalised under the Crime and Policing Bill currently going through Parliament.
The data comes after two watchdogs last week warned big tech must do more to protect young people online.
Communications regulator Ofcom has given Facebook, Instagram, Snapchat and others until the end of April to explain what action they are taking on age checks and grooming protections.
Alongside Ofcom’s demands, the Information Commissioner’s Office (ICO) has also written to Snapchat, Facebook, Instagram and others, asking them to explain how their age assurance policies keep children safe.
The NSPCC said the Police Service of Northern Ireland and Police Scotland were included in the data, but that the three forces which did not provide figures were Gloucestershire, Hampshire and Thames Valley.
