Charity warns AI-generated child sex abuse videos ‘now as lifelike as real footage’

AI-generated videos of child sexual abuse are now so lifelike that they are no longer “distinguishable” from genuine footage, a leading charity has warned.
The Internet Watch Foundation (IWF), which finds child sexual abuse imagery online and helps to remove it, said criminals are producing increasingly realistic and more extreme material, and that the technology could soon be used to create and distribute feature-length films of such material.
Highly realistic videos are no longer limited to short, glitchy clips; they are now commonplace thanks to the technology, with perpetrators using the likenesses of real children to produce videos at scale.
New IWF data released on Friday shows that 1,286 individual AI-generated child sexual abuse videos were discovered in the first half of this year.
In the same period last year, just two such videos were discovered.
A government minister described the figures as “utterly horrific” and said the criminals behind the videos were “as disgusting” as those who pose a threat to children in real life.
The IWF said all of the AI videos it confirmed in 2025 were so convincing that they had to be treated under UK law as if they were genuine footage.
More than 1,000 of the videos were assessed as the most extreme category of imagery, which can depict rape, sexual torture and bestiality.
The data also showed that AI-generated child sexual abuse material was discovered on 210 separate webpages in the first half of this year, compared with 42 webpages in 2024, and that the number of such images found by the charity had risen by 400 per cent.
Each webpage can contain multiple images or videos.
The figures come after the IWF received 291,273 reports of child sexual abuse imagery last year.
The charity called on the government to ensure the safe development and use of AI models by introducing binding regulation that requires the technology’s designers to guard against its misuse.
Derek Ray-Hill, interim chief executive of the IWF, said: “We must do everything we can to prevent a flood of synthetic and partly synthetic content joining the material we are already fighting online.
“I am horrified to see the technology continue to develop at pace, and to see it continue to be exploited in new and disturbing ways.
“As we saw with still images, AI videos of child sexual abuse have now reached the point where they can be indistinguishable from genuine footage.
“The children depicted are often real and recognisable; the harm this material does is real, and the threat it poses threatens to escalate even further.”
Mr Ray-Hill said the government needed to “get a grip” on the issue, because it was now “so easy” for criminals to produce the videos that feature-length AI-generated child sexual abuse films were inevitable.
“The Prime Minister recently pledged that the government would harness technology to create a better future for children. Any delay sets back efforts to safeguard children and to deliver on the government’s commitment on violence against girls.
“Girls feature in almost all of the AI abuse imagery our analysts see. It is clear this is yet another way girls are being targeted and put at risk online.”
The minister said: “These statistics are utterly horrific. Those who commit these crimes are just as disgusting as those who pose a threat to children in real life.
“AI-generated child sexual abuse material is a serious crime, which is why we have introduced two new laws to crack down on this vile material.
“Soon, perpetrators who possess these tools, or manuals that teach them to manipulate legitimate AI tools, will face longer prison sentences, and we will continue to work with regulators to protect more children.”
An anonymous senior analyst at the IWF said the quality of video produced by creators of AI child sexual abuse imagery had come on “leaps and bounds” in the past year.
“The first AI child sexual abuse videos we saw were not sophisticated, but they could still be quite convincing,” he said.
“The first fully synthetic child sexual abuse video we saw, at the beginning of last year, was just a series of jerky images stitched together; it was not convincing.
“But now they have really turned a corner. The quality is high, and the offences depicted are becoming more extreme as the tools improve in their ability to generate video showing two or more people.
“The videos also include sets showing known victims in new scenarios.”
The IWF advised the public to report images and videos of child sexual abuse to the charity anonymously, and only once, including the exact URL where the content is located.
