More children seeing violent and degrading pornography online, says commissioner

The proportion of children who say they have seen pornography online has increased in the past two years, according to a report, with most saying they stumbled across it accidentally.
The children’s commissioner, Dame Rachel de Souza, said her research was proof that harmful content is being pushed to children through dangerous algorithms, rather than being sought out.
She described the content young people are seeing as “violent, extreme and degrading”, and often illegal, and said her office’s findings should be seen as “a snapshot of what rock bottom looks like”.
More than half of respondents (58%) reported having seen pornography depicting strangulation as children, while 44% reported seeing a depiction of rape – specifically of someone who was asleep.
The report, based on a survey of 1,020 people aged between 16 and 21, found that on average children were 13 years old when they first saw pornography, that more than a quarter (27%) were 11, and that some were “six or younger”.
The research suggests that four in 10 participants agreed that girls can be persuaded to have sex even if they say no at first, and that young people who watched pornography were more likely to think this way.
The proportion who said they had seen online pornography before turning 18 was higher (70%) than in the previous report by the Children’s Commissioner’s office, conducted in 2023, when 64% of participants said the same.
Men (73%) were more likely than women (65%) to report having seen pornography online.
The majority of children and young people (59%) said they had come across pornography online accidentally – an increase from 38% in 2023.
X, the platform formerly known as Twitter, remained the most common source of pornography for children, with 45% saying they saw it there compared with 35% on dedicated pornography sites – a gap that has widened over the past two years.
Dame Rachel said: “This report should act as a line in the sand. The findings set out how much platforms must change to keep children safe.
“Take, for example, the large number of children who have seen pornography accidentally. That tells us the scale of the problem.”
The research was carried out in May, before the new online safety measures came into force last month, including age checks designed to prevent children from accessing pornography and other harmful content.
Dame Rachel said the measures provide a real opportunity to make children’s safety a non-negotiable priority for everyone: policymakers, big technology firms and smaller tech developers alike.
Some 44% of respondents agreed that “girls might say no at first but then can be convinced to have sex”, while a third (33%) agreed that “some girls do not really mean it when they say no to sex”.
For each statement, young people who had watched pornography were more likely to agree.
The commissioner’s report comes as separate research suggests that dangerous online algorithms were still recommending suicide, self-harm and depression content just weeks before the new online safety measures came into force.
The Molly Rose Foundation, set up by Ian Russell after his 14-year-old daughter Molly took her own life having viewed harmful content on social media, analysed content on Instagram and TikTok in November, using accounts registered as a 15-year-old girl based in the UK.
It found that for teen accounts which had engaged with suicide, self-harm and depression posts, the algorithms continued to bombard them with “a tsunami of harmful content” on Instagram Reels and TikTok’s For You page.
“Eight years after Molly’s death, incredibly harmful suicide, self-harm and depression content remains prevalent on social media,” said Mr Russell, the foundation’s chairman.
The foundation has previously criticised the children’s safety codes for not being strong enough, and said its research showed harmful content was still being “recommended to vulnerable users” and that the codes would ultimately do little to prevent more deaths like Molly’s.
Mr Russell added: “For over a year, this entirely preventable harm has been happening on the prime minister’s watch, and where Ofcom has been timid it is time for him to be strong and bring forward strengthened, life-saving legislation without delay.”
A spokesperson for Meta, which owns Instagram, said: “We disagree with the assertions of this report and the limited methodology behind it.
“Tens of millions of teens are now in Instagram teen accounts, which offer built-in protections limiting who can contact them, the content they see and the time they spend on Instagram.
“We continue to use automated technology to proactively remove content that encourages suicide and self-harm, before it is reported to us.
“We developed teen accounts to help protect teens online, and we continue to work tirelessly to do just that.”
A TikTok spokesperson said: “Teen accounts on TikTok have more than 50 features and settings designed to help them safely express themselves, discover and learn, and parents can further customise more than 20 content and privacy settings through Family Pairing.
“With over 99% of violative content proactively removed by TikTok, the findings do not reflect the real experience of people on our platform, as the report itself acknowledges.”