TikTok ‘recommends sexual content and porn to children’, says report

Angus Crawford, BBC News
According to a new report by a human rights campaign group, TikTok's algorithm recommends pornography and highly sexualised content to children's accounts.
The researchers created fake children's accounts and turned on safety settings, but still received sexually explicit search suggestions.
The recommended search terms led to sexualised material, including explicit videos of penetrative sex.
The platform says it is committed to safe and age-appropriate experiences and that it took action as soon as it became aware of the problem.
In late July and early August this year, researchers from the campaign group Global Witness created four accounts on TikTok posing as 13-year-olds.
They used false birth dates and were not asked to provide any other information to confirm their identity.
Pornography
They also turned on the platform's "restricted mode", which TikTok says prevents users from seeing "mature or complex themes, such as sexually suggestive content".
Without performing any searches, the researchers found sexualised search terms being recommended in the "you may like" section of the app.
These search terms led to content of women masturbating.
Other videos showed women flashing their underwear in public places or exposing their breasts.
At the most extreme, the content included explicit pornographic films of penetrative sex.
These videos were embedded in otherwise innocent content in a successful attempt to evade content moderation.
Ava Lee from Global Witness said the findings came as a "huge shock" to researchers.
"TikTok isn't just failing to prevent children from accessing inappropriate content, it's suggesting it to them as soon as they create an account."
Global Witness is a campaign group that usually investigates how big tech affects discussions about human rights, democracy and climate change.
Researchers stumbled upon this problem while carrying out other research in April this year.
Videos Removed
They reported it to TikTok, which said it had taken immediate action to resolve the problem.
However, in July and August this year, the campaign group repeated the exercise and once again found the app suggesting sexual content.
TikTok says it has more than 50 features designed to keep young people safe: "We are fully committed to providing safe and age-appropriate experiences."
The app says it removes nine out of 10 videos that violate its guidelines before they are ever viewed.
When informed of the findings by Global Witness, TikTok said it had "removed content that violates our policies and launched improvements to our search suggestion feature".
Child Codes
On 25 July this year, the Children's Codes of the Online Safety Act came into force, imposing a legal duty to protect children online.
Platforms must now use "highly effective age assurance" to prevent children from seeing pornography. They must also adjust their algorithms to block content that promotes self-harm, suicide or eating disorders.
Global Witness carried out its second research project after the Children's Codes came into force.
"Everyone agrees that we need to keep children safe online."
During the research, the team also observed how other users reacted to the sexualised search terms they were being recommended.
One commenter wrote: "Can someone explain what is going on with my searches pls?"
Another asked: "What's wrong with this app?"




