‘Gruesome videos’: social media pushes distressing news to children, experts say

More than half of children who get news from social media become worried and upset after seeing content containing war, violence and death, according to new research that reveals that social media companies “broadcast” sad news to children who are not looking for it.
Research by online safety organization Internet Matters has found that videos of the Charlie Kirk murder, a Liverpool parade car ramming attack, war scenes, shootings, stabbings and car crashes have recently been pushed onto children’s feeds. As a result, 39% of those who saw disturbing content described themselves as being very or extremely upset and concerned by that content.
More than two-thirds of kids get news from social media apps like TikTok and Instagram, but 40% don’t follow news-focused accounts and instead encounter stories through recommendation algorithms. Almost two-thirds (61%) of those who get news from social media have seen a worrying or distressing story in the past month.
“You can see stabbings and kidnappings on TikTok, and it’s not nice to see, especially when you’re a little bit younger, it bothers you,” a 14-year-old girl told researchers.
Another girl, 17, said: “On Instagram, before the videos were taken down, I think there were a lot of scary videos of stabbings or something like that. When Liam Payne died [the One Direction star fell from a hotel balcony] there was a video of it floating around and I thought it wasn’t very nice. I would like a trigger warning.”
“Sometimes you don’t want to see it and you don’t even want to think about it,” one 13-year-old boy said of the news popping up on his feed.
The survey findings reflect a broader trend in social media usage: the proportion of time users spend viewing content posted by their friends is declining across platforms. Meta said in court filings this April that just 8% of an Instagram user’s time is spent viewing friends’ posts, a drop of more than a third since 2023; algorithmically suggested posts increasingly dominate users’ feeds instead, contributing to the so-called “brain rot” phenomenon.
According to the study, which included a survey and focus groups with more than 1,000 children aged 11 to 17, the vast majority (86%) of children do not know how to reset the recommendation algorithm on their social accounts, which keeps serving up distressing material.
“The move away from established news channels is fundamentally changing the way children and young people watch the news, with worrying consequences,” said Rachel Huggins, co-chief executive of Internet Matters, which advises parents and is partly backed by tech companies including TikTok.
“The algorithmic design of social media platforms enables children to see negative and upsetting content [and it is delivering it] to the millions of children and young people who are not looking for this.”
Chi Onwurah, the Labour chair of the House of Commons science and technology committee, said the findings reinforced the conclusion that the Online Safety Act “is not sufficient to protect users”. She called for a stronger regime that “blocks the viral spread of misinformation, regulates generative AI, and places much-needed standards on social media companies.”
Rules to protect children under the Online Safety Act require social media companies to give children age-appropriate experiences, shielding them from content that depicts or encourages serious violence or injury. Younger children are less likely than older children to see news about war and conflict, violence and crime, but Internet Matters said: “Given that many social media platforms’ terms of service specify a minimum age of 13+, it is still worrying that children aged 11-12 are seeing this content.”
TikTok said content showing human blood or extreme physical fights is not eligible for the “for you” feed, and that the platform does not allow gory, gruesome, disturbing or excessively violent content. It said it uses independent verification tools and offers parental controls that let parents block teens from using the app during set times and filter the content they see.
Instagram has been approached for comment.