Watchdogs call on big tech to ‘urgently’ do more to protect children online

Two UK watchdogs have made an urgent appeal to social media companies to strengthen their age-checking processes, saying they must “take immediate action” to keep children safe online.

In an open letter, the Information Commissioner’s Office (ICO) and Ofcom, the UK’s data protection and communications regulators, accused tech platforms such as Meta, Snap and TikTok of “failing to put children’s safety at the heart of their products”.

The statement comes after protesters gathered at Meta’s headquarters in London on Wednesday, accusing the social media giant of designing “addictive” and “dangerous” algorithms that damaged their mental health during adolescence. In the US, the platform is facing legal action over similar allegations.

Tech companies must do more to stop children using social media platforms, regulators say (AFP/Getty)

In the letter, sent to Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube, the platforms are asked to explain before the end of April what measures they have taken on age checks and the protection of children’s personal data.

Regulators warn that children can easily get around current age limits and say there is “no excuse” for the lack of effective age checks. ICO chief executive Paul Arnold said most services relied on self-declaration to determine whether children were aged 13 or over, a method that could easily be bypassed and was therefore ineffective.

“There is growing public concern that the status quo is not working and the industry needs to do more to protect children,” he added.

Tech firms are increasingly under fire for failing to strengthen protections for children online; critics warn that they often leave young people “vulnerable to harm and exploitation”.

A recent study found British children were potentially exposed to guns, self-harm, misogyny and sexting within minutes of creating social media profiles, while some parents have said they believe the “addictive” design of social media played a part in their children’s suicides.

Ofcom said children were “regularly exposed to risks” without appropriate protections, such as strict age checks.

A recent study found that British children are potentially exposed to guns, self-harm, misogyny and sexting within minutes of creating social media profiles. (PA)

Ofcom chief executive Dame Melanie Dawes said: “These online services are household names, but they don’t put children’s safety at the heart of their products. There’s a gap between what tech companies promise and what they actually do to keep children safe on their platforms.

“Without the right protections, such as effective age checks, children are routinely exposed to risks associated with services they did not choose and cannot realistically avoid. This needs to change quickly, or Ofcom will take action.”

Ofcom said it would report publicly on responses from the platforms it contacted in May, as well as publishing new research into how much, or how little, children’s online experiences have changed in the first year of the Online Safety Act coming into force.

The regulator said that if it was not satisfied with the platforms’ responses, “we would be prepared to take enforcement action”, and that it could consider strengthening regulatory requirements under existing industry rules “to enable further change”.

In February, the ICO fined Reddit more than £14 million for allegedly failing to protect children using its platform. The watchdog’s investigation found that Reddit failed to check the age of users on its platform, putting children at risk.

The government has launched a consultation on whether to ban social media for under-16s, but on Monday MPs ruled out an outright ban, instead backing the government’s proposal to give ministers additional, more flexible powers once the consultation concludes.

A Roblox spokesperson said: “Roblox is fiercely committed to security and we are in regular dialogue with Ofcom about how we protect our community of players. In the last year alone we have introduced more than 140 new security features, including the introduction of new mandatory age checks that all players must complete to access chat features on Roblox. These age controls are designed to limit communication between adults and children and ensure players can only chat with others of a similar age, addressing a key concern highlighted by Ofcom today.

“Age checks are just one of a number of security measures on Roblox, including age-appropriate content restrictions, chat monitoring and 24/7 content moderation. While no system is perfect, we continue to strengthen protections designed to keep players safe and look forward to demonstrating our efforts in our ongoing dialogue with Ofcom.”

Meta, Snap, TikTok and YouTube have been contacted for comment.
