Telegram faces major Ofcom probe over child sexual abuse concerns

Ofcom has launched a formal investigation to determine whether messaging app Telegram has “failed, or is failing” to tackle child sexual abuse material.
The UK’s online safety regulator launched its investigation after receiving evidence from the Canadian Centre for Child Protection alleging the existence and sharing of such illegal content on Telegram.
Following its own assessment, the UK regulator decided to launch an investigation into Telegram’s possible failures “to comply with its duties in relation to illegal content”.
Under the UK’s Online Safety Act, providers of so-called user-to-user services such as Telegram “are required to assess and reduce the risk of this terrible crime being committed on their platforms.”
Ofcom said firms that fail to do what is expected of them to protect children “will face serious consequences”.
Suzanne Cater, director of enforcement at Ofcom, said: “Child sexual abuse and exploitation causes devastating harm to victims, and ensuring sites and apps tackle it is one of our top priorities.

“That’s why we work so closely with our partners in law enforcement and child protection agencies to identify where these harms occur and hold providers accountable where they fail to meet their obligations.
“There has been undeniable progress, particularly in file-sharing services that are often used to share horrific images of child sexual abuse.
“But this problem extends to major platforms, and teen-focused chat services are too easily used by predators to groom children. These companies need to do more to protect children or they will face serious consequences under the Online Safety Act.”
If failures to comply with the law are found, Ofcom could impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
Ofcom also said it could seek a court order requiring internet service providers to block access to the service in the UK in the most serious cases.
The regulator also announced on Tuesday that it had launched an investigation into whether the providers of chat services Teen Chat and Chat Avenue had “taken appropriate steps to assess and reduce the risk that UK users will encounter illegal content and activities, including grooming”.
The watchdog said its work with child protection agencies had raised concerns about risks to children on the platforms, both of which it said offer chat rooms, private messaging and media-sharing functions.