Ofcom to speed up decision on rules requiring tech companies to block illegal intimate images online

Ofcom is fast-tracking its decision on new rules requiring tech companies to block illegal intimate images online, citing the “urgent need for better online protection for women and girls”.
The regulator has previously recommended that sites and apps implement “hash matching” technology to detect and remove non-consensual intimate content, including deepfakes.
Originally planned for later this year, Ofcom’s final decision on these measures has now been brought forward to May.
Any new measures under the Online Safety Act are expected to come into force this summer.
Decisions on other online safety recommendations, such as how quickly tech companies should respond to spikes in harmful content, will follow in the autumn.

Another proposed measure would make livestreaming safer for children by preventing harmful interactions and stopping abuse.
Elena Michael, a campaigner with the group #NotYourPorn, said the announcement was “incredibly welcome” but that she wanted to see how the measures would work in practice.
She added: “Up to this point we have had a singular focus on criminalising the original perpetrator who creates the harm, but the nature of the internet means that once an image or video is created and shared, many other actors play a role in facilitating and proliferating the harm, meaning it is shared and re-shared many times.
“So actually going after an individual perpetrator is not enough.”
It comes as tech firms face strict new regulations, with the government preparing an amendment to the Crime and Policing Bill.
The bill would require intimate images shared without consent to be removed from the internet within 48 hours of being reported.
Failure to comply could result in significant fines or services being blocked in the UK.
Sir Keir Starmer said it was the latest step in his “21st century fight against violence against women and girls” online and vowed to put tech firms “on notice”.