Meta found in breach of EU law over ‘ineffective’ complaints system for flagging illegal content

The European Commission said Instagram and Facebook were violating EU law by not providing simple ways to report or flag illegal content, including child sexual abuse material and terrorist content.
Meta, the $1.8 trillion California company that operates the social media services, imposed unnecessary steps on users trying to submit reports, the EU’s executive body said in a preliminary finding on Friday.
Both platforms were found to use deceptive interface designs, known as “dark patterns”, in their reporting mechanisms, which can be “confusing and discouraging” for users.
The commission found this amounted to a breach of the company’s obligations under the EU-wide Digital Services Act (DSA) and meant that “Meta’s mechanisms for flagging and removing illegal content may be ineffective”. Meta denies breaking the law.
“In the case of Meta, neither Facebook nor Instagram appear to provide a user-friendly and easily accessible ‘report and action’ mechanism for users to flag illegal content such as child sexual abuse material and terrorist content,” the commission said.
A senior EU official said the case was not just about illegal content, but also about freedom of expression and “moderation gone too far”. In the past, Facebook has been accused of “shadowbanning” users on topics such as Palestine, meaning their content is demoted by the algorithm.
The official said the current complaints mechanisms were “very difficult for users to go through”, which not only rendered them ineffective but also deterred users from coming forward.
Campaigners continued to claim there were safety shortcomings in some of Meta’s products. Last month, the Meta whistleblower Arturo Béjar published research that he said showed many of the new safety tools rolled out on Instagram were ineffective and that children under 13 were not safe on the platform.
Meta dismissed the report’s findings and said parents had robust tools at their disposal. The company introduced mandatory teen accounts on Instagram in September 2024 and said this month it would adopt a version of the PG-13 film rating system to give parents stronger controls over teenagers’ use of the platform.
The commission also noted that Meta makes it difficult for users whose content has been removed or whose accounts have been suspended to challenge those decisions. It found that the appeals mechanism did not allow users to provide explanations or evidence to substantiate their objections, limiting its effectiveness.
The commission said simplifying the complaints system would also help platforms tackle fake news, such as a deepfake video circulated in Ireland that claimed the leading presidential candidate, Catherine Connolly, had withdrawn from Friday’s election.
The ongoing investigation was carried out in collaboration with Coimisiún na Meán, the Irish digital services coordinator responsible for regulating the platforms, which have their EU headquarters in Dublin.
The commission also made a preliminary finding that TikTok and Meta breached their obligation to give researchers adequate access to public data, which could be used to check the extent to which minors are exposed to illegal or harmful content. Researchers are often left with partial or unreliable data, it said.
“Allowing researchers to access platform data is a fundamental transparency obligation under the DSA, as it enables public scrutiny of the potential impact of platforms on our physical and mental health,” the commission said.
The preliminary findings give the platforms time to comply with the commission’s demands. If they fail to do so, they face fines of up to 6% of their total annual worldwide turnover, as well as periodic penalty payments to enforce compliance.
Henna Virkkunen, the commission’s executive vice-president for tech sovereignty, security and democracy, said: “Our democracies are based on trust. This means platforms must empower users, respect their rights and open their systems to scrutiny.
“The DSA makes this a duty, not a choice. With today’s actions, we have published preliminary findings on researchers’ access to data at four platforms. We are ensuring that platforms are accountable to users and society for their services, as guaranteed by EU law.”
A Meta spokesperson said: “We disagree with any suggestion that we have breached the DSA, and we continue to negotiate with the European Commission on these matters. Since the DSA came into force in the European Union, we have made changes to our content reporting options, appeals process and data access tools, and we are confident these solutions match what is required under the law in the EU.”
TikTok said it was not possible to fully share data on its platform with researchers without violating separate GDPR data protection rules.
“TikTok is committed to transparency and values the contribution of researchers to our platform and the wider industry,” a spokesperson said. “We have made significant investments in data sharing and to date close to 1,000 research teams have had access to data through our research tools.
“We are reviewing the European Commission’s findings, but the requirements to relax data protection measures put the DSA and the GDPR in direct tension.”
The company called on regulators to “provide clarity on how these obligations should be reconciled.”




