Irish police investigating 200 reports of Grok child abuse material

There are around 200 open investigations into content shared on the social network X, Irish politicians have been told.
Irish politicians met with Irish police and other experts on Wednesday for an Oireachtas Media Committee hearing that largely addressed growing concerns about the spread of CSAM and other AI-generated sexualized material on the social network.
They were told that An Garda Síochána, the Irish police force, was investigating approximately 200 reports relating to the platform that implicated CSAM.
Detective Chief Superintendent Barry Walsh said: “We have received reports and content referrals regarding the platform in question (X) which is under investigation.
“The investigation process takes some time, the content needs to be assessed to ensure it is criminal and then, if possible, the individuals responsible need to be identified and the investigative action starts from there.
“The investigation process will then begin, which may result in a variety of different actions, such as the execution of arrest warrants, interviewing those responsible, bringing charges before the court, or other steps as directed by the Director of Public Prosecutions.”
He added: “As of this morning, there are 200 reports under investigation of child sexual abuse material or content that is indicative of child sexual abuse material.”
The senior officer said the reports all related to Grok.
Mr Walsh said gardaí were in contact with the Irish media regulator, Coimisiún na Meán, regarding AI-generated CSAM.
He said gardaí believed that existing legislation allowed them to investigate the material, and that they had not yet encountered anything that would prevent them from doing so.
Mr Walsh said he wanted to reassure the public that the reports were “treated with the utmost seriousness” and were fully investigated.
In his written submission he said: “I would encourage any person who may be a victim of these crimes to contact your local Garda station where you will be provided with access to the wide range of specialist help and support available.
“Victims of private image exploitation also have the option to report online via Hotline.ie.”
Mr Walsh said recent comments had focused on “one AI model in particular” but the reality was that there was a “conceptual possibility” that other AI models could be trained to create this type of content.
He called for a “robust response” from AI service providers to ensure their models are not manipulated to create content that is “both illegal and extremely harmful to affected individuals.”
Mr Walsh said it was a minimum expectation that online service providers ensure material distributed on their platforms is suitable for the audiences receiving it and is effectively vetted, but it was clear that this was not currently the case.
The officer attached to the Garda National Cyber Crime Bureau said levels of CSAM produced and distributed online were increasing.
Mr Walsh said gardaí had made proactive efforts to locate CSAM online but were mainly dealing with referrals made through the National Center for Missing and Exploited Children, a US non-profit organisation.
He said referrals are increasing on a yearly basis, with 13,300 in 2024 and nearly 25,000 in 2025.




