Crikey responds to AI claim about article published on March 18

Yesterday, we published an article from a contributor who later confirmed they had used AI in aspects of its production.
This is against our editorial policies. As a result, we have removed the story and the three previous stories in the series.
This came to our attention when we were alerted to a post by AI expert Toby Walsh, who suspected AI had been used to write the article. After reviewing the claim, we contacted the author, who confirmed they had used ChatGPT in producing the article. They cooperated fully, providing drafts and notes, and explained that generative AI was not used to write their articles wholesale, but rather to sanity-check their work, make corrections, check spelling, suggest alternative subheadings and, in some cases, suggest better wording.
Our editorial rules prohibit the use of artificial intelligence, so we removed the stories. Although our editorial guidelines are publicly available online, we did not provide the author with a copy before the article was submitted, which we should have done.
Speaking more generally about the use of artificial intelligence in journalism, readers will know we have been working hard on this topic. We have banned the use of generative AI by our reporters and contributors (or, as we put it: “until AI can produce a headline as sharp or beautiful as ‘Pissing in the Sink’, we will never use it to write our articles”) and added the following clause to our editorial rules:
“The use of generative AI to write, sample, or otherwise produce Crikey’s work is generally prohibited. You can read our full rationale here. Its use may be permitted, and will be disclosed to readers, where editorially necessary, such as when writing about the technology. We also ask that outside contributors refrain from using generative AI tools in their work; Crikey reserves the right to refuse such work.
Crikey may use other AI and machine-learning technologies to assist with editorial processes, such as transcribing audio or analysing datasets, where appropriate.”
Even as the technology becomes more widely adopted, we remain committed to not using generative AI to write or produce our stories. Why? Because we promised our readers that Crikey’s original, human-made journalism will always be worth supporting.
We are still learning how to make sure that is the case. We need to be clearer with new contributors about these expectations, including where AI may be used in a limited capacity. We also need better safeguards so we can catch issues like this before a story is published.
The broader question of artificial intelligence in journalism is yet another challenge that small outlets like ours, or any media outlet for that matter, face in maintaining the trust of our readers. We hope that explaining clearly what our mistake was, how it happened and what we learned from it will help maintain that trust.


