ABC chair warns ‘extremely autocratic’ views of some AI investors could lead to ‘dangerous and sinister’ outcomes

The ABC chair has warned that AI could be “dangerous and sinister”, given that some of the people funding the technology hold “extremely autocratic” views.
Kim Williams, who has chaired the national broadcaster since March 2024, is a prolific user of AI applications such as ChatGPT, Gemini and Perplexity, and says it is important for people to understand the technology.
“I’m pushing myself to use it and trying to understand it… I don’t pretend to be an expert but I’m certainly actively and passionately interested because this is the next big technology that will change our world,” he told Guardian Australia.
“Like all technology, it’s an extremely useful tool. But it’s a tool, and we shouldn’t treat it the way others do… in a rather undisciplined, romantic way.”
Williams warned that technology could subsume the values of those who create or control it, and said he was concerned about what this could mean for artificial intelligence.
“There are many value structures reflected in AI that can be seen as potentially dangerous and sinister in some ways,” he said.
“Many of those involved in the financing, and even the creation and leadership, of some AI [companies] have unusually harsh views on human organisation and politics, and in some cases… extremely autocratic views, believing that the majority belongs to an anointed minority.”
As someone who believes in democracy and the contest of ideas, Williams said it was “clearly extremely socially dangerous” for people to limit and censor the views of those they disagree with.
“We should not underestimate the potential and power of these technologies – and we are seeing living examples of the technologies in the hands of some governments, where we have real-life evidence of how dangerous this can be.”
Asked whether media outlets including News Corp and the Guardian should sign deals with AI companies, Williams said everyone bore a responsibility to use these technologies with a sense of the public and national interest.
“I’m talking openly about these issues with my colleagues because I think they’re extremely important.”
AI companies have so far failed to secure a text and data mining exemption in Australian copyright law that would allow them to train AI models on creative works without paying a fee. Last month, the Albanese government rejected such an exemption.
Williams, who chaired the Copyright Agency for six years, said people had the right to earn income from their creative work.
“Anything that would compromise that is unpleasant, unacceptable and, as far as I know, against the law,” he said.
“And it should be fully defended and prosecuted by government. People who dedicate their lives to creating works should be paid.”
Williams said AI could be disruptive for entry-level jobs in industries such as accounting and law, but he believes its impact on journalism jobs will be less damaging.
“I think the impact on journalism will probably be much more positive. In fact, it’s much more positive than people think because journalists are smart, forward-thinking people and they will find ways that AI will really make them better and stronger and improve journalism,” he said.
“But boy, in a lot of other areas I think this could destroy jobs.”




