
Artificial intelligence fuels conspiracy theories on right and left

“Look closely,” Allison McSorely says in an Instagram Reel, drawing attention to a detail of the tie worn by US President Donald Trump at the Charlie Kirk memorial service in Glendale, Arizona. A photo from Getty Images shows diagonal stripes, while another photo from the Associated Press does not.

“This shouldn’t happen,” McSorely says in the video. “It is time to ask why media outlets are flooding TV and social media with photos showing clear signs of AI manipulation without explanation.”

Allison McSorely points out inconsistencies between photos at the Charlie Kirk memorial (Image: Instagram)

McSorely runs several social media pages that purport to focus on internet security. The accounts are dedicated to teaching users how to identify AI-generated content, and their infographic-style posts are reminiscent of popular posts in liberal and leftist spaces.

But it’s almost certain that the photos at the Charlie Kirk memorial were not produced or manipulated by artificial intelligence. AP and Getty have strict criteria for editorial images, and generative AI is never allowed.

Charles-McClintock Wilson is a Utah-based professional news photographer for Zuma Press. Wilson was on the scene at Utah Valley University when Kirk was assassinated and took some of the last photographs of him. He says the difference between the photos in McSorely’s video could be explained in a variety of ways that don’t involve generative AI.

“The tie color difference between AP and Getty Images can seem suspicious, especially at an emotionally charged moment like Charlie Kirk’s memorial service. From a professional standpoint, I don’t believe the images are AI-generated or deliberately manipulated. The difference is explainable: red tones shift in mixed lighting, and editorial workflows vary. Getty tends to favor high contrast and saturation, while AP tends to be neutral. Add white balance, angle and post-processing and you get the difference.

“But even when technically explainable, visual inconsistency triggers distrust. That’s the deeper problem. In the post-AI environment, audiences are ready to question authenticity. [McSorely’s video] may not be accurate, but the suspicion it manufactures reflects a real rupture in which institutions are less unified and visual media can no longer guarantee trust.


“Generative AI did not create distrust. It just accelerated it. In our editorial work, we have always struggled with doubt: was the photo staged? Was the moment real? But now even real images are questioned at first glance. The boundary between documentation and simulation has blurred, and audiences are primed to doubt.”

“As a news photographer, I see this every day. Light changes, editorial tone, and post-processing can create inconsistencies that feel artificial, even if they aren’t. AI hasn’t replaced reality, but it has changed how reality is perceived. That’s the disconnect: not in the image itself, but in the public’s relationship to it.”

At the time of writing, McSorely’s video had 6,496 likes on Instagram. McSorely did not respond to our request for comment.

As social media consumption increases, misinformation and disinformation spread further and faster than ever before. Partly as a result, trust in institutions, particularly governments and the fourth estate, has fallen significantly. During the COVID-19 lockdowns of 2020 and 2021, conspiratorial ideologies, many originating in the US, exploded into the mainstream, and tens of thousands of Australians protested pandemic restrictions with signs and rhetoric referencing theories spread on social media.

An anti-immigration protester wearing a gas mask confronts police in the Melbourne CBD on August 31, 2025

Conspiracism in Australia has grown rapidly since then, and some currents of thought have developed into social movements in their own right, the best known being the sovereign citizen movement and QAnon. These radical and largely unfounded ideologies have made the long journey across the Pacific and no longer exist solely online.

A declassified Australian Federal Police document notes that, as of 2023, the sovereign citizen (SovCit) movement has no clear structure, often using public social media to recruit members before moving to encrypted, less regulated online spaces when deplatformed by social media companies. The document argues that the government’s response to the pandemic and influence from the US and Europe have “reinvigorated” the movement, and that its pseudo-legal ideas have spread throughout Australia by “finding common ground with anti-vax groups, conspiracy groups and right-wing extremists”.

Conspiratorial thought has often been associated with the political right, but traditionally liberal and leftist spaces appear to be increasingly embracing it. This phenomenon, derisively referred to as BlueAnon, saw a sharp rise after the assassination attempt on now-President Trump in 2024. According to a Morning Consult poll, one in three Biden supporters believed the attack was staged as a false flag to help Trump return to the White House. A Wired report described tens of thousands of posts on X (formerly Twitter) in the days after the 2024 US presidential election questioning the results and claiming foul play without evidence.


Conspiratorial worldviews considered “right-wing” have spread globally and continue to do so. As counterpart theories develop on the other side of politics, they are likely to spread just as widely, and in some cases already are.

McSorely’s content, which accuses many videos and photos of being doctored or created with artificial intelligence tools, is reminiscent of the reptilian shapeshifter conspiracy theories that were popular with a seemingly very different audience in the late 2000s and 2010s. In both cases, hyper-analyzed news and celebrity images yield so-called inconsistencies everywhere.

Awareness that photos, videos and audio can be fabricated or altered with artificial intelligence has changed the way media is discussed in recent years. The continued advancement of such tools makes it increasingly difficult to distinguish fake content from genuine material, and concerns about their use to create misleading and falsified information are well-founded.

RMIT’s ADM+S Centre released a report exploring how journalists and news organizations use artificial intelligence, and how audiences interpret that use. Most of those interviewed said the use of AI in news production should be regulated and that they wanted rules implemented. Eighty-five percent of the sample said there is a need for transparency around AI-generated content: disclosing how generative AI was used allows readers to form their own opinions on the material. Concern about artificial intelligence in journalism is compounded by ever-advancing technology that fuels the spread of conspiracy theories online. In President Trump’s almost incomprehensible words while admiring a Tesla in front of the White House in March (which, of course, instantly became a meme): “Everything is computer.”

It would be wise to question what exactly is real. But in an age where photo, video and audio processing and creation tools are increasingly advanced, that answer may be harder to find. In fact, with the normalization of mistrust and disbelief across all political lines, the answer to this question may not even matter to a growing group of people.
