Revealed: How Substack makes money from hosting Nazi newsletters

The Guardian’s investigation has revealed that global publishing platform Substack generates revenue from newsletters promoting violent Nazi ideology, white supremacy and antisemitism.
The platform, which is said to have around 50 million users worldwide, allows the public to publish their own articles and charge for premium content. Substack takes about 10% of the revenue generated by newsletters. Nearly 5 million people pay for access to newsletters on its platform.
These include newsletters that openly support racist ideology. One, called NatSocToday, which has 2,800 subscribers, charges $80 (about £60) for an annual subscription, but most of its posts are available for free.
NatSocToday appears to be run by a US-based far-right activist, and his profile picture includes a swastika, the symbol the Nazi party adopted in the 1920s as an emblem of white supremacy. The full name of the Nazi party was the National Socialist German Workers’ Party.
One of his recent posts suggests that Jewish people were responsible for World War II and describes Adolf Hitler as “one of the greatest men of all time.” For the purposes of this investigation, within two hours of subscribing to NatSocToday, Substack’s recommendation algorithm directed the Guardian’s account to 21 other profiles with similar content.
Some of these accounts regularly share and like each other’s posts. Most of them have thousands of followers.
Erika Drexler, a self-described “NS” [national socialist] activist with 241 subscribers, shared posts describing Hitler as her hero and “the most overqualified leader ever.” The account is also believed to be based in the US and charges $150 for an annual subscription.
Ava Wolfe, who has 3,000 subscribers and describes herself as “an archivist of historical articles and videos, particularly related to World War II,” appears to reside in England. Her profile includes swastikas and other Nazi symbols. An annual subscription to her Substack costs £38.
Much of the content Wolfe shares focuses on Holocaust denial. Nearly 6 million Jews died in the Holocaust, but she falsely claimed earlier this month that doctors had determined that “no one was deliberately killed by the Germans” and that “death was due solely to disease and starvation.”
It is unclear whether Drexler and Wolfe used their real identities to publish their material or wrote under pseudonyms.
Another account, Third Reich Literary Archive, with 2,100 subscribers, shared postcards purportedly from a Nazi propaganda rally held in Nuremberg in 1938, a year before the start of World War II. It also charges $80 per year for a premium subscription.
The Guardian’s account was shown separate posts promoting conspiracy theories about Jewish power and influence and suggesting that antisemitism is a myth.
The algorithm also promoted other extremist content, including newsletters about the “great replacement” conspiracy theory (an alleged plot to replace white Europeans with people of other races).
There has been a sharp increase in antisemitism and Islamophobia since the start of the Israel-Gaza war in October 2023. In October last year, two people were killed in an attack on a synagogue in Manchester’s Heaton Park during the Jewish holiday Yom Kippur. In December, 15 people were shot dead during Hanukkah celebrations at Sydney’s Bondi Beach.
Danny Stone, chief executive of the Antisemitism Policy Foundation, said harmful online content often inspired real-life attacks.
Stone cited as examples the racially motivated killing of 10 Black Americans in Buffalo, New York, in 2022; the 2018 synagogue attack in Pittsburgh, Pennsylvania, in which 11 people died; and the 2017 attack on a mosque in Finsbury Park, north London, in which one person died and several were injured.
“People can and are inspired by online harm to do harm in the real world,” he said. “The terrorist who attacked the Heaton Park synagogue did not wake up one morning and decide to kill Jews; he will have become radicalized.
“Algorithmic redirects and the proliferation of harmful material are extremely serious. The Online Safety Act is supposed to address illegal content, but little is being done about so-called legal but harmful content.”
Stone also expressed concern about online disinformation about the Holocaust.
“There has been a decline in attendance at Holocaust commemoration events,” he said. “We know that knowledge is already frighteningly low.
“When the Holocaust is denied, inverted, or diminished through comparison, memory of the Holocaust as a whole fades. As we move further away, and as the number of survivors dwindles, truths can be lost.”
“We must win the battle for this narrative. This online content is extremely damaging because if we fail to learn from the past, we are doomed to repeat it.”
A spokesperson for the Holocaust Education Foundation said: “This type of material, which spreads conspiracy theories and Holocaust denial, praising Hitler and the Nazis, is not new, but its reach is clearly growing. The idea that Substack is profiting from this hateful material and allowing it to be promoted through its own algorithms is disgraceful.”
“We are acutely aware that time is moving us further away from the events of the Holocaust, and the number of eyewitnesses to this history is decreasing. At the same time, antisemitism is increasing; this extremism must be exposed, questioned and destroyed.”
Joani Reid, Labour chair of the all-party parliamentary group against antisemitism, said she planned to write to Substack and Ofcom to ask them to address the Guardian’s findings. She said antisemitism was “spreading with impunity” and getting worse.
“We need to hold these tech companies accountable because this has real-life consequences,” she said. “Jewish people have been complaining about this for years; they say online hate will lead to offline violence, and that’s exactly what’s happening. We need to start taking these things much more seriously.”
Substack was contacted for comment but did not respond.
The platform, which launched in 2017, has previously been criticized for hosting newsletters promoting extremist views. Its co-founder, Hamish McKenzie, defended the decision to host Nazi content in a post of his own on the site in 2023.
“I just want to make it clear that we don’t like Nazis either; I wish no one else had these views,” he wrote. “But some people hold these and other extreme views. Given this, we do not think censorship (including demonetization of posts) will eliminate the problem; in fact, it makes it worse.
“We believe that supporting individual rights and civil liberties while subjecting ideas to open discourse is the best way to defeat the power of bad ideas. We are committed to supporting and protecting free speech, even when it is painful.”
McKenzie also said the site’s content rules “contain narrowly defined prohibitions, including a clause banning incitement to violence.”




