Iranwire – The bizarre QAnon conspiracy theory flourished online throughout the pandemic, with proponents co-opting Covid-19 disinformation to fuel an ever-changing narrative.
Followers of this baseless conspiracy theory believe a cabal of global elites is secretly running an international child trafficking ring. Adherents have been linked to a number of violent incidents, including January’s deadly attack on the US Capitol.
But research by the non-profit Soufan Center, founded by Lebanese-American former FBI agent Ali Soufan, reveals these bogus ideas aren’t just being pushed by domestic conspiracy theorists. Its data analysis found that a significant proportion of QAnon-related Facebook posts were being amplified by foreign actors, including Russia, China and the Islamic Republic of Iran.
What is QAnon?
QAnon adherents, who come from all walks of life, believe a secret “deep state” of elites are actually cannibals involved in a child trafficking ring. Their “evidence” for these claims often comes in the form of cryptic social media messages.
Although the conspiracy theory hinges on this supposed “cabal”, numerous adjacent beliefs feed into its ever-evolving narrative. As Covid-19 spread throughout the world last year, disinformation about the pandemic became a major theme in the group’s conversations. Followers are known for sharing anti-vaccine and anti-mask disinformation, or even disputing whether Covid-19 is real.
Soufan Center senior research fellow Jason Blazakis told Health Studio: “These were always issues that we knew, anecdotally, were of interest to the QAnon community.” But, he said, this disinformation can also resonate with people outside of QAnon groups, representing a “really significant public health challenge”.
What Role Do Foreign Actors Play?
Although QAnon beliefs tend to centre on US-based platforms, research shows that many posts peddling the conspiracy theory have been amplified by actors linked to Russia, China, Iran and Saudi Arabia. This activity peaked around the time of major events in 2020, including the outbreak of coronavirus in the US and the US presidential election.
The Soufan Center and content science company Limbik analysed 166,820 QAnon-related Facebook posts from January 2020 to February 2021 to try to gauge their origins. They used an artificial intelligence model to determine how likely it was that a post had been foreign-influenced, using language analysis, location data and metadata.
Around 19 percent of posts in the period were thought to be foreign-influenced, rising to a higher proportion during major global events. In 2020, 44 percent of “foreign-influenced” posts appeared to have come from administrators in Russia and 42 percent from China. But from January 2021 to the end of February, the proportion linked to China grew to 58 percent.
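The study’s published method amounts to classifying each post as foreign-influenced or not, then tallying flagged posts by inferred origin. The actual Soufan Center/Limbik model is not public; the sketch below is a hypothetical, much-simplified illustration of that tallying step, assuming each post already carries a classifier-produced flag and origin label (the field names are invented for illustration).

```python
from collections import Counter

# Toy dataset standing in for classifier output: each post has a
# foreign-influence flag and, if flagged, an inferred country of origin.
posts = [
    {"id": 1, "foreign_influenced": True,  "origin": "Russia"},
    {"id": 2, "foreign_influenced": True,  "origin": "China"},
    {"id": 3, "foreign_influenced": False, "origin": None},
    {"id": 4, "foreign_influenced": True,  "origin": "Iran"},
]

# Overall share of posts flagged as foreign-influenced
# (around 19 percent in the study itself).
flagged = [p for p in posts if p["foreign_influenced"]]
share_flagged = len(flagged) / len(posts)

# Breakdown of flagged posts by inferred origin, analogous to the
# Russia/China/Iran percentages reported above.
by_origin = Counter(p["origin"] for p in flagged)

print(f"{share_flagged:.0%} flagged; breakdown: {dict(by_origin)}")
```

With real data the same two aggregations would yield the proportions quoted in the article; only the upstream classifier (language analysis, location data, metadata) is the hard part.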
“It makes a lot of sense to use QAnon as a tool to create a lot of havoc,” Blazakis told Health Studio. “It creates a distraction. It creates a lack of harmony within the country that can benefit powers that perceive the US to be an adversary.”
At the same time, a significant proportion – 13 percent – of QAnon conspiracy theory posts thought to be foreign-influenced were linked to Iran. In 2021, that proportion rose to one-fifth, something Blazakis says may reflect the state’s views on the new US administration.
Iran was previously blamed for sharing disinformation linked to the “Proud Boys” neo-fascist men’s rights organisation around the time of last year’s presidential election, targeting voters with threatening emails to influence the outcome.
Although Facebook has made efforts to quash the spread of QAnon-related disinformation, the Soufan Center argues that the social media giant has considerable work to do.
But while governments and social media companies have a big role in removing this kind of content, Blazakis says it’s also important for social media users to understand how to spot disinformation online.
Like many Americans, he sees his contacts sharing content on social media they may not even realise is QAnon-related. It’s highly unlikely, he adds, that they know when this material is being amplified by a pro-Russian bot farm either.