Saturday, 20 April 2024

Eight Takeaways From Iranian Information Operations

afcea – Russia may have popularized the manipulation of social media to further its own agenda, but it was not the first country to do so, nor will it be the last. A number of other countries are engaging in similar tactics, but so far have flown largely under the radar. The Oxford Internet Institute found that at least 28 countries worldwide are exploiting social media to influence the public opinion of their own or foreign populations.

It isn’t surprising that the number of nations weaponizing social networks to achieve their foreign policy objectives is increasing at an alarming rate. The growth of platforms such as Facebook, Twitter and Instagram has lowered state and nonstate actors’ barrier to entry into any country’s information environment, making digital information operations a low-cost, low-risk, high-impact means of attack.

In the past two years, major social media companies have taken down tens of thousands of inauthentic accounts, pages, channels and groups run by actors linked to Russia and Iran alone. Whereas Russian information campaigns have been scrutinized thoroughly by both experts and the public, Iranian information operations have not received the same level of attention. This has left a glaring gap in understanding Iran’s digital influencing capabilities, which likely match those of Russia-linked actors.

Over the past year, the Atlantic Council’s Digital Forensic Research Lab (DFR Lab) analyzed half a dozen Iran-linked influence campaigns taken down by Facebook and Twitter, exposing a side of Iran’s information operations never seen before. The research revealed eight common threads that demonstrate the depth and breadth of Iran’s covert digital capabilities.

First, the research showed that Iran was an early mover in digital information operations on social media. During the most recent takedown of Facebook pages linked to Iran in January 2019, DFR Lab found that at least 30 out of 97 pages analyzed by the lab had been active since 2014. Of those, two were created in 2010 and another five in 2011. All but two of the pages created between 2010 and 2011 targeted foreign audiences, specifically in Saudi Arabia, Egypt, the United States and Bahrain.

In contrast, although Russia’s Internet Research Agency (IRA) started its activities in 2009, it was not until 2013 that it began targeting audiences outside of Russia. This indicates that Iran was an early mover in this space and was likely targeting foreign countries years before any Russia-linked actors did.

Second, the study revealed that Iran relies on authentic user-generated content in its operations. Much of the material in Iran-linked information operations is recycled content. Rather than creating original material, Iran-linked operatives were taking content produced by domestic actors, such as activist groups in the United States or nongovernmental organizations in the West Bank, and publishing it as their own. As a result, most of the content on social media accounts and pages linked to Iran came from left-leaning activist groups, fringe media and blogs as well as various conspiracy sites.

The recycled content made Iran-linked information operations look more authentic and helped generate higher levels of engagement from their target audiences than the original content Iranian operatives produced.  

A third feature is that, similar to Russia-linked activity, Iranian information operations on social media are multilingual. The pages taken down in the past 12 months posted content in Arabic, Hindi, English, Indonesian, French, Farsi, Hebrew and Spanish. The countries the covert information operations targeted spanned four continents: North America, South America, Europe and Asia.

The vast majority of Iran-linked pages, however, targeted Iran itself, as well as Iran’s two main geopolitical foes, Israel and Saudi Arabia. This is similar to the Russian IRA’s activities, which initially focused exclusively on Russia but later expanded to other countries.

Fourth, Iranian operatives make mistakes. Although most of the content posted on Iran-linked pages was recycled from authentic users and was, therefore, grammatically correct, a fraction of the content was original and featured a number of writing mistakes. For example, a meme in English with two parallel images only made sense if read from right to left, as one would read Farsi or Arabic. In another instance, a page misplaced a dollar sign in the phrase “secret 10$ bribe,” a placement that would have been correct had the phrase been written in Farsi.

Some of the original memes posted on social media pages linked to Iran also missed the tone of the formats they borrowed. One example was the “laughing Republicans” meme, which is normally used to highlight the perceived hypocrisy of the Republican Party. One of the Iran-linked pages targeting the U.S. audience instead used the meme to ask whether there should be a law banning Congress from “borrowing from Social Security again.”

Fifth, the DFR Lab research found that Iran chooses sides. Unlike Russia-linked actors, who chose to target both sides of debates, Iran was more consistent with its messaging. Most Iranian information operations focused on issues important to Iran, namely the conflict in Syria, Israeli statehood, the Israeli-Palestinian conflict, U.S. foreign policy and Saudi Arabia’s war in Yemen. In all of these instances, the messaging was consistent with Iran’s foreign policy stances and shared content supporting Syria’s dictator Bashar al Assad, questioning Israel’s statehood, highlighting Palestinian suffering, and criticizing U.S. President Donald Trump’s foreign policy and Saudi Arabia’s war in Yemen.

In the United States specifically, Iran-linked pages shared anti-Republican, pro-Democrat and liberal content.

A sixth feature of Iranian social media influence is that a significant number of messages overlap with Russian social media content. Some of the narratives spread on social media pages and accounts linked to Iran were similar to those spread and amplified by IRA-linked assets. They focused on the conflict in Syria, U.S. foreign policy and the Israeli-Palestinian conflict. Both IRA- and Iran-linked pages spread pro-Assad and anti-Israeli messaging and criticized the United States on a number of fronts, including its actions in Iraq and Afghanistan. This commonality, however, should not be interpreted as evidence of coordination but rather as the result of an overlap in Tehran’s and Moscow’s foreign policy objectives.

Seventh, the study revealed that Iran targeted the U.S. elections. Although most of the pages linked to Iran focused almost exclusively on issues directly pertinent to Iran, some of the Facebook pages taken down in October 2018 were posting content related to the U.S. midterm elections. Most of it comprised copies of videos that Americans posted encouraging others to get out and vote. Those videos appeared to target Democratic voters and used the example of Trump’s presidency as a reason to go to the polls. This was consistent with the pro-Democrat stance of Iran-linked social media pages targeting American social media users.

With the exception of the midterms, Iran appears to be far less focused on election-related content than the Russia-linked operations.

And eighth, the DFR Lab research showed that Iran relies on artificial amplification to get traction for its information operations. Some of the Iran-linked pages the lab analyzed generated high levels of organic traffic; others had less organic-looking engagement levels and likely relied on artificial amplification, including bots and inorganic engagement farms, to reach their target audiences. The artificial amplification was not limited to one social network and appeared to have been used on both Twitter and Facebook.

This approach closely resembles that of IRA-linked accounts, which also relied on artificial amplification to reach larger audiences.

Information operations have been a tool of warfare for centuries. What has changed are the platforms and the speed at which information and misinformation can spread and, perhaps more importantly, who today’s cyber warriors are.

Today, it can take up to a decade to recognize the threats from enemies using social platforms. Agencies and citizens must act to change this paradigm. First, governments must expand their understanding of the adversaries in this space. Second, they must learn to defend against them. And third, citizens of all nations need to be alert to how their opinions are being shaped by countries that have something to gain, and to how they are being manipulated into sharing that content through their own social media accounts.
