
Facebook, Compromised: How Russia Manipulated U.S. Voters

Information Warfare: Russia’s “Active Measures”

Information warfare is by no means a new concept. However, the broad reach of social media has created an entirely new and highly effective avenue for Russian ‘active measures’ to penetrate and influence the minds of the American public. Active measures “employ a three-pronged approach that attempts to shape foreign policy…state-to-people, people-to-people, and state-to-state…The Russian government today uses the state-to-people and people-to-people approaches on social media and the internet.”


This is the second installment in a series examining Russian information warfare, the use of social media, and the US election. Part one, Cold War 2.0: Russian Information Warfare, introduces the information warfare concept and its role in cyberspace.

Part three, The Trump Campaign’s Exploitation of Social Media, explains how the campaign benefited from Twitter bots, trolls, and microtargeted Facebook messages.

The final installment, Cambridge Analytica: the Darker Side of Big Data, investigates the involvement of an ethically dubious “election management” firm in the 2016 presidential election.


According to researchers who conducted a post-mortem of social media activity during the election using internet analytics tools, Russian information warfare content on social media attempts to subvert Western democracies in five ways: undermining public confidence in democratic government, exacerbating internal political divisions, eroding trust in government, pushing the Russian agenda among foreign populations, and creating confusion and distrust by blurring fact and fiction. Russian propaganda on social media can be divided into four themes: political messages intended to foster distrust in government (e.g. allegations of voter fraud and corruption), financial propaganda (intended to create distrust in Western financial institutions), social issues (e.g. ethnic tensions, police brutality), and doomsday-style conspiracy theories.
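As a purely illustrative sketch of the kind of analysis such post-mortems rely on, the snippet below tags posts with the four propaganda themes using simple keyword matching. The keyword lists, function name, and sample post are hypothetical and are not drawn from the cited researchers’ tooling or data.

```python
# Minimal, hypothetical sketch of keyword-based theme tagging for posts.
# The theme lexicons and the sample post are invented for illustration.

THEME_KEYWORDS = {
    "political_distrust": {"voter fraud", "rigged", "corruption"},
    "financial": {"dollar collapse", "bank failure", "bailout"},
    "social_issues": {"police brutality", "ethnic tension", "riot"},
    "conspiracy": {"false flag", "deep state", "end times"},
}

def tag_themes(post: str) -> list[str]:
    """Return every propaganda theme whose keywords appear in the post."""
    text = post.lower()
    return [theme for theme, words in THEME_KEYWORDS.items()
            if any(word in text for word in words)]

if __name__ == "__main__":
    sample = "BREAKING: voter fraud and corruption uncovered before the election!"
    print(tag_themes(sample))  # ['political_distrust']
```

Real analytics pipelines are of course far more sophisticated, but the underlying task of sorting a stream of posts into recurring themes follows this general pattern.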

Information warfare content is generated and disseminated through channels that fall into three attribution categories: white (overt), grey (less overt), and black (covert). These channels propagate a blend of authentic, manipulated, and fabricated stories, and they feed off of and reinforce one another.

White (overt) channels include state-sponsored pro-Russian news outlets such as Sputnik and RT; grey (less overt) channels include data dump sites such as WikiLeaks; and the more sinister black channels involve covert operations such as hacking. The agents disseminating the information include bots (automated web robots) and real people, often presenting themselves as innocuous news aggregators. These agents form the key engine for distributing misinformation and disinformation.

Black or covert measures—once highly risky and dangerous to execute—are now carried out easily and efficiently through social media. Russia can now remotely coordinate an army of hackers, honeypots (in this instance, social media profiles used to bait other users into revealing compromising or embarrassing information), and hecklers or internet trolls (individuals who deliberately create discord or provoke).

The Role of Non-State Cyber Hackers: Advanced Persistent Threat Groups

Cyber hacking groups—or advanced persistent threat (APT) groups—are a critical component of the Kremlin’s information operations. It is precisely the difficulty of definitively proving their ties to the Russian government that makes them valuable to the Kremlin. And while there isn’t necessarily a ‘smoking gun,’ evidence gleaned from previous cyber attacks has allowed top US intelligence agencies to conclude, with a high degree of confidence, that the Kremlin was involved.

That evidence includes “the facts that the hackers’ work hours aligned with Moscow’s time zone, operations ceased on Russian holidays, their techniques carried signatures common to other Russian hacks, and their targets were of clear interest to Moscow.” In the social media realm, hackers provide the fodder for the disinformation and misinformation narratives. “The most notorious Russian-linked hacker…Guccifer 2.0, targets current and former U.S. government officials, American security experts, and media personalities by seeking access to their private communications and records,” and whatever information comes to light then surfaces in the propaganda that is created and disseminated.
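The work-hours evidence is, at bottom, a statistical inference over activity timestamps. The sketch below is a hedged illustration of that idea rather than the agencies’ actual method: it scores candidate UTC offsets by how much of an actor’s activity falls within ordinary business hours. The timestamps and the 09:00 to 18:00 working-day assumption are invented for illustration.

```python
# Hypothetical sketch: infer the most plausible UTC offset for an actor from
# activity timestamps, assuming most activity happens during local business
# hours (09:00-18:00). The timestamps below are invented for illustration.
from datetime import datetime, timedelta, timezone

activity_utc = [
    datetime(2016, 6, 14, 6, 12, tzinfo=timezone.utc),   # 09:12 in Moscow (UTC+3)
    datetime(2016, 6, 14, 10, 45, tzinfo=timezone.utc),  # 13:45 in Moscow
    datetime(2016, 6, 15, 7, 30, tzinfo=timezone.utc),
    datetime(2016, 6, 15, 13, 5, tzinfo=timezone.utc),
]

def business_hours_fraction(timestamps, utc_offset_hours):
    """Fraction of events falling between 09:00 and 18:00 local time."""
    local = [t.astimezone(timezone(timedelta(hours=utc_offset_hours)))
             for t in timestamps]
    return sum(9 <= t.hour < 18 for t in local) / len(local)

# Score every whole-hour offset and report the most business-hours-consistent one.
scores = {off: business_hours_fraction(activity_utc, off) for off in range(-12, 13)}
best = max(scores, key=scores.get)
print(f"Most business-hours-consistent offset: UTC{best:+d}")
```

A timing signal like this is weak on its own, which is why analysts corroborate it with holiday gaps, tooling signatures, and the pattern of targets.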

Honeypots

Honeypots are fake social media profiles designed to lure real people into engaging with them online: “today’s honeypots may include a component of sexual appeal or attraction, but they just as often appear to be people who share a target’s political views, obscure personal hobbies, or issues related to family history.”

The objective of honeypot accounts is to earn the trust of unsuspecting users in order to conduct a range of activities: disseminating content from white and grey propaganda channels, entrapping users with compromising propositions such as offers of sexual exchanges, persuading targets to click on malicious links, or deceiving them into downloading malware (software intended to damage a computer).

If the target exposed to a malicious link or malware is a person of interest, such as a politician or public figure, APT groups can gain access to personal information and post it on grey channels such as data dump sites. The information revealed in turn helps construct the narrative of misinformation posted on white channels such as RT or Sputnik, which eventually trickles down to conservative news sites such as Breitbart before being picked up by the mainstream media.

Hecklers: Trolls & Troll Farms

Hecklers, or trolls, give life to Russia’s influence operations. There have been reports of “troll farms” employing hundreds of people, formed to actively disseminate pro-Kremlin propaganda. It is important to note that “the information contained in the comments and posts by the trolls ranges from misleading to verifiably fraudulent.” The objective of trolls is not necessarily to defend or validate pro-Russian propaganda, but rather to flood the social media space with such a high volume of misinformation as to create a state of confusion and calamity.

Senator Mark Warner, the ranking Democrat on the Senate Intelligence Committee, has said that “there were upwards of a thousand paid internet trolls working out of a facility in Russia, in effect taking over a series of computers which are then called a botnet, that can then generate news down to specific areas.” The implication is that a sophisticated and coordinated social media disinformation campaign was able to micro-target vulnerable voter populations. These populations were vulnerable because they received their news from social media, which had been powerfully harnessed to manipulate voters in the critical weeks leading up to Election Day.

The Ramifications of a Compromised Social Media Space

Social media, a Western innovation, seems at first glance like an ideal manifestation of a free and open society. Social media platforms enable users to share information, freely express opinions, and connect with other individuals. However, these same platforms were harnessed to wage a full-scale, coordinated information warfare offensive. False articles—“fake news” content—that favored Trump were four times as likely to be shared on social media platforms as false stories favoring Secretary Clinton.

“Fabricated pro-Trump stories were shared four times as often as fabricated pro-Clinton stories…researchers also found that roughly half the readers of a fake news story believed it…automated Twitter accounts, known as “bots,” generated four tweets in favor of Trump for every one in favor of Clinton…a substantial number of these bots were aligned with individuals and organizations supported, and sometimes funded, by the Kremlin.”
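The ratios in that passage reduce to simple arithmetic over labeled share counts. The sketch below shows that calculation with invented numbers; the counts are hypothetical and are not the researchers’ data.

```python
# Minimal sketch of the share-ratio arithmetic behind findings such as
# "shared four times as often". All counts below are invented for illustration.

story_shares = {"fabricated_pro_trump": 30_000_000, "fabricated_pro_clinton": 7_500_000}
bot_tweets = {"pro_trump": 400_000, "pro_clinton": 100_000}

story_ratio = story_shares["fabricated_pro_trump"] / story_shares["fabricated_pro_clinton"]
bot_ratio = bot_tweets["pro_trump"] / bot_tweets["pro_clinton"]

print(f"Fabricated pro-Trump stories were shared {story_ratio:.1f}x as often")  # 4.0x
print(f"Bot tweets favored Trump {bot_ratio:.1f} to 1")                         # 4.0 to 1
```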

Russia utilized generations’ worth of acquired expertise in the art of information warfare and adapted it to social media in a way that was agile, penetrating, and efficient. There is evidence suggesting efforts to suppress voters in key precincts in Pennsylvania, Michigan, and Wisconsin.

These states, which were crucial in determining the winner of the presidential election, were flooded with disinformation in the week leading up to the election. While it is difficult to conclusively demonstrate a causal relationship between the election results and the Russian active measures targeted at these populations, it is highly likely that those measures had an effect, given that all three states had voted Democratic in each of the previous five presidential elections.

That Donald Trump, a fringe candidate with a radical platform, emerged victorious in these historically moderate states raises the question: what variable tipped the election? The penetration of Russian information warfare efforts, made so effective by the successful harnessing of social media, increasingly looks like the culprit. However, the social media-facilitated assault on the democratic process had another devastating angle: the Trump campaign.

