The explosion of interest in fake news sites on Facebook raises an obvious question that few have explored: Could Russia, in light of accusations that it sponsored hacks of the Democratic National Committee (DNC) and then made the material available to Wikileaks to undermine the Clinton campaign, also be behind propaganda that has promoted Donald Trump and various far-right narratives about the US? Or did the disinformation about the American election come from a mix of domestic political and economic motives without any connection to Russia?
Considering Russia’s recent history of successful online disinformation campaigns in Europe, one can imagine that “troll armies” were tasked to the American front, as they’ve been elsewhere. Russian state-owned media, paid “troll armies,” and hacking collectives held at arm’s length have been doing just this in European countries for almost 2 years now, ever since the Ukrainian Civil War began. Germany, Sweden, Finland, the Czech Republic and Slovakia, Ukraine, France: the list goes on.
Here, we analyze how fake news has become such a problem, talk to experts who suspect and doubt Russia’s involvement in disinformation leading up to the American presidential election, and summarize the evidence we have.
Echo chambers of fake news
The world’s two largest media purveyors, Facebook and Google, are now implicitly acknowledging their algorithms might have pushed disinformation this year that influenced the US presidential election. On Monday, they separately announced that they would block fake news websites from advertising on their networks.
Facebook updated its Audience Network Policy to read, “Don’t integrate or display ads in apps or sites containing content that is illegal, misleading or deceptive, or that promotes . . . fake news or anything that falls within any other categories that are prohibited by the Facebook Community Standards.”
Google also announced a new policy, hoping to starve those websites of the ad revenue that helps them operate. “Moving forward, we will restrict ad serving on pages that misrepresent, misstate, or conceal information about the publisher, the publisher’s content, or the primary purpose of the web property,” Google’s Andrea Faville told Reuters.
This will hardly be of comfort to many people though, as some say the damage has already been done: Facebook allowed disinformation and propaganda to spread across it throughout the election cycle. (Twitter, too.)
Mark Zuckerberg brazenly dismissed the idea as “pretty crazy” several days ago in the face of massive criticism, both from outside his company and from within it, where there is now an unofficial task force of employees trying to figure out what went wrong and what can be done.
Past solutions were shot down for fear of pushback from different constituencies using the site, but full automation has proven to be a disaster. At least the main source of that disinformation this election cycle appears to have a geographic epicenter that can be traced: Macedonia.
A team of teens, as first reported by BuzzFeed, had launched over 140 fake news websites in the run-up to the election including WorldPoliticus.com, TrumpVision365.com, USConservativeToday.com, DonaldTrumpNews.co, and USADailyPolitics.com. The Guardian counts 150 sites being run out of Veles, Macedonia. (According to PRI, they got their start with fake online health news before turning to more lucrative fake political copy.)
Those behind the sites told BuzzFeed they were after ad revenue from American clicks, having learned how to game Facebook’s algorithms and curate thousands of fake profiles at a time to boost the signal. These are the same algorithms that Zuckerberg insists had no impact on the election, yet are somehow effective enough to justify charging publishers, brands, and even political campaigns high fees to reach audiences.
“Yes, the info in the blogs is bad, false, and misleading but the rationale is that ‘if it gets the people to click on it and engage, then use it,’” a source, who spoke on condition of anonymity, told BuzzFeed two weeks ago.
But considering both the scale of the operation and just how much it has in common with other political disinformation campaigns conducted around the world lately, it is not all just profiteering hoaxers or trolls trolling for the sake of it.
The potential evidence Russia is behind fake news in the US is extensive
An exposé back in May by Bloomberg profiled one Andrés Sepúlveda who, working out of Bogota, Colombia, had taken between $12,000 and $20,000 per month in retainer fees from political consultants to launch disinformation campaigns for candidates in Latin American elections.
So when Bloomberg asked Sepúlveda if he thought the US campaign was seeing the same, he said simply, “I’m 100 percent sure it is.”
For many in the US Intelligence Community, this means the Kremlin. The Russian government has conducted similar disinformation campaigns against Western countries with real political consequences.
While the motive reported so far for the sites in Macedonia has been generating click-based revenue, many trolls earn their money not by tricking ad networks, but by getting paid by an actual agency to flood the internet with fake news and run large numbers of fake sock-puppet accounts. They do this both for Russia’s own domestic propaganda efforts and for foreign targets like the US and UK.
As Sweden debated a new military alliance with NATO last year, the country was suddenly flooded with misinformation that NATO would start stockpiling all its nuclear weapons there, or that Sweden would not be able to prevent NATO from using the country as a staging ground to launch a war with Russia.
There were even reports that NATO troops would be immune from prosecution were they to commit crimes against Swedish citizens. Experts and Swedish government officials have pointed the finger at Moscow for poisoning the well of public discourse on the issue.
“What astonished people here is it was playing to a far-right narrative, spreading in the same kind of circles as normal far-right propaganda,” Anders Lindberg of the Swedish left-wing paper Aftonbladet told Geektime. The phenomenon is becoming increasingly apparent even in the public agendas of Russian actors, and it predates anything involving Trump, mainstream conservatives, or the “alt-right” in America.
Lindberg explains that Sweden first started to see a steady flow of questionable information in 2013, around the time the US threatened military intervention against Russia’s ally, Syria. Then it exploded with the conflict in Ukraine. But a number of popular defense bloggers called out the rumors, setting off a public debate about misinformation that was creeping into mainstream media in Stockholm.
The Russian government also engaged in a disinformation campaign to shift the blame for Russian-sponsored rebels shooting down Malaysia Airlines Flight 17 over eastern Ukraine in 2014, putting out one “new” exclusive account after another that purported to explain the crash away as Ukraine’s fault.
It was a scene to watch, as stories changed rapidly, peaking whenever officials from the EU, US, or Ukraine were about to announce a new report on their own ongoing investigation (which, unlike the many different accounts offered to exonerate the pro-Russian rebels, has been consistent).
One Ukrainian website, Stop Fake, emerged in the wake of these incidents to target false information coming both from established news sources and from online rumor mills. In Ukraine’s case, the footprint linking back to Russia was clear, stamped heavily into the discourse by Russian officials and traditional media alongside the trolls.
It gets easier to detect over time
“I am sure that there are a lot of centers, some linked to the state, that are involved in inventing these kinds of fake stories,” former Kremlin information officer Gleb Pavlovsky told The New York Times in August. But saturation occurred rather quickly.
When the Swedish Security Service declassified parts of a report warning about Russian disinformation, Swedish media sites were suddenly hit by a number of DDoS attacks, explains Marcin Andrzej Piotrowski of Switzerland’s Center for Security Studies.
It is now a consensus among many observers that Russia was pushing the misleading information, but the proliferation of hoax news sites would mark a shift in tactics if indeed the Kremlin was directly encouraging or financing these URLs.
“I think this is a new phenomenon. I think if they came with something like that today, no one would believe any of it,” thinks Lindberg, saying Swedes are too aware of Russian disinformation efforts at this point: “We’re two years ahead of the US in that sense. They can destabilize, but won’t be able to spread completely fake stories.”
In fact, he says Russian trolls were more effective on Twitter than on Facebook between 2013 and 2015, something that correlates strongly with the explosion of right-wing bots onto the scene. Pro-Trump trolling on Twitter in 2016 continued this trend, taking it to new heights. Trump’s Twitter mentions were far higher than Clinton’s following their debates, numbers the BBC suggested were “swelled by bots.” Indeed, while both candidates’ accounts had many fake followers, no surprise given their celebrity, the machines accounted for a larger share of Trump’s sphere and actually drowned out the Clinton ones.
From Hungary to Romania
Evidence of Russian fingerprints on media is perhaps better supplied within Europe itself, in operations against specific EU and NATO member states. The Finnish Institute of International Affairs’ Katri Pynnöniemi and András Rácz released a report in May analyzing Russia’s use of disinformation country by country, mainly in Eastern and Northern Europe. They recount Russia’s use of information websites and social media groups, which can reach Western readers with propaganda in ways the Soviets could not in the pre-internet era.
Pynnöniemi told Geektime, “Yes, influence-agents in different parts of the world working directly or indirectly for Russia take advantage of the ‘news sites’ that distribute fake news, etc.”
Rácz told Geektime that, in his view, Russia was “definitely” involved, pointing to Hungary. “Various ‘alternative’ news websites, as well as pages transferring blatant Russian propaganda play an integral role in Russian information warfare efforts vis-a-vis Hungary, particularly because neither Sputnik, nor RT has broadcast in Hungarian. There are both websites and Facebook-groups, several dozens of them.”
He says the most obvious example was Hidfo, which started off as a far-right, nationalist mouthpiece. It then pivoted toward a pro-Russian slant before shifting from a .hu domain to a .ru domain and server. The site was tangled in a “massive scandal” for publishing fake news stories and pictures, including the unfounded charge that Hungary was supplying the Ukrainian Army with tanks to fight pro-Russian rebels. A report by Atlatszo, the Hungarian equivalent of Snopes, recounts that trail of disinformation aimed at stirring up resentment against an intervention that didn’t exist.
“Though exactly measuring the effect of these Facebook and Twitter networks operating in favor of Russia is not easy, from the number of shares and likes one can define that the ‘news’ they publish often reach an audience of several thousand. In a country of less than ten million, this is not an insignificant reach,” Rácz notes.
Russia has been playing off the narratives of euroskepticism prevalent in countries like Greece and Hungary to feed a pro-Russian narrative to audiences willing to listen to things that undermine confidence in the leadership of the European Union. But some countries aren’t seeing as severe an anti-EU trend, such as Romania.
“Given the pro-Western context of the Romanian population and observing the specific impact of social media in the country,” Marcu and Rosu explained, Russian disinformation there has targeted the “online reading public, and especially the Facebook accounts.”
The reason is simple: it spreads fast. With the right combination of incendiary headlines and a bit of a boost, these things fly.
“The mix of information is first published on the hundreds of websites and online blogs, and then propagated through Facebook,” Marcu and Rosu go on, describing a strategy that mirrors the approach taken by fake news websites proliferating on American Facebook feeds, posts which use “bombastic formulations meant to capture the attention of the browsing public. In this manner, the websites propagating the intended disinformation are recording a significant number of hits.”
The real indicator, as argued by Marcu and Rosu, is the extreme anti-NATO (and anti-American) slant of the stories, designed to scare readers into thinking that deeper ties with NATO or Western countries in general make their country more insecure or unable to decide its own fate, as was the case with Sweden.
Why making a strong connection to Russia is still so difficult
Drawing a straight line to Russia in the case of US sites, however, is not a simple task.
“The problem here is that one must differentiate between motives,” Jim Kovpak, who has contributed to and blogged for Stop Fake in the past, explained to Geektime. Kovpak is based in Moscow, where he has written for The Moscow Times, Russia! Magazine, Kyiv Post, and Open Democracy. He also runs the Russia without BS blog, where he routinely tackles Russian propaganda’s effectiveness and the question of why such fake news is so appealing to so many. “Russian state media has certain political goals. American fake news sites are almost certainly going to be profit-driven.”
“Alex Jones (Info Wars), for example, is basically a shady businessman selling snake oil,” Kovpak opines, referring to the conspiracy theorist who has been online for more than a decade and aggressively backed Donald Trump during the campaign. “I also think that there’s a big ego-boost for people like Jones, David Avocado Wolfe, and Food Babe, for example.”
All things considered, “It is certainly possible that there was a Russian connection behind the Macedonian site. Russia has a need to maintain plausible deniability, they definitely have interfered in Balkan countries before, and it also would probably be more cost-effective for them to outsource such an operation.” (Indeed, the hacker Guccifer 2.0, who became a figure in the 2016 US elections, presents himself as Romanian with far less evidence for that identity than for his probable Russian origins.)
“That being said,” Kovpak concludes, “I don’t think we can say the Russians were involved without better evidence.”
The profit motive alone, troll factories aside, is enough to draw in regular individuals looking for easy cash. The Washington Post shows just how lucrative such fakery can be in any context. The Macedonians are earning pennies, really, compared to some operators.
One of the most popular American purveyors of nonsense stories and satires, Paul Horner, estimates he earns $10,000 a month from Google AdSense. Ironically, in light of how widely shared his content was among Trump fans, he professes to detest the man and to write stories as badly as possible in the hopes they will embarrass the people sharing them.
Horner’s main observation about the rise of fake news could explain more about the disinformation — and is a truth we have digested over and over again during the election and since — than Russia’s potential tie. “Nobody fact-checks anything anymore — I mean, that’s how Trump got elected. He just said whatever he wanted, and people believed everything, and when the things he said turned out not to be true, people didn’t care because they’d already accepted it. It’s real scary.”
How does disinformation work anyway?
While specific campaigns are undertaken to link to issues that benefit the organizers directly, like Russia’s scaring public opinion away from backing the overthrow of Bashar al-Assad in Syria or from joining a missile defense shield in Poland, these are limited objectives, and not very different from longstanding influence techniques in the media.
So what, then, would explain a profusion of pieces about hate crimes that didn’t happen, celebrity endorsements that were never made, and murder-suicides of fake people in fake towns?
These have no direct bearing on anything the Russian government does, or wants, in its relations with the US. Or China, or Iran, or any other nation directing propaganda for its own policy ends.
So who benefits from these, how, and why?
In short, confusion is the end goal.
The profusion of such stories, though, as chronicled by Adrian Chen when he went to Russia to report on troll factories (and was himself entrapped in the process, set up with a neo-Nazi in an attempt to discredit his story), is premised on the idea that public opinion in a target country is often too distracted by pressing issues at home, like the economy or immigration, to care all that much about foreign policy.
Therefore, it makes sense to feed those grievances to further divide the country. Fake stories about crime are a big draw here, as are stories of people “disrespecting” some cherished institution like Christmas or the US national anthem.
A divided nation splitting apart at home will, by this logic, be unable to muster political capital to go on foreign adventures, or stop others from doing so. If the issues do somehow connect back to Russian interests directly, like the handling of the Syrian Civil War, then so much the better.
This approach also assumes that since people don’t care about their own country’s foreign policy much, they will care even less about propaganda that tries to raise Russia’s own image up. That kind of boosterism is for domestic consumption, or at most émigré communities, like in the Baltic States. At best, it appeals to the far-right or ultra-left in certain countries, who look to Russia for their own political reasons.
They have almost no real influence on policymaking, though, so they are not giving Russia good value for its fake news money. As Peter Pomerantsev has written on this “menace of unreality,” the real added value in information warfare is in spreading panic, confusion, and, especially, anger across the social media sphere.
All of the divisions fostered serve a purpose to weaken domestic political structures in the target countries. Besides poisoning the well abroad, Russia benefits doubly when the resulting disruption can be sold back home as “proof” of those countries’ systems being chaotic and weak. So while this idea is not new, the internet gives it new reach.
Still, it is difficult to ascertain exactly who is behind fake news sites and bots. Oxford University professor Philip Howard led a team of researchers analyzing bot activity around Trump’s and Clinton’s campaigns, finding that bots generated 33% of pro-Trump traffic and 22% of pro-Clinton traffic, but could not ascertain who created the bots. He told CNN Money, “We don’t know who generates the bots — if it’s the campaigns, supporters, the candidates.”
Ultimately, it is challenging to separate plausible motives for Russian influence from the overwhelming fact that the US has, of its own doing, become extremely divided. Social media has enabled this, with liberal feeds and conservative feeds looking entirely different. And with the monetary incentives from advertising and from sharing articles with like-minded followers on social media, it is also easy to believe that the people behind fake news sites merely want to make a quick buck, get off on the sense of superiority that comes from fooling thousands of people for fun, or have other domestic political agendas.
The Philippines: A potential analogy to the US election
In short, given that domestic grievances exist independent of coordinated disinformation campaigns, it is difficult to measure their impact and to trace them back to Russia. The Philippines, for example, is a pretty good case study of how this content arises in a national context without any foreign inputs.
People got mad, lied online, and trolled all on their own. The big foreign power looking on with hopes that Rodrigo Duterte would win the presidency, China, barely played a role in the contest. Indeed, there isn’t even good evidence that China used its own vast troll armies to take sides in the election, as Russia has been accused of doing, with varying levels of hard evidence, throughout Europe and in the US over the past 3 years. Duterte came to his positions all by himself, over many years, and got a lot of people to go along with him. That China benefitted was incidental, not the product of disinformation sponsored in Beijing’s favor.
All of that stuff was, apparently, farmed out by the new president’s supporters themselves.
What we know and smoking guns
The combination of allegations that Russia hacked DNC emails during the election and solid evidence of Russian disinformation campaigns in Europe means that there may be a formal investigation of a link between Moscow and fake news websites in the US. Donald Trump’s ostensibly favorable attitudes toward Russia, the benefits the Kremlin might reap from a US administration seeking ever-closer ties on Syria, and coy remarks about campaign ties to Wikileaks only add fuel to the fire.
That does not, by any means, prove the Trump campaign had anything to do with these fake news websites, much less a Russian organized effort to promote him. Although Trump’s camp includes figures often associated with pro-Kremlin positions, many others are absolutely hostile to Vladimir Putin, to even the specter of Russian influence expanding anywhere in the world. And it did not take some secret operation for Trump to take unorthodox positions criticizing NATO countries, celebrating Brexit, or changing the mainstream Republican line on Syria and Ukraine.
Or as Kovpak also put it, “Trump is a symptom. Bullshit is the disease,” and it is not uniquely Russian or American. Many foreign operators, linked back to propaganda departments or spy agencies, now occupy the space. Russia, China, Turkey, North and South Korea, Ukraine, the UK, Israel, and even the US have all made use of online disinformation campaigns in the past decade, for ends ranging from the laudable (like counterterrorism) to outright blackmail and intimidation of dissidents.
Past Russian campaigns do mirror tactics seen during the US election: the prevalence of Twitter trolls and/or bots, the use of Facebook to spread alarming and misleading news, and the co-opting of far-right, ultra-left, and “alt-right” news sources to spread a narrative that favors Russian policy or a politician who would favor Moscow.
This is merely an astonishing amount of circumstantial evidence and coincidence, not proof. Yet, while we are still short of better evidence that Moscow facilitated such activity in any way, we are faced with an inescapable possibility that the US has fallen victim to noxious propaganda on an unprecedented scale. Reflecting the opinion of Andrés Sepúlveda in Latin America, Anders Lindberg sees no other conclusion than this is Russia’s hand.
“I think it’s quite obvious. This is how Russia acted in the Baltic states. This is how Russia acted in the Caucasus. This is how Russia acts,” opines Lindberg. “This is their modus operandi. If we look back at this election in five years, I would be surprised if the narrative is not ‘this is the election that Russia tried to influence.'”
Paul Mutter contributed reporting.