INSS Insight No. 1173, June 10, 2019

Over the past two years, there has been widespread concern in democracies throughout the world over foreign efforts – mainly by Russia – to influence election campaigns, including a series of elections held over the past several months. This article assesses the lessons of those elections in the context of the coming Israeli elections in September 2019. It appears that following counter-measures mounted by media giants and by the countries themselves, attempted foreign interference on social media via bots has declined. At the same time, there is a discernible increase in human activity aimed at influencing election results. Furthermore, influence campaigns through the mainstream media are regaining a significant role, alongside campaigns over instant-messaging apps, whose messages enjoy a greater sense of credibility. Meanwhile, the two core challenges already evident in previous campaigns remain – difficulty in distinguishing external from internal influence campaigns, and difficulty in measuring their impact – with no significant progress toward a response.
The past several months have seen foreign attempts, attributed primarily to Russia, to interfere in sovereign states’ elections. The presidential election held in Ukraine on March 31, 2019 was marked by widespread disinformation efforts by Russia, which since 2004 has used Ukraine as a laboratory for testing disinformation tactics. Moscow’s purpose was to deepen rifts in the country and undermine public trust in the democratic process. At the outset of the election campaign, Russia found it difficult to activate its standard networks of influence bots, due to the curbs imposed by Facebook on fake accounts. It therefore paid Ukrainian citizens for access to their accounts, in order to disseminate ads through them in support of pro-Russian candidates, as well as to publish false or incriminating information about the presidential candidates opposed to Moscow. The fake news that was disseminated dealt mainly with matters at the heart of the election: corruption among elected officials, quality of life in the country, and assessments of Ukraine's progress since the Euromaidan protests against pro-Russian rule, which erupted in 2013 after the then-president of Ukraine refused to sign an association agreement with the European Union.
Finland, too, has been a target of Russian influence campaigns ever since it gained independence, and since Russia's annexation of the Crimean peninsula in 2014, its concern about becoming the next target has grown. In 2015, given a rise in Russian-aligned accounts spreading disinformation, the Finnish government enlisted American experts to advise its civil servants on how to spot fake accounts, understand how their content spreads, and develop a counter-strategy. As the April 14, 2019 election approached, the Finnish justice minister announced that given the real danger of foreign interference in the election, adequate preparations were necessary. The main narrative that Russia sought to advance in Finland in the context of the election was distrust of European Union institutions and perhaps even Finland's separation from the EU. Another prominent target of fake news was the country's immigration policy.
In mid-March, approximately one month before the parliamentary and presidential elections in Indonesia, home to the world's largest Muslim population, the authorities faced a wave of cyberattacks by Russian and Chinese hackers aimed at infiltrating the voter roster and using social media to influence the election outcome. The latter effort involved a deluge of fake information designed to stoke social and political tensions through anti-Chinese as well as extremist Islamic messaging. Candidates were not the only targets of the fake information; so too was the country’s General Elections Commission, which was accused of maintaining 17.5 million fake voter profiles and of receiving a shipment of millions of ballots from China pre-marked in favor of the incumbent president, Joko Widodo – claims that fueled allegations of a rigged election after the results were made public.
Russia has also recently stepped up its efforts to interfere in elections throughout Africa, where election campaigns tend to be marked by violence that is easily intensified by false information. In this context, operatives were deployed to the continent in 2018 to set up an infrastructure for Russian interference in various countries where elections were due to be held through 2020 – including Nigeria. False information is not a new phenomenon in that country, but it reached unprecedented heights ahead of the election held on February 23, 2019. The Russian operatives worked under the direct orchestration of Yevgeny Prigozhin, a confidant of President Vladimir Putin who was recently indicted by US Special Counsel Robert Mueller for organizing various acts of Russian interference in the 2016 United States presidential election. In keeping with the Russian modus operandi, the operatives were instructed to exploit the deepening political polarization in the country and stir up religious, political, and ethnic tensions. To that end, bot-reinforced fake accounts posing as Nigerian citizens were established, and fake op-eds were published in the mainstream press. The bogus information touched on many issues – ranging from reports that incumbent President Muhammadu Buhari had died and been replaced by a Sudanese clone, to rumors that Kim Jong-un was interested in recolonizing Nigeria.
Additional Prigozhin-directed Russian activity in Africa was recently exposed in the context of the May 8 election in South Africa. Before the election, a disinformation campaign was launched against the country's two large opposition parties, which are considered pro-Western. Planted Russian operatives worked as "political analysts" with the Association for Free Research and International Cooperation, which essentially serves as Prigozhin's operations center in Africa. The main tactics they employed were public rhetoric, the production and dissemination of video clips presenting the ruling party in a positive light, and cooperation with journalists.
Given Russia's attempts to sway the results of the October 2017 referendum in Spain on the question of Catalan independence, there was heightened concern that similar attempts would be mounted on the eve of the country's general election on April 28. That election constituted a testing ground for the media giants, as it took place roughly a month before the European Parliament election and was an opportunity to check the effectiveness of the defensive measures already in place to block attempted foreign influence over social media. Yet despite the many defensive efforts, millions of citizens were still exposed to WhatsApp messages claiming that incumbent Prime Minister Pedro Sanchez had signed an agreement granting Catalonia independence. Such reports were disseminated by Russian accounts, albeit on a limited scale, as well as by far right parties. A significant portion of the messages were also anti-LGBT and anti-immigrant in nature. More limited activity, originating with far right movements, was discerned on Facebook.
Russian attempts to influence political discourse in Australia were already evident between 2015 and 2017, following Australia's harsh response to Russia's alleged involvement in the downing of a Malaysian airliner over Ukraine in July 2014. During this period, many messages were planted on social media with the aim of undermining support for the ruling party. Noteworthy during the country's recent May 18 election, however, was a peak in fake and divisive information disseminated over social media, especially by far right groups. This included the spread of false reports on Facebook and Twitter about a putative plan by the opposition Labor Party to impose a "death tax" on inheritance – reports that also appeared in the mainstream media. False reports about immigration to the country were spread as well. The dispute over this issue intensified following the March massacre in New Zealand carried out by an Islamophobic Australian citizen.
The European Parliament election in late May was also held amid concerns about attempted Russian influence campaigns via troll and bot activity on social media, and the media giants, in collaboration with EU institutions, took precautions to ensure maximum protection of online discourse. However, most of the Russian activity in this case was carried out through traditional media. Thus, for example, the Sputnik news agency dedicated extensive coverage to the Yellow Vest protests that shook France. As for social media activity, a number of accounts found to be linked mainly to far right parties disseminated messages meant to deepen distrust of EU institutions and centrist parties, and called for limiting Muslim immigration to Europe. Britain, for example, saw the circulation of a sensationalist post claiming that "millions of Muslims want sharia law"; in Poland, reports were spread that immigrant taxi drivers were raping European women; and in France it was rumored that Notre Dame had been set ablaze by a Muslim terror group. These messages echo those disseminated by traditional Russian media, which makes it difficult to point with certainty to a Russian hand in this context. That said, the possibility cannot be ruled out, as the Kremlin has a long tradition of supporting far right parties throughout Europe with a view to undermining the EU from within.
Conclusion
It is clear that in the wake of Russia's attempted intervention in the 2016 United States presidential election, as well as in several European countries, global awareness of the possibility of foreign influence campaigns in elections has grown, resulting in increased preparedness to deflect such efforts. Throughout the world, preparations have been made to stop the phenomenon by setting up ad hoc task forces, calling on the public to exercise critical thinking, and cooperating with the media giants (which for their part have addressed the matter relatively intensively). Still, it is clear that foreign intervention attempts have continued, with a significant change in their modus operandi. While most of the influence campaigns were apparently carried out by Russia, the Russian model is liable to inspire other players and countries – for example, China and Iran – as already appears to be the case. As the next Israeli election approaches, the changes in the Russian model should be noted and the appropriate lessons learned.
While there has indeed been a noteworthy decrease in bot activity on social media, there has also been a marked increase in the use of human influence agents to infuse online discourse with divisive content. Similarly, it appears that traditional media channels – RT and Sputnik, for example – are experiencing a renaissance and again serving as major platforms for Russian disinformation activity. In addition, influence campaigns operate to a great degree over instant-messaging apps like WhatsApp, which allow more efficient dissemination of disinformation than social media. On these closed platforms, information is relayed within relatively limited circles of friends and family, which lends the messages a sense of credibility. Furthermore, given the end-to-end encryption technology typical of these platforms, even the platforms' operators do not have access to the messages unless a user reports specific content as problematic. These characteristics make it difficult to monitor and remove false information.
Despite the significant advances made by various democracies and platforms in recognizing and safeguarding against the threat of foreign election interference, two main challenges remain in fighting the phenomenon: the difficulty of tracing the source of information and determining whether it reflects foreign influence or legitimate domestic activity, given that internal and external players frequently echo each other; and the difficulty of gauging the actual success of foreign influence campaigns in swaying political discourse. Developing a response to these challenges will constitute a significant step in fighting the phenomenon and is likely to reduce it measurably.