
Workshopping trust and speech at EDMO

It was a great pleasure to convene a workshop at the European Digital Media Observatory today featuring Claire Wardle (Brown), Craig Matasick (OECD), Daniela Stockmann (Hertie), Kalypso Nicolaidis (Oxford), Lisa Ginsborg (EUI), Emma Briant (Bard) and (briefly) Alicia Wanless (Carnegie Endowment for International Peace). The title was “Information flows and institutional reputation: leveraging social trust in times of crisis” and the far-ranging discussion touched on disinformation, trust vs. trustworthiness, different models of content moderation, institutional design, preemptive red-teaming of policies, algorithmic amplification, and the successes and limits of multi-stakeholder frameworks. A very productive meeting, with more to come in the near future on this front.

Future publishing on disinformation

My abstract for a chapter entitled “Censorship Choices and the Legitimacy Challenge: Leveraging Institutional Trustworthiness in Crisis Situations” has been accepted for the volume Defending Democracy in the Digital Age, edited by Scott Shackelford (Indiana University) et al., to appear with Cambridge UP in 2024.

In other news, I am writing a book review of the very interesting grassroots study by Francesca Tripodi entitled The Propagandists’ Playbook: How Conservative Elites Manipulate Search and Threaten Democracy (Yale UP) for the Italian journal Etnografia e Ricerca Qualitativa.

Russian pre-electoral disinformation in Italy

An interesting blog post by the Institute for Strategic Dialogue discusses Russian propaganda in the run-up to the recent Italian general elections.

Basically, the study identifies 500 Twitter accounts that were super-sharers of Russian propaganda in Italian and plots their sentiment toward party politics, the conflict in Ukraine, and health/pandemic-response policy during the electoral campaign. This is not, therefore, a network of coordinated inauthentic behavior, but rather bona fide consumption of Russian propaganda.
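For readers curious about what this kind of analysis involves in practice, here is a minimal, purely illustrative Python sketch of the general approach: count how often each account shares links to a watchlist of propaganda domains, keep the heaviest sharers, and summarize their sentiment by topic. The domain list, column names, thresholds, and toy data are my own assumptions for the sketch, not details taken from the ISD post.

```python
# Illustrative sketch of a "super-sharer" analysis (not the ISD methodology itself).
import pandas as pd

# Each row: one tweet by one account, with the domain it links to, a topic tag,
# and a sentiment score in [-1, 1] produced by whatever classifier one prefers.
tweets = pd.DataFrame(
    {
        "account": ["a1", "a1", "a2", "a3", "a3", "a3"],
        "domain": ["rt.com", "example.it", "rt.com", "rt.com", "rt.com", "sputniknews.com"],
        "topic": ["ukraine", "parties", "health", "ukraine", "parties", "health"],
        "sentiment": [0.4, -0.2, -0.7, 0.1, -0.5, -0.9],
    }
)

PROPAGANDA_DOMAINS = {"rt.com", "sputniknews.com"}  # assumed watchlist
TOP_N = 2  # the ISD post works with 500 accounts; 2 keeps the toy example small

# 1. Count shares of watchlisted domains per account and keep the top sharers.
shares = (
    tweets[tweets["domain"].isin(PROPAGANDA_DOMAINS)]
    .groupby("account")
    .size()
    .nlargest(TOP_N)
)
super_sharers = shares.index

# 2. Average sentiment by topic for the super-sharer cohort.
profile = (
    tweets[tweets["account"].isin(super_sharers)]
    .pivot_table(index="topic", values="sentiment", aggfunc="mean")
)
print(profile)
```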

There are some interesting takeaways from the data, the main one being the catalyst function of coverage of the Covid-19 response: a significant proportion of users in the group began sharing content from Russian propaganda websites in the context of vaccine hesitancy and resistance to public health measures such as the “green pass”, and then stayed on for Ukraine and Italian election news.

What remains unclear, however, is the extent of the influence in question. The examples given of Kremlin-friendly messages hardly suggest viewpoints without grassroots support in the country: it is fairly easy, for instance, to find the same arguments voiced by mainstream news outlets without any suspicion of collusion with Russia. Nor does the analysis of candidate valence support the conclusion of a successful misinformation campaign: the eventual winner of the election, Giorgia Meloni, comes in for much the same opprobrium as the liberal establishment Partito Democratico, while the two major parties portrayed in a positive light, Matteo Salvini’s Lega and the 5 Star Movement, were punished at the polls. Perhaps the feature of the group’s politics most attuned to the mood of the electorate was a generalized skepticism of the entire process: #iononvoto (#IDontVote) was a prominent hashtag among these users, and in the end more than a third of eligible voters did just that on September 25th (turnout was down 9% from the 2018 elections). But, again, antipolitical sentiment has deep roots in Italian political culture, well beyond what can be ascribed to Russian meddling.

In the end, faced with the evidence presented by the ISD study, one is left with some doubt regarding the direction of causation: were RT and the other Kremlin-friendly outlets steering the political beliefs of users, and thus influencing Italian public discourse, or were they merely providing content in agreement with what these users already believed, in order to increase their readership?

Geopolitical splintering, decentralization, impartiality

Meta and Twitter have discovered and dismantled a network of coordinated inauthentic behavior spreading pro-US (and anti-China/Russia/Iran) narratives in Central Asia and the Middle East (Al Jazeera, Axios stories). Undoubtedly, this kind of intervention bolsters the platforms’ image as neutral purveyors of information and entertainment, determined to enforce the rules of the game no matter what the ideological flavor of the transgression may be. In a way, paradoxically, such impartiality may even play well in Washington, where the companies would certainly welcome the support, given the current unfavorable political climate.

The type of universalism on display in this instance harks back to an earlier era of the internet, the techno-libertarian heyday of the 1990s. Arguably, however, that early globalist vision of the World Wide Web has already been eviscerated at the infrastructural level, with the growth of distinctive national versions of online life, in a long-term process that has only been made more visible by the conflict in Ukraine. Hence, the impartiality and universality of Meta and Twitter can be seen ultimately as an internal claim by and for the West, since users in countries like Russia, China, or Iran are unable to access these platforms in the first place. Of course, geopolitical splintering was one of the ills the web3 movement set out to counter. How far decentralization can resist the prevailing ideological headwinds, however, is increasingly unclear. Imperfect universalisms will have to suffice for the foreseeable future.

Disinformation isn’t Destiny

As the war in Ukraine enters its sixth week, it may prove helpful to look back on an early assessment of the informational sphere of the conflict, the snapshot taken by Maria Giovanna Sessa of the EU DisinfoLab on March 14th.

Sessa summed up her findings succinctly:

Strategy-wise, malign actors mainly produce entirely fabricated content, while the most recurrent tactic to disinform is the use of decontextualised photos and videos, followed by content manipulation (doctored image or false subtitles). As evidence of the high level of polarisation, the same narratives have been exploited to serve either pro-Ukrainian or pro-Russian messages.

This general picture, by most accounts, largely holds half a month later. The styles of disinformation campaigns have not morphed significantly, although (as Sessa predicted) there has been a shift toward weaponizing the refugee angle of the crisis.

Most observers have been struck overall by the failure of the Russians to replicate previous informational successes. The significant resources allotted from the very beginning of the conflict to fact-checking and debunking by a series of actors, public and private, in Western countries are part of the explanation for this outcome. More broadly, however, it may be the case that Russian tactics in this arena have lost the advantage of surprise, so that as the informational sphere becomes more central to strategic power competition, relative capabilities revert to the mean of the general balance of power.