Disinformation 2.0: Should we bring the notion of propaganda back into public discourse?  · Global Voices
Olga Solovyeva

Image by Ameya Nagarajan
Disinformation is a buzzword and a source of significant disruption and conflict across the globe. Since the Donald Trump presidency, with its accusations of “fake news,” disrupted elections and inauthentic comments flooding social media discussions have become a commonplace, if irritating, part of online reality. Notably, throughout the COVID-19 pandemic, the spread of misinformation added to the burden on health services, as conspiracy theories spread well beyond the anti-vaccination movement. Governments, activists and scholars invest significant human and financial resources in studying the phenomenon of disinformation and finding ways to tackle it.
There are several arguments currently circulating about what disinformation is and how it works. The first set concerns the way social media and other IT platforms operate. Broadly, these arguments hold that social media algorithms favour extreme, divisive and emotionally charged content, which often leads to radicalisation and fuels more misinformation. In the same vein, the specifics of data management create opportunities for targeted information campaigns: access to large accumulated sets of personal information allows actors to analyse human behaviour and target different religious, ethnic and racial groups. Such data is widely available for sale, both in the open and on the black market. Promoted posts and paid ads are tools of targeted communication available across different IT platforms, including search engines, and they have a significant impact when applied to political information.
Another set of arguments is often less vocal, but likely more important. The global decline of the public sphere in the information society is raising levels of mistrust in governments and institutions, which are now unable to provide complete and unbiased information about social processes. What Leah Lievrouw, professor in the Department of Information Studies at the University of California, Los Angeles, observed for the Pew Research Center's report on the Future of Truth and Misinformation Online back in 2017 is still relevant today:
So many players and interests see online information as a uniquely powerful shaper of individual action and public opinion in ways that serve their economic or political interests (marketing, politics, education, scientific controversies, community identity and solidarity, behavioral ‘nudging,’ etc.). These very diverse players would likely oppose (or try to subvert) technological or policy interventions or other attempts to insure the quality, and especially the disinterestedness, of information.
At the same time, watchdogs such as journalists and the non-profit sector may not serve the public interest, being instead tied to the commodification of information in the information economy. The lack of institutionalised deliberation forums pushes people towards social media, which has become an almost exclusive platform for participating in social and political life, as well as a place that fosters social polarization, xenophobia and the spread of false news.
Yet it is not only the problems of the civic context that may amplify the spread of disinformation, but also citizens’ own role in consuming and sharing information. Research on the psychology of misinformation highlights that people, on average, are neither incapable of distinguishing false information from genuine, nor eager to share misleading information. One driver of the spread of misinformation that scholars acknowledge is the phenomenon of “lazy thinking” — the pattern of thinking that demands less cognitive effort. People tend to follow their emotions and intuition when consuming and sharing content, and make their decisions based not only on the content itself but also on the metadata of the media item, such as the assumed authenticity of the author or the number of engagements.
The disinformation apocalypse today is driven by a mixture of these factors: the degrading political context, the lack of effort from IT platforms and simple human nature. There is strong concern that a complex misinformation campaign succeeding on all three fronts could encourage a negative attitude change among the public. This is despite the fact that IT professionals dealing with online influence campaigns or marketing report that, on average, targeted advertising in political and marketing communication has only a small effect on targeted groups. In other words, a targeted advertising campaign can reinforce a person's already established attitude, but it is not enough to radically change the way an individual thinks. In their book Network Propaganda, Benkler, Faris and Roberts suggest that while observing data manipulation and the online spread of disinformation is important, it is equally important to measure its actual impact: in their analysis, Russian disinformation efforts had very little direct effect, though they did exploit existing conflicts within American society.
The recent spike in disinformation around the war in Ukraine highlights an interesting pattern. While the Ukrainian information campaign appears successful in delivering a strong and influential agenda in Western countries, Russian disinformation is targeting the rest of the world, including the BRICS countries, Asia and Africa. In contrast to Ukrainian narratives, which aim to raise awareness of war crimes or to demonstrate the strength of Ukrainian resistance, the Russian disinformation campaign is run in a scattered way, spreading messages that implicitly aim to resonate with existing attitudes in the targeted population.
As Carl Miller, director of the Centre for the Analysis of Social Media at the Demos think tank in London, claims in his piece for The Atlantic:
Disinformation campaigns are far more effective when they have a powerful truth at their core and use that truth to guide discussion. The blunt reality is that in many parts of the world, antipathy for the West is deep and sympathy for Russia is real. It is in these contexts where I’d expect influence operations to be targeted—and to work.
His team's recent research, analysing a corpus of posts spread with the hashtags #istandwithputin and #istandwithrussia, highlights the circulation of narratives of Western hypocrisy, NATO expansionism and BRICS solidarity in the selected regions.
People have asked for some more examples of the messaging. Whilst i don't like to amplify, it is important to show the rhetorical positioning that's being used here. pic.twitter.com/SL2UZjc6KA
— Carl Miller (@carljackmiller) March 19, 2022
Although Russia’s digital disinformation strategy is driven and delivered by modern IT tools, it is, in essence, similar to classic propaganda techniques. The aim is to legitimise certain narratives by injecting them into the media ecosystem and then repeating them until they become the new common sense for the population. Internet technology amplifies this by allowing the production of fake content that mimics reality and appears credible to the inexperienced or inattentive.
Despite the visible efficiency of disinformation, the audience must be primed before an attitude change can take place. This priming can result from what Jacques Ellul conceptualised back in the 1960s as pre-propaganda — the conditioning of minds with vast amounts of incoherent information, already dispensed for ulterior purposes and posing as “facts” and as “education,” which “without direct or noticeable aggression is limited to creating ambiguities, reducing prejudices, and spreading images, apparently without purpose.” Pre-propaganda has essentially become the underlying goal of the Kremlin's geopolitical information campaigns, as Russian trolls adapt Cold War techniques: from the disruption of the 2016 US elections to the case built to justify the Russian invasion of Ukraine.
The explanatory power of the term “pre-propaganda” becomes more evident when compared with the overused term “disinformation.” Many information manipulation campaigns are becoming more sophisticated and dispersed, aiming to have a psychological effect on the audience by creating an alternate picture of reality. Disinformation is just one of the elements producing an informational impact on the societal imagination, which is structured around dominant narratives and ideas.
Returning to the case of Russia, the reaction of the general public, which wholeheartedly supported the invasion despite holding strongly negative attitudes towards war, must not be neglected. This is strong evidence of domestic propaganda that used a great deal of disinformation to shape public attitudes over time. In these cases, and in the cases to come, focusing on disinformation per se and targeting fake news may not be enough to prevent political disruption and conflict.