Global Challenges
Issue no. 13 | May 2023
The Global Disinformation Order | Article 3

Interpreting Disinformation

Reading time: 6 min

Writing in the 1920s and 1930s on subjects ranging from comparative history to propaganda and “false news”, Marc Bloch underscored that knowledge is circumscribed by asymmetries in how it is generated. In an age in which political discourse is becoming increasingly strident, Bloch’s emphasis on analytic sophistication as the best line of defence against disinformation retains its relevance.

In his first major manuscript, The Royal Touch: Sacred Monarchy and Scrofula in England and France (Rois et Thaumaturges: étude sur le caractère surnaturel attribué à la puissance royale, particulièrement en France et en Angleterre, 1924), the French historian Marc Bloch historicised the mystical underpinnings of medieval royal authority. The legend was that French and English monarchs – somewhere between the tenth and fifteenth centuries – held divine healing powers. Supposedly, those God-chosen kings only had to touch their subjects in order to cure a number of bodily afflictions, but especially a strain of tuberculosis known then as scrofula. Throngs of sufferers from all classes would come to seek this royal miracle. Apparently, surviving accounts corroborated the cure’s efficacy: “Public opinion was unanimous in affirming that great numbers of sufferers from scrofula had been healed by the kings” (Bloch 2015, 234).

The Royal Touch: Monarchy and Miracles in France and England by Marc Bloch (first edition 1924, first edition in English 1973).

For Bloch, the goal was not primarily to disprove that the mythic healings happened in order to lay bare the superstitious nature of medieval mentalities. Nor was he trying to expose the manipulative will to power of medieval monarchs, eager to consolidate their authority through legerdemain. After all, there was evidence to suggest that the kings often believed in their own powers. Bloch was more interested in the historically rooted reasons why the practice developed, attracted so many ardent believers, survived for so many centuries, and then faded away by the seventeenth century. For Bloch, as a founding member of the Annales school, which aspired to more scientific readings of the past, even miracles had to be taken seriously as historical objects. They were worthy of study not because they were necessarily true, but because they had important sense-making functions in their own epistemic contexts.

In the case of the royal touch, Bloch claimed, the rumoured healing power of kings grew not only from changes in the relationship between ecclesiastical and political authority, but also from the very partial and imperfect medical knowledge of the period. This also helps explain why the royal touch came to be attached to specific diseases like scrofula, which were chronic but went through seemingly “mysterious” periods of remission. Miracles were a source of hope, but they were also a source of explanation for the inscrutable behaviour of the disease itself. Why, then, did people living in the Middle Ages not question the inefficacy of the royal touch for more consistently terminal diseases? One answer is that belief in miracles also implied an acceptance of the mysterious ways of higher powers, and of their prerogative to judge who lived or died. Thus the miracle did not need to work “every time” for faith to crystallise as truth in the public consciousness.

By reaching into this distant past, Bloch offers us the cipher for decoding the transhistorical nature of the phenomenon we have come to call disinformation. The crux of Bloch’s story is not so much that truth is inherently relative and subjective, but more that all knowledge is circumscribed by asymmetries of power, resources, access, lived experiences, psychological or analytical propensities, and pre-existing beliefs. Those asymmetries impact both how knowledge is produced, and also how it is interpreted and consumed. Understanding the dynamics underpinning all manner of “false claims” requires looking as carefully as Bloch did at the phenomenon of the royal touch, taking into account the relevant political, economic, cultural and psychological constellations.

Perhaps unsurprisingly, while Bloch was still writing The Royal Touch, he also wrote a short piece on the problem of false news (Réflexions d’un historien sur les fausses nouvelles de la guerre, 1921).

Réflexions d'un historien sur les fausses nouvelles de la guerre, Marc Bloch, Allia Edition (first edition 1921).

There Bloch stated: “Items of false news, in all the multiplicity of their forms – simple gossip, deceptions, legends – have filled the life of humanity. . . . How do they propagate themselves, gaining strength as they pass from mouth to mouth or writing to writing?” (Bloch, 2013, 2). Later, he gave an answer: “The error propagates itself, grows, and ultimately survives only on one condition – that it finds a favorable cultural broth in the society where it is spreading. Through it, people unconsciously express all their prejudices, hatreds, fears, all their strong emotions” (Bloch 2013, 3).

If Bloch were with us today, he might argue (as I have argued elsewhere) that mere “fact checking” is not necessarily a sufficient cure for the ills of disinformation (Biltoft 2020). After all, people also reach for concepts that reaffirm their own worldviews, meet their need for existential certainty, and satisfy their longings for power and prestige (Tetlock 2002). Perhaps, then, the first line of defence against disinformation is also the slowest and most difficult to secure in this age of digital immediacy: analytic sophistication. Bloch was such a believer in the principles of both free and fundamentally rigorous thought that he fought and even died for them, at the hands of a Nazi firing squad, in 1944.

Carolyn N. Biltoft
Associate Professor, International History and Politics
Geneva Graduate Institute


References

Biltoft, Carolyn. 2020. “The Anatomy of Credulity and Incredulity: A Hermeneutics of Misinformation.” Harvard Kennedy School Misinformation Review 1 (2).

Bloch, Marc. 2013. “Reflections of a Historian on the False News of the War.” Translated by James P. Holoka. Michigan War Studies Review, 1 July 2013.

Bloch, Marc. 2015. The Royal Touch: Sacred Monarchy and Scrofula in England and France. London: Routledge.

Tetlock, Philip E. 2002. “Social Functionalist Frameworks for Judgment and Choice: Intuitive Politicians, Theologians, and Prosecutors.” Psychological Review 109 (3): 451–71.


Source: Oxford Internet Institute

PODCAST: The Right of Freedom in Armed Conflicts: A View from the United Nations, with Paige Morrow

Research Office, Geneva Graduate Institute.

PODCAST: The link between political message and social context, with Michelle Weitzel

Research Office, Geneva Graduate Institute.

DEFINITIONS | Some terms in the world of disinformation

Astroturfing

A deceptive practice of propaganda and manipulation, used in the media and particularly on the internet, that consists of giving the impression of a spontaneously emerging mass phenomenon when in reality it has been created from scratch to influence public opinion. (Source: Wiktionnaire, s.v. “astroturfing”.)

Computational propaganda

Computational propaganda involves the “use of algorithms, automation, and human curation to purposefully distribute misleading information over social media networks” (Woolley & Howard 2018). While propaganda has existed throughout human history, the rise of digital technologies and social media platforms has brought new dimensions to this practice. (Source: Programme on Democracy & Technology of the Oxford Internet Institute, “What Is Computational Propaganda?”.)

Conspiracy theory

A conspiracy theory is an explanation for an event or situation that asserts the existence of a conspiracy by powerful and sinister groups, often political in motivation, when other explanations are more probable. The term generally has a negative connotation, implying that the appeal of a conspiracy theory is based in prejudice, emotional conviction, or insufficient evidence. A conspiracy theory is distinct from a conspiracy; it refers to a hypothesised conspiracy with specific characteristics, including, but not limited to, opposition to the mainstream consensus among those who are qualified to evaluate its accuracy, such as scientists or historians. (Source: Wikipedia, “Conspiracy Theory”.)

Black propaganda

Black propaganda is intended to create the impression that it comes from those it is supposed to discredit. Black propaganda contrasts with grey propaganda, which does not identify its source, as well as white propaganda, which does not disguise its origins at all. It is typically used to vilify or embarrass the enemy through misrepresentation. The major characteristic of black propaganda is that the audience are not aware that someone is influencing them, and do not feel that they are being pushed in a certain direction. This type of propaganda is associated with covert psychological operations. Black propaganda is the “big lie”, including all types of creative deceit. Black propaganda relies on the willingness of the receiver to accept the credibility of the source. (Wikipedia, “Black Propaganda”.)

Disinformation

Disinformation is false information deliberately spread to deceive people. It is sometimes confused with misinformation, which is false information but not deliberately so. Disinformation is often presented in the form of fake news. The word comes from the application of the Latin prefix dis- to information, to create the meaning “reversal or removal of information”. Disinformation attacks involve the intentional dissemination of false information, with an end goal of misleading, confusing, or manipulating an audience. They may be executed by political, economic or individual actors to influence state or non-state entities and domestic or foreign populations. These attacks are commonly employed to reshape attitudes and beliefs, drive a particular agenda, or elicit certain actions from a target audience. Tactics include the presentation of incorrect or misleading information, the creation of uncertainty, and the undermining of both correct information and the credibility of information sources. (Sources: Wikipedia, “Disinformation” and “Disinformation Attack”.)

Fake news

Fake news is false or misleading information presented as news. Fake news often has the aim of damaging the reputation of a person or entity, or making money through advertising revenue. Although false news has always been spread throughout history, the term fake news was first used in the 1890s, when sensational reports in newspapers were common. Nevertheless, the term does not have a fixed definition and has been applied broadly to any type of false information. It has also been used by high-profile people to apply to any news unfavourable to them. In some definitions, fake news includes satirical articles misinterpreted as genuine, and articles that employ sensationalist or clickbait headlines that are not supported in the text. Because of this diversity of types of false news, researchers are beginning to favour information disorder as a more neutral and informative term. (Source: Wikipedia, “Fake News”.)

Information Manipulation Theory (IMT)

IMT is a theory of deceptive discourse production, arguing that, rather than communicators producing “truths” and “lies”, the vast majority of everyday deceptive discourse involves complicated combinations of elements that fall somewhere in between these polar opposites; with the most common form of deception being the editing-out of contextually problematic information (i.e., messages commonly known as “white lies”). More specifically, individuals have available to them four different ways of misleading others: playing with the amount of relevant information that is shared, including false information, presenting irrelevant information, and/or presenting information in an overly vague fashion. (Source: Wikipedia, “Information Manipulation Theory”.)

Misinformation

Misinformation is incorrect or misleading information. It differs from disinformation, which is deliberately deceptive. Misinformation comes from the application of the Latin prefix mis- to information, to create the meaning “wrong or false information”. Rumours, by contrast, are information not attributed to any particular source, and so are unreliable and often unverified, but can turn out to be either true or false. Even if later retracted, misinformation can continue to influence actions and memory. People may be more prone to believe misinformation if they are emotionally connected to what they are listening to or reading. (Source: Wikipedia, “Misinformation”.)

Mute news

Mute news is a pernicious form of information in which a key issue that often underlies the concerns of the public and the body politic is obscured from media attention. (Source: Lê Nguyên Hoang and Sacha Altay, “Disinformation: Emergency or False Problem?”, Polytechnique Insights, 6 September 2022.)

Propaganda

Propaganda is communication that is primarily used to influence or persuade an audience to further an agenda. It may not be objective: it may present facts selectively to encourage a particular synthesis or perception, or use loaded language to produce an emotional rather than a rational response to the information presented. In the 20th century, propaganda was often associated with a manipulative approach, but historically it has been a neutral descriptive term for any material that promotes certain opinions or ideologies. (Source: Wikipedia, “Propaganda”.)

Typosquatting

Typosquatting, also called URL hijacking, a sting site or a fake URL, is a form of cybersquatting, and possibly brandjacking, which relies on mistakes such as typos made by Internet users when inputting a website address into a web browser. Should a user accidentally enter an incorrect website address, they may be led to any URL (including an alternative website owned by a cybersquatter). (Source: Wikipedia, “Typosquatting”.)

PODCAST: The Difference between Russian and Ukrainian Propaganda, with Svitlana Ovcharenko

Research Office, Geneva Graduate Institute.

PODCAST: (Dis)Information as a tool of warfare, with Jean-Marc Rickli

Research Office, Geneva Graduate Institute.

PODCAST: Mobilization from below to face the control in authoritarian states, with G. O. and V. Neeraj

Research Office, Geneva Graduate Institute.
