Digital Deceptions: The War Over Truth in the Age of Disinformation

Exploring the impact of disinformation in modern conflicts reveals the urgent need for strategies to counteract digital deceit.

Disinformation has existed since information has had the power to change the course of events. History holds many examples of opinions and decisions shaped by the spread of false content.

To qualify as disinformation, the content must be spread deliberately and with malicious intent, according to the United Nations High Commissioner for Refugees (UNHCR), a U.N. agency.

Disinformation often takes the form of hoaxes, spear phishing lures and propaganda, spreading fear and suspicion among the population.

Russia’s invasion of Ukraine also started a war over facts, with the Kremlin fighting to install a series of narratives at odds with the truth while seeking to turn open societies against themselves.

“Systematic information manipulation and disinformation by the Kremlin is applied as an operational tool in its assault on Ukraine,” said Josep Borrell, high representative for foreign affairs and security policy at the European Union, shortly after the attack began.

Through a directive, European authorities limited the distribution of Kremlin-funded outlets such as Russia Today and Sputnik.

Still, social media provides the Kremlin an alternative set of tools.

“The Kremlin showed the world that the online information space is a realm perfectly fit for asymmetric warfare, in which one side does not have the ability to control the information flow and must find other ways to tilt narratives in its favor,” said Bethany Allen-Ebrahimian in Foreign Policy, a political journal.

And disinformation warfare techniques have been shared among authoritarian allies.

“Within a matter of just a few years, Beijing has copied and successfully used many of Russia’s information warfare techniques,” Allen-Ebrahimian wrote.

A research paper published by the RAND Corporation, a U.S.-based think tank, warns that the advent of large language models takes this threat one step further.

Malicious actors can create avatars that communicate credibly as natives of different subregions of a country, creating the impression that public opinion in a given area, such as a U.S. county, tilts in a particular direction.

This type of disinformation is called astroturfing, and according to the UNHCR, this masks “the sponsors of a message (e.g., political, religious, advertising or PR organizations) to make it appear as though it comes from grassroots participants. The practice aims to give organizations credibility by withholding information about their motives or financial connections.”

This technique can be leveraged to abuse the trust of the citizenry in open societies and manipulate public opinion. Novel artificial intelligence tools can facilitate this process.
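
To make this pattern concrete, here is a minimal, purely illustrative Python sketch of one common defensive heuristic: flagging pairs of supposedly independent accounts that post near-duplicate text, a telltale signature of crude astroturfing. The account handles, posts and threshold below are invented for this example.

    # Toy astroturfing heuristic: flag "independent" accounts whose posts are
    # near-duplicates. Handles, posts and the threshold are invented
    # purely for illustration.
    from itertools import combinations

    def shingles(text, n=3):
        """Return the set of n-word shingles for a post."""
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

    def jaccard(a, b):
        """Jaccard similarity between two shingle sets."""
        if not a or not b:
            return 0.0
        return len(a & b) / len(a | b)

    # Hypothetical sample posts keyed by account handle.
    posts = {
        "local_mom_4821": "Nobody in our county supports this policy, everyone I know is against it",
        "farm_dad_TX": "Nobody in our county supports this policy, everybody I know is against it",
        "jane_reader": "Interesting turnout numbers in the county election this week",
    }

    THRESHOLD = 0.5  # flag account pairs whose posts overlap this heavily

    shingle_sets = {user: shingles(text) for user, text in posts.items()}
    for (u1, s1), (u2, s2) in combinations(shingle_sets.items(), 2):
        score = jaccard(s1, s2)
        if score >= THRESHOLD:
            print(f"possible coordination: {u1} <-> {u2} (similarity {score:.2f})")

Run on the sample data, the sketch flags the first two accounts, whose posts differ by a single word, while leaving the genuinely distinct post alone. Real detection systems layer on posting-time correlation, account-creation patterns and network analysis, and LLM-generated personas are designed precisely to defeat naive similarity checks like this one.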

“The risk is that next-generation astroturfing could pose a direct challenge to democratic societies if malign actors are able to covertly shape users’ shared understanding of the domestic political conversation and thus subvert the democratic process,” said the paper’s authors William Marcellino, Nathan Beauchamp-Mustafaga, Amanda Kerrigan, Lev Navarre Chao and Jackson Smith.

In this recently published work, the authors call on the U.S. government to look beyond the threats and urgently address the issue.

“We strongly suggest the development of a coherent, proactive, and broad strategy for dealing with this new threat,” the authors conclude.