Computational Propaganda

Computational propaganda is the use of computational tools (algorithms and automation) to distribute misleading information through social media networks. Advances in digital technologies and social media have enhanced the methods of propaganda. It is characterized by automation, scalability, and anonymity. Autonomous agents (internet bots) can analyze big data collected from social media and the Internet of Things in order to manipulate public opinion in a targeted way and, moreover, to mimic real people on social media. Coordination is an important component that bots help achieve, giving campaigns amplified reach. Digital technology enhances well-established traditional methods of manipulating public opinion: appeals to people's emotions and biases circumvent rational thinking and promote specific ideas. Pioneering work in identifying and analyzing the concept has been done by the team of Philip N. Howard at the Oxford Internet Institute, which has been investigating computational propaganda since 2012, following Howard's earlier research on the effects of social media on the general public, published, e.g., in his 2005 book ''New Media Campaigns and the Managed Citizen'' and earlier articles. In 2017, they published a series of articles detailing computational propaganda's presence in several countries. Regulatory efforts have proposed tackling computational propaganda using multiple approaches. Detection techniques are another front considered for mitigation; these can involve machine learning models, with early techniques suffering from issues such as a lack of datasets or failing against the gradual improvement of accounts. Newer techniques address these shortcomings with other machine learning methods or specialized algorithms, yet challenges remain, such as increasingly believable text and its automation.


Mechanisms

Computational propaganda is the strategic posting of misleading information on social media by partially automated fake accounts in order to manipulate readers.


Bots and coordination

In social media, bots are accounts pretending to be human. They are managed to a degree via programs and are used to spread information that creates mistaken impressions. In social media they may be referred to as “social bots”, and may be helped by popular users who amplify them and lend them apparent reliability by sharing their content. Bots allow propagandists to keep their identities secret. One study from Oxford's Computational Propaganda Research Project found that bots achieved effective placement on Twitter during a political event. Bots can be coordinated, which may be leveraged to exploit platform algorithms. Propagandists mix real and fake users; their efforts draw on a variety of actors, including botnets, online paid users, astroturfers, seminar users, and troll armies. Bots can create a false sense of prevalence, and can also engage in spam and harassment. They are becoming progressively more sophisticated, one reason being the improvement of AI; such development complicates detection for humans and automated methods alike.


Problematic information

The problematic content tactics propagandists employ include disinformation, misinformation, and information shared regardless of veracity. The spread of fake and misleading information seeks to influence public opinion. Deepfakes and generative language models are also employed, creating convincing content. The proportion of misleading information is expected to grow, complicating detection.


Algorithmic influence

Algorithms are another important element of computational propaganda. Algorithmic curation may influence beliefs through repetition. Algorithms boost and hide content, which propagandists use to their favor. Social media algorithms prioritize user engagement, and to that end their filtering favors controversy and sensationalism. The algorithmic selection of what is presented can create echo chambers and exert influence. One study posits that TikTok's automated features (e.g. the sound page) and interactive features (e.g. stitching, duetting, and the content imitation trend) can also boost misleading information. Furthermore, anonymity is preserved because the audio's origin is deleted.
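The engagement-first filtering described above can be illustrated with a minimal sketch. This is a hypothetical toy ranker, not any platform's actual algorithm; the feature weights and field names are assumptions chosen only to show why inflated engagement counts move content up a feed.

```python
# Illustrative sketch (not any real platform's algorithm): an
# engagement-first feed ranker. Posts that attract more reactions,
# shares, and comments score higher, so coordinated accounts that
# inflate those counts push their content up the feed.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: shares and comments are treated as
    # stronger engagement signals than likes.
    return 1.0 * post.likes + 3.0 * post.shares + 2.0 * post.comments

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-engagement posts first.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("a", likes=100, shares=2, comments=5),   # organic post
    Post("b", likes=40, shares=50, comments=30),  # bot-amplified post
])
# The bot-amplified post outranks the organic one despite fewer likes.
```

Under these assumed weights, a botnet that mass-shares a post raises its score far faster than genuine likes would, which is the lever propagandists use when they coordinate accounts around an algorithm.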


Multidisciplinary studies

A multidisciplinary approach has been proposed for combating misinformation, including the use of psychology to understand its effectiveness. Some studies have examined misleading information through the lens of cognitive processes, seeking insight into how humans come to accept it. Media theories can help in understanding the complex relationships among computational propaganda and its surrounding actors, in assessing its effects, and in guiding regulation efforts. Agenda-setting theory and framing theory have also been applied to the analysis of computational propaganda, and their effects have been found present; algorithmic amplification is an instance of the former, which holds that the media's selection and occlusion of topics influences the public's attention, and that repetition focuses that attention. Repetition is a key characteristic of computational propaganda; in social media it can modify beliefs. One study posits that repetition keeps topics fresh in the mind, with a similar effect on perceived significance. The illusory truth effect, whereby people come to believe what is repeated to them over time, has also been suggested as a mechanism computational propaganda may exploit. Other phenomena have been proposed to be at play in computational propaganda tools. One study posits the presence of the megaphone effect, the bandwagon effect, and cascades. Other studies point to the use of content that evokes emotions. Another tactic is suggesting a connection between topics by placing them in the same sentence. Trust bias, validation by intuition rather than evidence, truth bias, confirmation bias, and cognitive dissonance have been observed as well. Another study points to the occurrence of negativity bias and novelty bias.


Spread

Bots are used by both private and public parties and have been observed in politics and crises. Their presence has been studied across many countries, with incidence in more than 80 countries. Some studies have found bots to be effective, though another found limited impact. Similarly, algorithmic manipulation has been found to have an effect.


Regulation

Some studies propose a strategy that incorporates multiple approaches towards regulation of the tools used in computational propaganda. Controlling misinformation and its usage in politics through legislation and guidelines; having platforms combat fake accounts and misleading information; and devising psychology-based intervention tactics are some of the possible measures.
Information literacy has also been proposed as a countermeasure to these tools. However, some of these approaches have reported shortcomings. In Germany, for example, legislative efforts have encountered problems and opposition. In the case of social media, self-regulation is hard to mandate; platform measures also may not be enough, and they place the power of decision with the platforms themselves. Information literacy has its limits as well.


Detection

Computational propaganda detection can focus either on content or on accounts.


Detecting propaganda content

Two ways to detect propaganda content are analyzing the text through various means, called “Text Analysis”, and detecting coordination among users, called “Social Network Analysis”. Early techniques to detect coordination involved mostly supervised models such as decision trees, random forests, SVMs, and neural networks. These analyze accounts one by one without modeling coordination. Advanced bots and the difficulty of finding or creating datasets have hindered these detection methods. Modern detection strategies include making the model study a large group of accounts with coordination in mind, creating specialized algorithms for it, and building unsupervised and semi-supervised models.
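The per-account limitation of early supervised approaches can be sketched as follows. This is a hand-rolled, decision-tree-style toy standing in for the classifiers named above; the feature names and thresholds are hypothetical examples, not values from any published detector.

```python
# Illustrative sketch of an early, per-account supervised approach:
# each account is classified in isolation from simple profile
# features, with no modeling of coordination between accounts.
# Feature names and thresholds are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Account:
    posts_per_day: float           # automation often means high volume
    followers_to_following: float  # ratio of followers to followed
    default_profile_image: bool

def is_likely_bot(acc: Account) -> bool:
    """A hand-rolled decision-tree-style rule (a stand-in for the
    decision trees / random forests / SVMs used in early work)."""
    if acc.posts_per_day > 50:     # inhumanly high posting rate
        return True
    if acc.default_profile_image and acc.followers_to_following < 0.1:
        return True
    return False

accounts = [
    Account(posts_per_day=120, followers_to_following=0.05,
            default_profile_image=True),   # classic spam-bot profile
    Account(posts_per_day=3, followers_to_following=1.2,
            default_profile_image=False),  # ordinary user profile
]
labels = [is_likely_bot(a) for a in accounts]
# Each account is judged one by one, so a coordinated botnet of
# individually normal-looking accounts would slip through.
```

The closing comment is the crux: because the rule sees one account at a time, a campaign that keeps each account under the thresholds evades it entirely, which is what motivates the coordination-aware methods described above.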


Detecting propaganda accounts

Account detection takes a variety of approaches: methods may seek to identify the author of a piece, use statistical methods, analyze a mix of text and data beyond it (such as account characteristics), or scan user activity tendencies. This second focus also has a Social Network Analysis approach, with a technique that examines the temporal elements of campaigns alongside features of detected groups.
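The temporal side of the Social Network Analysis approach can be sketched with a minimal example. This is an assumed toy method, not a published algorithm: it flags groups of accounts whose posts cluster in the same narrow time window, a simple synchronization signal; the window size and group-size threshold are hypothetical parameters.

```python
# Illustrative sketch of a temporal-coordination signal: bucket posts
# into fixed time windows and flag windows where unusually many
# distinct accounts posted together. Window size and the minimum
# group size are hypothetical parameters.
from collections import defaultdict

def coordinated_groups(posts, window=10, min_accounts=3):
    """posts: list of (account_id, unix_timestamp) pairs.
    Returns the account sets of suspiciously synchronized windows."""
    buckets = defaultdict(set)
    for account, ts in posts:
        buckets[ts // window].add(account)
    return [accs for accs in buckets.values() if len(accs) >= min_accounts]

posts = [
    ("bot1", 1000), ("bot2", 1003), ("bot3", 1007),  # same 10s window
    ("user1", 1500), ("user2", 2200),                # scattered activity
]
groups = coordinated_groups(posts)
# Only the three synchronized accounts are grouped together.
```

A real system would add the group features mentioned above (shared content, link targets, account ages) on top of such a temporal signal, since ordinary users also occasionally post at the same moment.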


Limitations

Detection techniques are not without their issues. One is that actors evolve their coordination techniques and can operate within the time it takes for detection methods to be developed, requiring real-time approaches. Other challenges are that techniques have yet to adapt to different media formats, should integrate explainability, could better inform the how and why of a propagandistic document or user, and may face increasingly difficult-to-detect content whose production may itself be further automated. Detection is also hindered by a lack of datasets, and creating them can involve sensitive user data that requires extensive work to protect.


References


Further reading

*2018: Samuel C. Woolley and Philip N. Howard (eds.), ''Computational Propaganda: Political Parties, Politicians, and Political Manipulation on Social Media''
*2020: Philip N. Howard, ''Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives''