Analysis Framework: Psychological Manipulation Tactics

Are there persuasive or misleading tactics being used?

Last updated: July 8, 2022 (Version 1)

What makes a piece of content particularly persuasive or convincing? Often it is not the information itself but how that information is presented. Psychologists have identified a number of manipulative techniques designed to persuade an audience to feel or think a certain way. These techniques are effective because they take advantage of the way humans naturally perceive and react to information, and they become especially worrisome when the information being presented is misleading or false. Recognizing and exposing these manipulative tactics can therefore be a valuable tool.

The creation of this framework was inspired by research on inoculation theory, a social psychological theory that posits one can be protected from persuasive influence in much the same way that a body is protected from disease [30]. The name comes from a medical analogy: immunity to stronger challenges is achieved through pre-exposure to weaker ones. In the case of persuasion, the “psychological vaccination” is education about the common manipulative tactics found in misleading narratives. Once armed with the ability to identify these core tactics, one is better positioned to resist a future manipulative “attack.”

The Psychological Manipulation Tactics Framework

ARTT’s Psychological Manipulation Tactics conceptual framework categorizes manipulation tactics for easier identification. While not an exhaustive list of manipulative strategies, the framework focuses on the rhetorical and psychological techniques most commonly observed in mis/disinformation campaigns and activities, as supported by recent research.

This framework consists of seven core concepts:

  1. Tapping into our natural bias to find connections by using conspiratorial reasoning
  2. Deliberately encouraging a response using “bait” by intentional trolling
  3. Gaining access to a trusted community by impersonation 
  4. Manufacturing doubt by distorting the scientific consensus
  5. Evoking emotion to encourage thinking with feelings instead of reason
  6. Utilizing polarization to create or expand a gap between two groups
  7. Discrediting the opponent instead of addressing the argument

Each concept includes indicators that signal a unique manipulation tactic. Explanations of each concept and its indicators can be found below.

Conspiratorial reasoning is a way of thinking that provides a frame of interpretation for events. Utilizing this type of reasoning can be an effective manipulation tool. Conspiratorial reasoning exploits one’s bias toward making causal connections between unrelated events, and the inclination to attach melodramatic narratives as explanations for those perceived connections [15, 32, 42].

Indicator | Definition | Citations
Reference to concealment | The messenger purports to reveal a lie and/or frames themselves as a truth-seeker | [25, 29]
Encouragement of pervasive skepticism | States or alludes that widely believed facts, well-supported by available evidence, should be regarded as suspect, especially when those facts are used to evaluate an explanation | [3, 26, 29, 31]
Utilizing Manichean terms | Distills a complex or nuanced situation into a stark narrative of good versus evil | [3, 32]
Insistence on causality | Implies a causal connection between events despite that connection being unconfirmed or implausible | [3, 14]
Accusation of nefarious actors | References the (often secret) plotting of powerful, malevolent groups whose nefarious intents go against the public interest | [3, 26]

“Trolling” occurs when an online user deliberately “baits” a response with inflammatory, irrelevant, or otherwise disruptive language. Online trolling takes a variety of forms that are not always deliberately malign. Intentional trolling, as opposed to humorous or other non-intentional trolling, is serious and ideologically motivated. Intentional trolling tactics can be manipulative because they take advantage of emotional reactions, heuristics (mental shortcuts), or identity cues [42].

Indicator | Definition | Citations
Provocation trolling | Also known as “outrage” trolling; the goal is to inflame, upset, or trigger an emotional response | [5, 19, 36]
Repetitive trolling | Consistently repeating the same message, which can increase the likelihood that a statement is perceived as true | [18, 36]
Discord trolling | The goal is to foster social division or public discontent, often by amplifying reductive social interpretations that confirm existing beliefs, support desired conclusions, or prompt certain feelings about groups of people and events | [2, 7, 13, 38, 43, 44]

Impersonation involves emulating the style or behavior of an individual or organization in order to gain access to a trusted community. This tactic takes advantage of the inherent trust individuals already have in a familiar identity, community, or source [42].

Indicator | Definition | Citations
Identity impersonation | Impersonation of an individual or organization, such as posing as a genuine online user or appropriating the branding, campaign, or images of a legitimate organization | [2, 4, 7, 20]
Support impersonation | Sometimes referred to as “astroturfing”; creates the false perception of grassroots community support for a certain cause | [44]
Source impersonation | Posing as a legitimate news website or blog without the usual journalistic norms and credentials | [35, 42]

Making sense of scientific information is often complicated for those outside the scientific community. People commonly either utilize simple heuristics (mental shortcuts) or rely on experts to interpret scientific uncertainty or distill other complex information. Malicious actors can take advantage of these tendencies to intentionally distort the public perception of scientific topics [12, 21].

Indicator | Definition | Citations
Fake experts | The use of a non-expert (an unqualified person or institution, or someone without relevant expertise) as a source of credible information to cast doubt on expert agreement | [12, 15, 31]
False balance | Giving contrarian or unsupported views equal voice with expert views in order to give the impression of a legitimate balance of opinions or debate | [12, 17]
Skewing the science | The intentional mischaracterization of evidence | [15, 27, 29]
Exaggeration of risk | Presenting risks and benefits without a proper sense of proportion | [25, 26, 37]

Emotion is a potent force that can sway an opinion or urge people to act. Content or rhetoric that adds no informative value, but instead deliberately evokes an emotional reaction, exploits the human tendency to think and react with emotion instead of reason [42].

Indicator | Definition | Citations
Negative sentiments | A focus on generating negative, rather than neutral or positive, feelings | [28, 45]
Moral-emotional language | Words that describe a reaction to the social behavior of others; moral emotions create a connection between a person and society and appeal to one’s sense of right and wrong | [6, 23, 24, 28, 39]
Fearmongering (fear appeal) | Deliberately arousing public fear or alarm about a particular issue | [8, 37, 40]
Appeal to protective duty | Language implying that the most vulnerable among us are in peril and positioning the audience as capable of, or even obligated to, participate in the fight for their justice | [8, 25]

Polarization creates or expands gaps between two groups. Particularly effective at increasing in-group favoritism or discouraging empathy toward out-groups, polarization tactics exploit the tendency for people to self-categorize into groups and to see the world through binary distinctions (e.g., us and them) [38, 42].

Indicator | Definition | Citations
False amplification | Intentionally exaggerating existing grievances between groups | [8, 22]
Othering | Occurs when one group of people (usually a majority group or an in-group) treats another group (often a marginalized group or an out-group) as though they are dangerous, alien, or to blame for a certain problem | [16, 34]
Group identity language | Distinct reference to a specific identity that marks one’s membership in a group and highlights shared values (generally one’s religious, political, ethnic, or moral identity) | [2, 33, 43]

Discrediting is a tactic that focuses on dismantling the public credibility of one’s opponents, rather than addressing any valid claims or accusations that the opponent levies. The act of discrediting exploits how a person’s credibility hinges on trustworthiness and competence [1, 42].

Indicator | Definition | Citations
Ad hominem attack | Deflecting attention away from an accusation or argument by attacking the source of the criticism instead | [9, 41]
Attributing false motives | Ascribing false or ulterior motives to opponents | [9, 27]
Denial | Denying that a problem exists, or refusing to respond to criticism | [10, 35]
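Because the framework is, in practice, a two-level taxonomy (seven concepts, each containing named indicators), it can be encoded directly as a data structure for content-review or annotation tooling. Below is a minimal sketch in Python, assuming a simple tagging workflow; the names used (Indicator, FRAMEWORK, annotate) are illustrative only, not part of any ARTT software, and only two concepts are encoded for brevity.

    # Hypothetical sketch: the framework as a concept -> indicators map.
    # Only two of the seven concepts are encoded here, for brevity.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Indicator:
        name: str
        definition: str

    FRAMEWORK = {
        "Conspiratorial reasoning": [
            Indicator("Reference to concealment",
                      "Purports to reveal a lie and/or frames the messenger as a truth-seeker"),
            Indicator("Insistence on causality",
                      "Implies an unconfirmed or implausible causal connection between events"),
        ],
        "Discrediting": [
            Indicator("Ad hominem attack",
                      "Attacks the source of a criticism instead of the argument"),
            Indicator("Denial",
                      "Denies a problem exists, or refuses to respond to criticism"),
        ],
    }

    def annotate(concept, indicator, note=""):
        """Validate a (concept, indicator) pair against the framework before tagging."""
        indicators = FRAMEWORK.get(concept)
        if indicators is None:
            raise ValueError(f"Unknown concept: {concept!r}")
        if all(i.name != indicator for i in indicators):
            raise ValueError(f"{indicator!r} is not an indicator of {concept!r}")
        return {"concept": concept, "indicator": indicator, "note": note}

    # Example: tag a post that attacks a critic's character rather than the critique.
    tag = annotate("Discrediting", "Ad hominem attack",
                   note="Dismisses the journalist personally, not the reporting")
    print(tag)

Keeping the validation in one place means an annotation tool cannot record an indicator under the wrong concept, which keeps tagged examples comparable across reviewers.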

Methodology and References

The ARTT team created the Psychological Manipulation Tactics Framework through a literature review in the fields of social and cognitive psychology, communication theory, sociology, and information research. Six concepts of manipulation identified in Sander van der Linden and Jon Roozenbeek’s research served as the framework’s foundation: impersonation, conspiracy, emotion, polarization, discrediting, and trolling [42]. From there, the team added one additional concept (manufacturing doubt) and elaborated each concept with bespoke indicators.

After a review by ARTT project advisors, this framework is being released as an alpha version. The ARTT team plans to iterate on it in the next phase of the project. Inspired by the FLICC taxonomy of climate misinformation techniques [11], a future iteration could expand to topic-specific manipulation techniques. If you have any feedback on this framework, please send an email to artt [at] hackshackers [dot] com with the subject line “Manipulation Framework.”

We would also like to thank Hansika Kapoor and John Cook for their contributions to the development of this framework.

[1] A’Beckett, Ludmilla. 2013. “Strategies to Discredit Opponents: Russian Presentations of Events in Countries of the Former Soviet Union.” Psychology of Language and Communication 17 (2): 133–56. https://doi.org/10.2478/plc-2013-0009.

[2] Arif, Ahmer, Leo Graiden Stewart, and Kate Starbird. 2018. “Acting the Part: Examining Information Operations Within #BlackLivesMatter Discourse.” Proceedings of the ACM on Human-Computer Interaction 2 (CSCW): 1–27. https://doi.org/10.1145/3274289.

[3] Baden, Christian, and Tzlil Sharon. 2021. “BLINDED BY THE LIES? Toward an Integrated Definition of Conspiracy Theories.” Communication Theory 31 (1): 82–106. https://doi.org/10.1093/ct/qtaa023.

[4] Benton, Bond, and Daniela Peterka-Benton. 2020. “Hating in Plain Sight: The Hatejacking of Brands by Extremist Groups.” Public Relations Inquiry 9 (1): 7–26. https://doi.org/10.1177/2046147X19863838.

[5] Berghel, Hal, and Daniel Berleant. 2018. “The Online Trolling Ecosystem.” Computer 51 (8): 44–51. https://doi.org/10.1109/MC.2018.3191256.

[6] Brady, William J., Julian A. Wills, John T. Jost, Joshua A. Tucker, and Jay J. Van Bavel. 2017. “Emotion Shapes the Diffusion of Moralized Content in Social Networks.” Proceedings of the National Academy of Sciences 114 (28): 7313–18. https://doi.org/10.1073/pnas.1618923114.

[7] Broniatowski, David A., Amelia M. Jamison, SiHua Qi, Lulwah AlKulaib, Tao Chen, Adrian Benton, Sandra C. Quinn, and Mark Dredze. 2018. “Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate.” American Journal of Public Health 108 (10): 1378–84. https://doi.org/10.2105/AJPH.2018.304567.

[8] Buntain, Cody, Monique Deal Barlow, Mia Bloom, and Mila A. Johns. 2022. “Paved with Bad Intentions: QAnon’s Save the Children Campaign.” Journal of Online Trust and Safety 1 (2). https://doi.org/10.54501/jots.v1i2.51.

[9] Campos-Castillo, Celeste, and Stef M. Shuster. 2021. “So What If They’re Lying to Us? Comparing Rhetorical Strategies for Discrediting Sources of Disinformation and Misinformation Using an Affect-Based Credibility Rating.” American Behavioral Scientist, December, 000276422110660. https://doi.org/10.1177/00027642211066058.

[10] Coan, Travis G., Constantine Boussalis, John Cook, and Mirjam O. Nanko. 2021. “Computer-Assisted Classification of Contrarian Claims about Climate Change.” Scientific Reports 11 (1): 22320. https://doi.org/10.1038/s41598-021-01714-4.

[11] Cook, John. 2020. “Deconstructing Climate Science Denial.” In Research Handbook on Communicating Climate Change, edited by David Holmes and Lucy Richardson, 62–78. Edward Elgar Publishing. https://doi.org/10.4337/9781789900408.00014.

[12] Cook, John, Stephan Lewandowsky, and Ullrich K. H. Ecker. 2017. “Neutralizing Misinformation through Inoculation: Exposing Misleading Argumentation Techniques Reduces Their Influence.” PLOS ONE 12 (5): e0175799. https://doi.org/10.1371/journal.pone.0175799.

[13] Cruz, Angela Gracia B., Yuri Seo, and Mathew Rex. 2018. “Trolling in Online Communities: A Practice-Based Theoretical Perspective.” The Information Society 34 (1): 15–26. https://doi.org/10.1080/01972243.2017.1391909.

[14] Douglas, Karen M., Robbie M. Sutton, and Aleksandra Cichocka. 2017. “The Psychology of Conspiracy Theories.” Current Directions in Psychological Science 26 (6): 538–42.

[15] Diethelm, Pascal, and Martin McKee. 2009. “Denialism: What Is It and How Should Scientists Respond?” The European Journal of Public Health 19 (1): 2–4.

[16] Dionne, Kim Yi, and Fulya Felicity Turkmen. 2020. “The Politics of Pandemic Othering: Putting COVID-19 in Global and Historical Context.” International Organization 74 (S1): E213–30. https://doi.org/10.1017/S0020818320000405.

[17] Dixon, Graham N., and Christopher E. Clarke. 2013. “Heightening Uncertainty Around Certain Science: Media Coverage, False Balance, and the Autism-Vaccine Controversy.” Science Communication 35 (3): 358–82. https://doi.org/10.1177/1075547012458290.

[18] Fazio, Lisa K., David G. Rand, and Gordon Pennycook. 2019. “Repetition Increases Perceived Truth Equally for Plausible and Implausible Statements.” Psychonomic Bulletin & Review 26: 1705–10. https://doi.org/10.3758/s13423-019-01651-4.

[19] Fichman, Pnina, and Matthew Vaughn. 2021. “The Relationships between Misinformation and Outrage Trolling Tactics on Two Yahoo! Answers Categories.” Journal of the Association for Information Science and Technology 72 (12): 1498–1510. https://doi.org/10.1002/asi.24497.

[20] Goga, Oana, Giridhari Venkatadri, and Krishna P. Gummadi. 2015. “The Doppelgänger Bot Attack: Exploring Identity Impersonation in Online Social Networks.” In Proceedings of the 2015 Internet Measurement Conference, 141–53. Tokyo Japan: ACM. https://doi.org/10.1145/2815675.2815699.

[21] Goldberg, Rebecca F., and Laura N. Vandenberg. 2021. “The Science of Spin: Targeted Strategies to Manufacture Doubt with Detrimental Effects on Environmental and Public Health.” Environmental Health 20 (1): 33. https://doi.org/10.1186/s12940-021-00723-0.

[22] Groenendyk, Eric. 2018. “Competing Motives in a Polarized Electorate: Political Responsiveness, Identity Defensiveness, and the Rise of Partisan Antipathy.” Political Psychology 39 (S1): 159–71. https://doi.org/10.1111/pops.12481.

[23] Haidt, J. 2003. “The Moral Emotions.” In Handbook of Affective Sciences. Oxford University Press. https://people.stern.nyu.edu/jhaidt/articles/alternate_versions/haidt.2003.the-moral-emotions.pub025-as-html.html.

[24] Han, Jiyoung, Meeyoung Cha, and Wonjae Lee. 2020. “Anger Contributes to the Spread of COVID-19 Misinformation.” Harvard Kennedy School Misinformation Review 1 (3). https://doi.org/10.37016/mr-2020-39.

[25] Hughes, Brian, Cynthia Miller-Idriss, Rachael Piltch-Loeb, Beth Goldberg, Kesa White, Meili Criezis, and Elena Savoia. 2021. “Development of a Codebook of Online Anti-Vaccination Rhetoric to Manage COVID-19 Vaccine Misinformation.” International Journal of Environmental Research and Public Health 18 (14): 7556. https://doi.org/10.3390/ijerph18147556.

[26] Jamison, Amelia, David A. Broniatowski, Michael C. Smith, Kajal S. Parikh, Adeena Malik, Mark Dredze, and Sandra C. Quinn. 2020. “Adapting and Extending a Typology to Identify Vaccine Misinformation on Twitter.” American Journal of Public Health 110 (S3): S331–39. https://doi.org/10.2105/AJPH.2020.305940.

[27] Kata, Anna. 2012. “Anti-Vaccine Activists, Web 2.0, and the Postmodern Paradigm – An Overview of Tactics and Tropes Used Online by the Anti-Vaccination Movement.” Vaccine, Special Issue: The Role of Internet Use in Vaccination Decisions, 30 (25): 3778–89. https://doi.org/10.1016/j.vaccine.2011.11.112.

[28] Martel, Cameron, Gordon Pennycook, and David G. Rand. 2020. “Reliance on Emotion Promotes Belief in Fake News.” Cognitive Research: Principles and Implications 5 (1): 47. https://doi.org/10.1186/s41235-020-00252-3.

[29] Massey, Philip M, Matthew D Kearney, Michael K Hauer, Preethi Selvan, Emmanuel Koku, and Amy E Leader. 2020. “Dimensions of Misinformation About the HPV Vaccine on Instagram: Content and Network Analysis of Social Media Characteristics.” Journal of Medical Internet Research 22 (12): e21451. https://doi.org/10.2196/21451.

[30] McGuire, William J. 1964. “Some Contemporary Approaches.” In Advances in Experimental Social Psychology, edited by Leonard Berkowitz, 1:191–229. Academic Press. https://doi.org/10.1016/S0065-2601(08)60052-0.

[31] Moran, Meghan Bridgid, Melissa Lucas, Kristen Everhart, Ashley Morgan, and Erin Prickett. 2016. “What Makes Anti-Vaccine Websites Persuasive? A Content Analysis of Techniques Used by Anti-Vaccine Websites to Engender Anti-Vaccine Sentiment.” Journal of Communication in Healthcare 9 (3): 151–63. https://doi.org/10.1080/17538068.2016.1235531.

[32] Oliver, J. Eric, and Thomas J. Wood. 2014. “Conspiracy Theories and the Paranoid Style(s) of Mass Opinion.” American Journal of Political Science 58 (4): 952–66. https://doi.org/10.1111/ajps.12084.

[33] Osmundsen, Mathias, Alexander Bor, Peter Bjerregaard Vahlstrup, Anja Bechmann, and Michael Bang Petersen. 2021. “Partisan Polarization Is the Primary Psychological Motivation behind Political Fake News Sharing on Twitter.” American Political Science Review 115 (3): 999–1015. https://doi.org/10.1017/S0003055421000290.

[34] Rathje, Steve, Jay J. Van Bavel, and Sander van der Linden. 2021. “Out-Group Animosity Drives Engagement on Social Media.” Proceedings of the National Academy of Sciences 118 (26): e2024292118. https://doi.org/10.1073/pnas.2024292118.

[35] Roozenbeek, Jon, and Sander van der Linden. 2019. “Fake News Game Confers Psychological Resistance against Online Misinformation.” Palgrave Communications 5 (1): 1–10. https://doi.org/10.1057/s41599-019-0279-9.

[36] Sanfilippo, Madelyn R., Pnina Fichman, and Shengnan Yang. 2018. “Multidimensionality of Online Trolling Behaviors.” The Information Society 34 (1): 27–39. https://doi.org/10.1080/01972243.2017.1391911.

[37] Scannell, Denise, Linda Desens, Marie Guadagno, Yolande Tra, Emily Acker, Kate Sheridan, Margo Rosner, Jennifer Mathieu, and Mike Fulk. 2021. “COVID-19 Vaccine Discourse on Twitter: A Content Analysis of Persuasion Techniques, Sentiment and Mis/Disinformation.” Journal of Health Communication 26 (7): 443–59. https://doi.org/10.1080/10810730.2021.1955050.

[38] Simchon, Almog, William J Brady, and Jay J Van Bavel. 2022. “Troll and Divide: The Language of Online Polarization.” PNAS Nexus 1 (1): pgac019. https://doi.org/10.1093/pnasnexus/pgac019.

[39] Solovev, Kirill, and Nicolas Pröllochs. 2022. “Moral Emotions Shape the Virality of COVID-19 Misinformation on Social Media.” arXiv:2202.03590 [cs], February. http://arxiv.org/abs/2202.03590.

[40] Tannenbaum, Melanie B., Justin Hepler, Rick S. Zimmerman, Lindsey Saul, Samantha Jacobs, Kristina Wilson, and Dolores Albarracin. 2015. “Appealing to Fear: A Meta-Analysis of Fear Appeal Effectiveness and Theories.” Psychological Bulletin 141 (6): 1178–1204. https://doi.org/10.1037/a0039729.

[41] Van der Linden, Sander, Costas Panagopoulos, and Jon Roozenbeek. 2020. “You Are Fake News: Political Bias in Perceptions of Fake News.” Media, Culture & Society 42 (3): 460–70. https://doi.org/10.1177/0163443720906992.

[42] Van der Linden, Sander, and Jon Roozenbeek. 2020. “Psychological Inoculation Against Fake News.” In The Psychology of Fake News, 147–69. Routledge. https://doi.org/10.4324/9780429295379-11.

[43] Walter, Dror, Yotam Ophir, and Kathleen Hall Jamieson. 2020. “Russian Twitter Accounts and the Partisan Polarization of Vaccine Discourse, 2015–2017.” American Journal of Public Health 110 (5): 718–24. https://doi.org/10.2105/AJPH.2019.305564.

[44] Zerback, Thomas, Florian Töpfl, and Maria Knöpfle. 2021. “The Disconcerting Potential of Online Disinformation: Persuasive Effects of Astroturfing Comments and Three Strategies for Inoculation against Them.” New Media & Society 23 (5): 1080–98. https://doi.org/10.1177/1461444820908530.

[45] Zollo, Fabiana, Petra Kralj Novak, Michela Del Vicario, Alessandro Bessi, Igor Mozetič, Antonio Scala, Guido Caldarelli, and Walter Quattrociocchi. 2015. “Emotional Dynamics in the Age of Misinformation.” PLOS ONE 10 (9): e0138740. https://doi.org/10.1371/journal.pone.0138740.