Last updated: 14 July 2022
When faced with a statement or conversation in which misinformation is being shared, or the topic is contested and not well understood, it can be difficult to know how to engage and respond. What does the research say?
In fact, a world of possibilities exists. Here are some options for conversational responses that reflect the latest research on what is effective and builds trust, drawn from our team’s active exploration of the ARTT Research Catalog.
When you are engaged in conversation around a delicate topic, it is not always easy to know the exact thing to say, especially when the conversation is happening online. Effective and appropriate conversational responses can vary, not only because of the topic being discussed, but also because of the context of the conversation itself – for example, one’s relationship to the speaker, or the platform of discussion – as well as the conversational goal. While one goal in an exchange around misinformation might be providing correct information, whether online participants believe that information may depend on their trust in the speaker or the speaker’s sources.
The ARTT team has defined a number of response modes, distilled from our active exploration of academic and practitioner research as found in our catalog. The responses we have identified complete the following sentence:
“In this conversation, I could…”
You can also find definitions of some specific methods related to the response modes that we’ve identified; for the exhaustive list of methods, please refer to the most recent version of the ARTT Catalog.
When using one of these modes in an online discussion, it’s good to remember that the responses are sourced from different disciplines and theories, which makes it hard to compare one mode against another. Response modes are also tested in specific contexts, meaning their application may be specific and limited. By providing response options along with the research they are connected to, we aim to give people in conversations more information with which to craft their own responses.
Correct
To correct someone in a discussion is to “show or tell someone that something is wrong and to make it right” (Cambridge Dictionary). The goals for this approach can vary. One goal may be to correct the speaker about a specific issue such as climate change or vaccination. Another goal might be to equip the speaker with general skills to identify inaccurate information. There might also be times when you want to make sure that others listening in on the conversation have access to correct facts.
Related concepts. One issue with a correction approach is the possibility of “the backfire effect”: that something about the exchange causes listeners to double down on their misbeliefs or misunderstandings. The prevalence of the backfire effect, and how it occurs, is contested among researchers, and it is not as common as once thought. Regardless, researchers recommend framing corrections to align with the misinformation sharer’s worldview and citing sources that are associated with this worldview.
Correction is associated with various methods and applications in research:
Post Exposure Correction
Post Exposure Correction refers to correcting rumors or misinformation after a person has been exposed to the falsehood. This category also includes fact-checking.
Preemptive Refutation or “Prebunking”
Preemptive Refutation is one method of ‘inoculating’ individuals against potential misinformation. Inoculation theory was first proposed in 1964 and posits that individuals can be inoculated against future persuasive attacks that aim to change their attitudes, in much the same way that individuals can be inoculated against future viral attacks.
In Preemptive Refutation, one or more examples of weakened misinformation are presented and directly refuted. Attitudinal resistance is conferred by preemptively highlighting false claims and refuting potential counterarguments before the person is exposed to the falsehood. This is also called “refutational preemption” or “prebunking.”
Note: Under inoculation theory, a ‘warning of threat’ method is also possible, which forewarns people that they may be exposed to information that challenges their existing beliefs or behaviors. This, however, is not included as a separate method in our catalog because it has not been tested on its own, but always alongside Preemptive Refutation.
Myth and Fact Story
This describes a specific type of article in which different claims circulating about a certain issue are explained as either false (“myths”) or true (“facts”). It is a form of Post Exposure Correction.
Convey Consensus
Consensus is additional context that can be provided with a correction. When consensus exists, it may increase the efficacy of a correction. To convey consensus is to communicate the high level of normative agreement (“consensus”) among experts about a specific fact.
Causal Correction
Also known as providing an alternative explanation, ‘Causal Correction’ is a specific method of correction in which a causal explanation is provided for an unexplained event, or an alternative reasoning is provided for a phenomenon. One study finds that a ‘Causal Correction’ is significantly more effective than a denial, even when the denial is backed by unusually strong evidence.
Co-verify
This is a method that has been observed in practical interventions, in which someone offers to work through source evaluation and fact-checking processes together with someone else. Co-verification has yet to be tested in research studies, as far as we know. It is likely to co-occur with other ARTT categories, specifically ‘Correct’ and ‘Encourage Healthy Skepticism.’
De-escalate
De-escalation is a reduction of hostilities between different individuals or groups. This is an overarching goal of efforts in conflict resolution or transformation. Concretely, a number of methods, such as using humor or reminding the other party of shared values (see Encouraging Norms), can help in de-escalating a conflict. This ARTT tag is likely to co-occur with other ARTT tags. In our catalog, interventions to reduce affective polarization are also associated with this tag.
Empathize
Empathy is difficult to define precisely, but at its core it is an identification with someone else on an emotional level. According to one definition, “empathy is the ability to recognize, understand, and share the thoughts and feelings of another person, animal, or fictional character” (Psychology Today). Empathizing is a key mode of responding in conflictual exchanges where resolution or a transformation of the relationship is the goal.
Different researchers define empathy differently and distinguish between compassion, distress (being overwhelmed by the suffering of others), and empathic concern. In our catalog, these have currently all been grouped under the Empathize ARTT tag; we are reviewing whether they might be split into separate response modes in future versions. For an introduction to empathy and the related concept of compassion, see the works cited in our catalog.
Related concepts. One issue that has been identified is the potential for compassion fatigue, whereby people over-empathize and cause unhealthy amounts of distress for themselves.
Some methods highlighted under this response mode are:
Motivational Interviewing
“This is a person-centered communication style used to enhance internal motivation for attitudinal change by exploring and solving inherent ambivalences”.
Outparty Contact (also called ‘intergroup contact’)
Outparty contact is interaction with people on the opposite side of a conflict. The groups in the conflict could be defined based on political beliefs, or on religious or national identity. The contact could be online, imagined, or indirect – i.e., interaction with a member of one’s own group who has friends in the other group.
Encourage Healthy Skepticism
To encourage healthy skepticism is to help others ask questions of the information they are reading, such as “What do other sources say?” or “What’s the evidence?” Healthy skepticism means being able to critically evaluate information rather than immediately believing new claims. This response mode is not the same as being skeptical of all information; rather, it encompasses a wide set of goals of information and media literacy programs.
A range of program designs aim at this goal, and thus the methods in this response mode are varied. This list is not exhaustive, and we expect it to grow as more media and information literacy programs are added to the catalog.
Content Production
One media literacy approach to achieving a better understanding of a topic, or of the media production process itself, is the creation or production of media messages. Proponents of the content production approach believe that “practical work (is) not an end in itself, but a necessary means to develop a critical understanding of the media”. The content produced could be corrective. The content can even be false or misinformation when part of an ‘inoculating’ intervention that aims to explain how misinformation is produced.
Content Analysis
This is an approach in media literacy that focuses on the analysis and critique of media items. It aims to understand a media item in its economic, cultural, social, and historical context.
Lateral Reading
Lateral reading is a form of source evaluation: a specific heuristic for deciding which websites to trust. The term for this method draws on the image of multiple browser tabs arrayed across the horizontal axis of the screen. Instead of first examining a site’s internal features (which are controlled by its designers), lateral readers evaluate unfamiliar sites by leaving them and turning to the open Web. The goal of this search is to investigate the organization or individual behind the original site.
Encourage Norms
Norms are principles of “right action binding upon the members of a group and serving to guide, control, or regulate proper and acceptable behavior” (Merriam Webster). In our catalog, norms refer both to perceived descriptive norms (what most people do) and to injunctive norms (what one ought to do) outside of the force of law. Thus, this tag includes reminders of the social value ascribed to accuracy, as well as “nudges” – non-coercive devices that lead people to certain decisions – towards being respectful and open in communication. Some approaches from conflict resolution may also fall under this tag.
Related concepts. When it comes to encouraging norms in online spaces, public shaming can be a problematic method. A key issue is the desire to maintain relationships between the people engaging in discussion.
Some methods associated with ‘encouraging norms’ are:
Social Norm Messaging
This method relies on nudging towards standards of behavior accepted by a community or group of people. ‘Most responsible people think twice before sharing articles’ is an example of such a nudge. This method enforces descriptive norms.
Another method is persuasion that relies on appeals to morality. An example might be a message about what one “shouldn’t” do. The morals invoked could vary by group. For example, in the US, appeals to values of ‘Care’ versus ‘Authority’ may differ across party lines.
Warning of Consequences
This reminds the person of the consequences of sharing inaccurate or abusive content. These consequences could be direct, such as possible electoral loss, or indirect, such as a reminder that family and acquaintances can also observe their messages.
A further method enforces ‘accuracy’ as a norm when sharing online content. It is distinct from ‘Social Norm Messaging’ since it does not rely on a reminder that the group validates the importance of accuracy, but instead presents accuracy as an injunctive norm, or what one ought to do.
Listen
One definition of listen is “to hear something with thoughtful attention: give consideration” (Merriam Webster). While listening may not seem like much of a response, it is a critical part of a trust-building exchange, especially in situations where one is thinking about the possibilities for longer-term dialogue or engagement beyond the immediate message being discussed.
By listening silently, participants can understand more about whether to respond or how to respond. For example, it may be that the person you want to engage with isn’t really willing or ready to discuss differences of opinion. It can also be that responding to a query only with factual answers misses cues that the message writer is sending about problems they are having.
How one listens can depend on the goals of the conversation, and scholars have defined several types of listening. For example, in active listening, participants use techniques to solicit more information from the speaker; this can be a form of focused attention on another person that is helpful in democratic exchange. It differs from cataphatic listening, which focuses on how to respond in order to debate, defend, or critique.
Related concepts. One issue noted with active listening is that it may not be effective in all contexts. For example, marital researcher John Gottman and colleagues were surprised to find that the technique seemed, at times, to create more conflict; however, the debate that followed, centered on two-person/dyadic and marital conversations, revealed that the precise meaning of “active listening” itself can be understood differently.
Examples of listening techniques include:
As discussed above, though you might want to respond to those who are most negative about vaccination, it can be more effective to address those who are open to the information that you have to share.
Listening for Identity
Conflicts can be driven by issues of identity. Thus, from a conflict resolution or transformation perspective, it can help to pay attention to language, metaphors, and expressions that signal concerns around self-identity. Sometimes the connection to identity can be direct. But it may also be implied indirectly. Listening for identity requires looking beyond the content of what is being said.
Share
Sharing is an action that seems synonymous with social media. But in the ARTT catalog, it is currently used in a specific way, to imply a deeper engagement in a conversation. This response category is still under active development.
Sharing one’s own story is one way that people explain their reasoning through their own personal experience of navigating a difficult decision. Even in digital spaces, telling stories about one’s own “health journey” can be an effective way to share information while also encouraging reflection. Sharing information with shared values in mind can be a way to build trust, which is why we’re reviewing literature around knowledge sharing. In addition, when sharing complicated information, methods around communicating uncertainty may also fall under the Share response mode.
Take Perspective ("Perspective Taking")
This mode is the “act of viewing a situation from the point of view of others.” By doing this, people can identify another person’s intentions and needs even when they do not agree with them, which may reduce impasses and decrease discrimination. While empathizing involves sharing others’ emotions, perspective taking helps identify others’ intentions, needs, reactions, and behaviors.
The methods for taking perspective may overlap with those for empathizing, since both emerge from dialogue. Some methods include:
Dyadic Communication
A dyad is a group of two people. Dyadic communication occurs when two people, possibly from opposite sides of a conflict, have a direct interaction. Dyadic communication is not unique to perspective taking, but it is one kind of interaction that allows the participants to take perspective.
Outparty Twitter Experience
One unique method encourages perspective taking by allowing users to experience Twitter as if they held the opposite political beliefs. This method can be extended to other social media platforms with personalized feeds.
Outcomes
Most interventions aim to change some aspect of human knowledge or behavior. Across different disciplines, we identify the following outcomes associated with research studies and practitioner handbooks:
Action
Arguably, the goal of all trust-building interventions is to lead to some kind of (positive) action. For example, an intervention related to building trust in vaccines ultimately hopes to lead to an increase in vaccination. Action might also refer to reduced sharing of inaccurate content, deletion of previously shared content, or efforts to check the accuracy of content. While attitude may be assessed by self-declaration, action has to be assessed by observing changes in real-world or online interactions.
Attitude
Attitude is the umbrella term to describe changes in perception and/or belief about a specific issue. This could be a change in belief about a political candidate or message, as well as a change in attitude towards vaccination. In conflict resolution, it could also imply a change in feelings towards a specific community.
Judgment
Judgment can mean different things in different disciplines. In our catalog, judgment refers to a general ability to distinguish accurate from inaccurate information regardless of the topic. A change in the ability to discern accurate or inaccurate information about a specific topic, such as smoking or climate change, is more likely captured under Attitude or Memory.
Media Knowledge
This outcome pertains to a general knowledge of advertising as well as a knowledge of specific media construction techniques used to persuade audiences. This outcome describes an increased understanding of the media production process and should not be confused with knowledge about specific issues such as smoking or vaccination (see Attitude).
Memory
This outcome captures how specific facts or stories are remembered. This is the primary outcome for continued influence effect evaluations that aim to understand how, if at all, corrections update a person’s knowledge about a specific topic. All recall- and recognition-related outcomes would be captured under the memory outcome tag.
Status Quo
In conflict resolution or transformation, the goal of an intervention can sometimes be to maintain the status quo, or to maintain the existing relationships between different groups even if they can’t be improved. We highlight this as a separate outcome because interventions don’t necessarily seek to improve a situation. This outcome may be revised in future iterations of the catalog.
Understanding of Others
This outcome emerges from the conflict resolution literature, where the goal of many programs is to improve interpersonal and intergroup relationships. When an intervention aims to change attitudes toward a specific group, this outcome overlaps with attitude. In such a situation, papers would be tagged with both the Attitude and Understanding of Others outcome tags.
References
Amazeen, Michelle A., and Erik P. Bucy. 2019. “Conferring Resistance to Digital Disinformation: The Inoculating Influence of Procedural News Knowledge.” Journal of Broadcasting & Electronic Media 63 (3): 415–32. https://doi.org/10.1080/08838151.2019.1653101.
 Andı, Simge, and Jesper Akesson. 2021. “Nudging Away False News: Evidence from a Social Norms Experiment.” Digital Journalism 9 (1): 106–25. https://doi.org/10.1080/21670811.2020.1847674.
 Banerjee, Smita C., and Kathryn Greene. 2006. “Analysis Versus Production: Adolescent Cognitive and Attitudinal Responses to Antismoking Interventions.” Journal of Communication 56 (4): 773–94. https://doi.org/10.1111/j.1460-2466.2006.00319.x.
 Bechler, Christopher J., and Zakary L. Tormala. 2021. “Misdirecting Persuasive Efforts during the COVID-19 Pandemic: The Targets People Choose May Not Be the Most Likely to Change.” Journal of the Association for Consumer Research 6 (1): 187–95. https://doi.org/10.1086/711732.
 Berkowitz, Leonard. 1972. “Social Norms, Feelings, and Other Factors Affecting Helping and Altruism.” In Advances in Experimental Social Psychology, 6:63–108. Academic Press. https://doi.org/10.1016/S0065-2601(08)60025-8.
 Bickford, Susan. 2018. The Dissonance of Democracy: Listening, Conflict, and Citizenship. The Dissonance of Democracy. Cornell University Press. https://doi.org/10.7591/9781501722202.
 Bloom, Paul. 2017. “Empathy and Its Discontents.” Trends in Cognitive Sciences 21 (1): 24–31. https://doi.org/10.1016/j.tics.2016.11.004.
 Bojer, Marianne (“Mille”), Marianne Knuth, and Colleen Magner. 2006. “Mapping Dialogue: A Research Project Profiling Dialogue Tools and Processes for Social Change (Version 2.0).” Johannesburg, South Africa: Pioneers of Change Associates. http://www.mspguide.org/sites/default/files/resource/mapping_dialogue_-_a_research_project_profiling_dialogue_tools_and_processes.pdf.
 Cai, Deborah A., and Colleen Tolan. 2020. “Public Shaming and Attacks on Social Media: The Case of White Evangelical Christians.” Negotiation and Conflict Management Research 13 (3): 231–43. https://doi.org/10.1111/ncmr.12188.
 Compton, Joshua A., and Michael Pfau. 2005. “Inoculation Theory of Resistance to Influence at Maturity: Recent Progress In Theory Development and Application and Suggestions for Future Research.” Annals of the International Communication Association 29 (1): 97–146. https://doi.org/10.1080/23808985.2005.11679045.
 Dobson, Andrew. 2014. Listening for Democracy: Recognition, Representation, Reconciliation. Oxford University Press.
 Ecker, Ullrich K. H., Stephan Lewandowsky, John Cook, Philipp Schmid, Lisa K. Fazio, Nadia Brashier, Panayiota Kendeou, Emily K. Vraga, and Michelle A. Amazeen. 2022. “The Psychological Drivers of Misinformation Belief and Its Resistance to Correction.” Nature Reviews Psychology 1 (1): 13–29. https://doi.org/10.1038/s44159-021-00006-y.
 Eveland, William P., Kathryn D. Coduto, Osei Appiah, and Olivia M. Bullock. 2020. “Listening During Political Conversations: Traits and Situations.” Political Communication 37 (5): 656–77. https://doi.org/10.1080/10584609.2020.1736701.
 Gagneur, Arnaud. 2020. “Motivational Interviewing: A Powerful Tool to Address Vaccine Hesitancy.” Canada Communicable Disease Report 46 (4): 93–97. https://doi.org/10.14745/ccdr.v46i04a06.
 Gordon, Andrew, Ullrich K. H. Ecker, and Stephan Lewandowsky. 2019. “Polarity and Attitude Effects in the Continued-Influence Paradigm.” Journal of Memory and Language 108 (October): 104028. https://doi.org/10.1016/j.jml.2019.104028.
 Haigh, Carol, and Pip Hardy. 2011. “Tell Me a Story — a Conceptual Exploration of Storytelling in Healthcare Education.” Nurse Education Today 31 (4): 408–11. https://doi.org/10.1016/j.nedt.2010.08.001.
 Hangartner, Dominik, Gloria Gennaro, Sary Alasiri, Nicholas Bahrich, Alexandra Bornhoft, Joseph Boucher, Buket Buse Demirci, et al. 2021. “Empathy-Based Counterspeech Can Reduce Racist Hate Speech in a Social Media Field Experiment.” Proceedings of the National Academy of Sciences 118 (50): e2116310118. https://doi.org/10.1073/pnas.2116310118.
 Jeong, Se-Hoon, Hyunyi Cho, and Yoori Hwang. 2012. “Media Literacy Interventions: A Meta-Analytic Review.” The Journal of Communication 62 (3): 454–72. https://doi.org/10.1111/j.1460-2466.2012.01643.x.
 Jones-Jang, S. Mo, Tara Mortensen, and Jingjing Liu. 2019. “Does Media Literacy Help Identification of Fake News? Information Literacy Helps, but Other Literacies Don’t.” American Behavioral Scientist, August. https://doi.org/10.1177/0002764219869406.
 Klimecki, Olga M. 2019. “The Role of Empathy and Compassion in Conflict Resolution.” Emotion Review 11 (4): 310–25. https://doi.org/10.1177/1754073919838609.
 Klimecki, Olga M., Matthieu Vétois, and David Sander. 2020. “The Impact of Empathy and Perspective-Taking Instructions on Proponents and Opponents of Immigration.” Humanities and Social Sciences Communications 7 (1): 91. https://doi.org/10.1057/s41599-020-00581-0.
 Lederach, John Paul. 2003. The Little Book of Conflict Transformation. The Little Books of Justice & Peacebuilding. Intercourse, PA: Good Books.
 Maddison, Sarah. 2017. “Can We Reconcile? Understanding the Multi-Level Challenges of Conflict Transformation.” International Political Science Review / Revue Internationale de Science Politique 38 (2): 155–68.
 McGuire, William J. 1964. “Some Contemporary Approaches.” In Advances in Experimental Social Psychology, edited by Leonard Berkowitz, 1:191–229. Academic Press. https://doi.org/10.1016/S0065-2601(08)60052-0.
 Munger, Kevin. 2021. “Don’t @ Me: Experimentally Reducing Partisan Incivility on Twitter.” Journal of Experimental Political Science 8 (2): 102–16. https://doi.org/10.1017/XPS.2020.14.
 Nyhan, Brendan, and Jason Reifler. 2014. “The Effect of Fact-Checking on Elites: A Field Experiment on U.S. State Legislators.” American Journal of Political Science 59 (3): 628–40. https://doi.org/10.1111/ajps.12162.
 ———. 2015. “Displacing Misinformation about Events: An Experimental Test of Causal Corrections.” Journal of Experimental Political Science 2 (1): 81–93. https://doi.org/10.1017/XPS.2014.22.
 Office of the U.S. Surgeon General. 2021. “A Community Toolkit for Addressing Health Misinformation.” Office of the U.S. Surgeon General. SurgeonGeneral.gov/HealthMisinformation.
 Pennycook, Gordon, Jonathon McPhetres, Yunhao Zhang, Jackson G. Lu, and David G. Rand. 2020. “Fighting COVID-19 Misinformation on Social Media: Experimental Evidence for a Scalable Accuracy-Nudge Intervention.” Psychological Science, June. https://doi.org/10.1177/0956797620939054.
 Peter, Christina, and Thomas Koch. 2016. “When Debunking Scientific Myths Fails (and When It Does Not): The Backfire Effect in the Context of Journalistic Coverage and Immediate Judgments as Prevention Strategy.” Science Communication 38 (1): 3–25. https://doi.org/10.1177/1075547015613523.
 Rogers, Carl R., and Richard Evans Farson. 2015. Active Listening. Martino Fine Books.
 Roozenbeek, Jon, and Sander van der Linden. 2019. “Fake News Game Confers Psychological Resistance against Online Misinformation.” Palgrave Communications 5 (1). https://doi.org/10.1057/s41599-019-0279-9.
 Saveski, Martin, Nabeel Gillani, Ann Yuan, Prashanth Vijayaraghavan, and Deb Roy. 2022. “Perspective-Taking to Reduce Affective Polarization on Social Media.” Proceedings of the International AAAI Conference on Web and Social Media 16 (May): 885–95.
 Seo, Hyunjin, Joseph Erba, Darcey Altschwager, and Mugur Geana. 2019. “Evidence-Based Digital Literacy Class for Older, Low-Income African-American Adults.” Journal of Applied Communication Research 47 (2): 130–52. https://doi.org/10.1080/00909882.2019.1587176.
 Van Der Linden, Sander, Anthony Leiserowitz, Seth Rosenthal, and Edward Maibach. 2017. “Inoculating the Public against Misinformation about Climate Change.” Global Challenges 1 (2): 1600008. https://doi.org/10.1002/gch2.201600008.
 Van Til, Jon. 2011. “The Structure of Sustained Dialogue and Public Deliberation.” In Resolving Community Conflicts and Problems : Public Deliberation and Sustained Dialogue, edited by Roger A. Lohmann and Jon Van Til, 15–32. New York: Columbia University Press. https://search.ebscohost.com/login.aspx?direct=true&AuthType=ip,shib&db=nlebk&AN=953970&site=eds-live&scope=site&custid=gsu1.
 Walter, Nathan, and Riva Tukachinsky. 2020. “A Meta-Analytic Examination of the Continued Influence of Misinformation in the Face of Correction: How Powerful Is It, Why Does It Happen, and How to Stop It?” Communication Research 47 (2): 155–77. https://doi.org/10.1177/0093650219854600.
 Wineburg, Sam, Joel Breakstone, Sarah McGrew, Mark Smith, and Teresa Ortega. 2021. “Lateral Reading on the Open Internet,” November. https://doi.org/10.2139/ssrn.3936112.
 Wintersieck, Amanda, Kim Fridkin, and Patrick Kenney. 2021. “The Message Matters: The Influence of Fact-Checking on Evaluations of Political Messages.” Journal of Political Marketing 20 (2): 93–120. https://doi.org/10.1080/15377857.2018.1457591.
 Wojcieszak, Magdalena, and Benjamin R. Warner. 2020. “Can Interparty Contact Reduce Affective Polarization? A Systematic Test of Different Forms of Intergroup Contact.” Political Communication 37 (6): 789–811. https://doi.org/10.1080/10584609.2020.1760406.