Introduction
In the interconnected landscape of the 21st century, the proliferation of misinformation and conspiracy theories has emerged as a formidable challenge with far-reaching consequences for societies worldwide (Biddlestone et al., 2022; Bierwiaczonek et al., 2022; Bierwiaczonek et al., 2024; Borges do Nascimento et al., 2022; Swire-Thompson & Lazer, 2020; Treen et al., 2020). The insidious nature of these phenomena lies in their ability to distort perceptions, fuel polarization, and erode the foundations of informed public discourse and democratic institutions. As these threats continue to evolve and adapt, it is imperative that we deepen our understanding of the psychological mechanisms that underlie susceptibility to misinformation and develop evidence-based strategies to cultivate resilience at both the individual and societal levels.
While recent research has shed light on the multifaceted factors that contribute to the spread and acceptance of misinformation and conspiracy theories (Hornsey et al., 2023; Kunst et al., 2024; Stasielowicz, 2022), there remains a pressing need to bridge the gap between identifying vulnerabilities and implementing effective interventions. The path forward is fraught with complexities and controversies, underscoring the importance of rigorous, multidisciplinary approaches that integrate insights from psychology, social sciences, and beyond.
This special issue, titled “From vulnerability to vigilance,” aims to advance the frontiers of our knowledge by bringing together cutting-edge empirical findings, theoretical and sometimes profoundly critical perspectives, and innovative strategies to foster resilience against the pervasive influence of misinformation and conspiracy theories. It features eleven articles, which fall into three categories of contributions, as outlined below.
Reconciling Conflicting Findings and Improving the Field
Several studies aimed to resolve conflicting findings. Lawson and Kakkar (2024) provide a meta-analysis exploring how different research approaches have led to conflicting findings about misinformation sharing, focusing particularly on the role of personality traits and political ideology. They find that conscientiousness moderates the relationship between ideology and fake news sharing, but that this effect varies with how ideology is measured and analyzed: conscientiousness consistently weakens the relationship between some (but not all) measures of ideology and fake news sharing. To help resolve these contradictions, they propose a framework that considers key factors such as ideology measures, analytical approaches, and stimulus selection. Notably, they argue that a variable’s relevance to understanding fake news sharing does not require it to interact with news veracity (a point this editorial will return to later). The authors offer recommendations for improving research methodology and present new perspectives on conceptualizing interventions in misinformation research.
Mang et al. (2024) reconcile research on source credibility in the misinformation literature through a comprehensive systematic review. The authors found substantial inconsistencies in how source credibility influences responses to misinformation. Their analysis revealed several factors contributing to these inconsistencies, including conceptual issues such as the conflation of different source credibility dimensions (expertise, trustworthiness, and bias); methodological variations in how source credibility was operationalized; differences in the relevance of stimulus materials; and varying outcome measures (cognitive, behavioral, and evaluative responses). The authors propose that the relevance of source credibility information relative to substantive content may determine its influence on judgments. Drawing on persuasion research frameworks, they provide recommendations for more systematic investigation of source credibility effects in misinformation contexts, including better conceptual distinction between credibility dimensions and careful consideration of information relevance and processing factors.
Roozenbeek et al. (2024) present a critical analysis of misinformation intervention research, highlighting six key challenges in translating laboratory efficacy to real-world effectiveness. The authors argue that while lab studies often show promising results for interventions aimed at countering misinformation, several issues limit their real-world impact: (1) an overreliance on lab research versus field studies, (2) testing effects that artificially boost intervention longevity, (3) modest effects that reach only small portions of relevant audiences, (4) overemphasis on item evaluation tasks as success metrics, (5) poor replicability in the Global South and lack of audience-tailored approaches, and (6) insufficient consideration of unintended consequences. The authors provide practical recommendations including investing in field research, exploring innovative ways to incorporate rehearsal, tailoring interventions for specific audiences, expanding outcome measures beyond item evaluation tasks, and co-designing interventions with local partners. The paper advocates for a broader framework for assessing intervention effectiveness that goes beyond traditional lab metrics and considers real-world impact, user engagement, and cultural context. The authors emphasize that while misinformation interventions can be valuable tools, their implementation requires careful consideration of these challenges to achieve meaningful results at scale.
In terms of methodological advances, Tay et al. (2024) argue for expanding the toolkit used in misinformation and conspiracy theory research beyond traditional laboratory experiments and observational studies. They present a counterfactual framework for causal inference and introduce four key methodological approaches that are currently underutilized in the field: instrumental-variable analysis, regression-discontinuity design, difference-in-differences, and synthetic control designs. The authors contend that while randomized experiments remain the “gold standard,” they are not always feasible due to ethical or practical constraints. The proposed alternative methods can help researchers draw causal inferences from natural experiments and observational data when randomized experiments are not possible. For example, instrumental-variable analysis can help isolate causal effects when there are unmeasured confounds, while regression-discontinuity designs can leverage natural thresholds or cutoff points to approximate randomized conditions. The paper emphasizes that these methods should complement rather than replace existing approaches, and that their validity depends on carefully considering underlying assumptions. The authors provide real-world examples of each method’s application in misinformation research and argue that adopting these approaches could lead to better integration across disciplines studying misinformation and conspiracy theories, ultimately resulting in more comprehensive theories and more effective interventions.
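To make one of these designs concrete, the brief sketch below illustrates a two-group, two-period difference-in-differences estimate on simulated data; the scenario, numbers, and variable names are hypothetical illustrations and are not drawn from Tay et al. (2024).

```python
# Hypothetical sketch of a two-group, two-period difference-in-differences (DiD)
# estimate on simulated data. Illustrative only; the scenario, numbers, and
# variable names are assumptions, not taken from Tay et al. (2024).
import numpy as np

rng = np.random.default_rng(42)
n = 500  # respondents per group and period

# Simulated mean belief in a false claim (0-100 scale).
# The "treated" region receives a hypothetical fact-checking policy between periods.
control_pre  = rng.normal(50, 10, n)
control_post = rng.normal(48, 10, n)   # small secular decline everywhere
treated_pre  = rng.normal(55, 10, n)   # groups may differ at baseline
treated_post = rng.normal(45, 10, n)   # decline plus an assumed policy effect

# DiD estimate: change in the treated group minus change in the control group.
# Under the parallel-trends assumption, this difference isolates the policy effect.
did = (treated_post.mean() - treated_pre.mean()) - (control_post.mean() - control_pre.mean())
print(f"Estimated policy effect on belief: {did:.1f} points")
```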
Proposing Models and Testing Intervention Approaches
Several papers introduce models and test novel interventions. Pretus et al. (2024) propose strategies for scaling crowdsourcing interventions to combat partisan misinformation by balancing three key factors: cognitive dissonance with previous beliefs, trust in fact-checking sources, and crowd size. The authors argue that while crowdsourcing can be effective in countering misinformation, its success depends on conveying dissonant feedback from a sufficiently large and trusted crowd. This is particularly important for reaching extreme partisans who share the most misinformation but also deeply mistrust traditional fact-checking sources. The paper presents two main implementation approaches: (1) the “Centralist” approach, which selects fact-checkers based on their central network position and multiple group memberships, and (2) the “Two Steps Away” approach, which displays fact-checks from users who are outside one’s immediate ideological bubble but not too distant. Both strategies aim to optimize the three key factors while ensuring transparency and preventing abuse of fact-checking capabilities. The authors emphasize that these approaches come with important considerations and limitations that require further research, including: how to label fact-checkers without undermining their effectiveness, what percentage of users across ideological communities would actually engage in fact-checking, and how to adapt these strategies for different cultural contexts and social media platforms. They suggest that implementing such systems could provide a fast first-line response to misinformation while promoting shared responsibility for content moderation.
Ziemer et al. (2024) conducted a preregistered experimental study examining susceptibility to pro-Russian disinformation about the war in Ukraine among Germans with and without a Russian migration background. They found that a Russian identity and exposure to Russian media were positively associated with heightened susceptibility to pro-Russian disinformation. However, they also demonstrated that inoculation, a psychological intervention that warns people about persuasive attempts and preemptively refutes arguments, improved participants’ ability to recognize disinformation, led them to perceive it as less credible, heightened perceptions of Russia’s responsibility for the war, and strengthened solidarity with Ukraine. Importantly, the inoculation effects on reducing disinformation susceptibility were not significantly impaired by Russian identity.
Traberg et al. (2024) investigated the efficacy of an emotion-fallacy inoculation intervention in reducing susceptibility to emotionally misleading news and explored the impact of persuasive social cues on its effectiveness. The results showed that the inoculation significantly reduced the perceived reliability of misinformation, enhanced participants’ confidence in their reliability judgments, and improved veracity discernment. However, the study also found that social cues increased the perceived reliability and accuracy of misinformation, even among inoculated individuals. The impact of inoculation remained consistent though, suggesting that while social cues enhance the persuasiveness of misinformation, they do not diminish the effectiveness of the inoculation intervention. Additionally, participants acknowledged the influence of social cues more on others than on themselves, indicating a third-person consensus effect. The findings highlight the persistent influence of social cues, even in the presence of inoculation, emphasizing the need for nuanced interventions to address the complex interplay between emotions, misinformation, and social influence in the digital age.
Bowes and Fazio (2024) conducted a multi-level meta-analytic review synthesizing the growing body of research on the relations between intellectual humility and misinformation receptivity. They found that intellectual humility was related to less misinformation receptivity for beliefs, greater intentions to move away from misinformation, and greater engagement in evidence-based behaviors. Effect sizes were generally small and heterogeneous, with moderator analyses revealing that effects were stronger for comprehensive measures of intellectual humility assessing metacognitive, relational, and emotional features compared to narrow measures focusing only on metacognitive features. Relations also varied across misinformation types, with effects being strongest for pseudoscience beliefs, especially anti-vaccination attitudes and COVID-19 beliefs. The authors highlight that while the correlational effects are small to moderate, experimental research should examine whether intellectual humility interventions can causally reduce misinformation receptivity, especially among those most at risk for believing and acting on misinformation.
A Focus on the Role of Information Processing and Response Biases
A topic receiving increasing attention is whether the effects of interventions and of protective individual differences are due to biases in information processing or even to response biases that may reduce belief in both false and true information. Several papers in this special issue address this topic. Robson et al. (2024) compared how believers (i.e., “Fringe”) and nonbelievers (i.e., “Mainstream”) of implausible claims reason about evidence to test two hypotheses: the Miserly Hypothesis, which posits that Fringe believers are cognitive misers who avoid mental effort when evaluating information, and the Information Preference Hypothesis, which suggests Fringe believers have an alternative epistemic framework and differ in what evidence they consider credible. Across two studies, participants read a fictitious high- or low-quality expert report, rated the expert’s persuasiveness, and provided open-ended responses justifying their ratings. The authors analyzed the quantity and quality of these responses. There was mixed evidence for the Miserly Hypothesis: Fringe believers’ responses were shorter than Mainstream believers’ in Study 1 but not in Study 2 or the combined data. However, Fringe believers consistently provided fewer justifications referencing normative indicators of evidence quality than Mainstream believers, supporting the Information Preference Hypothesis. There was also weak evidence that Fringe believers provided more self-generated justifications not based on the expert report. The results suggest that Fringe believers rely less on conventional expertise-relevant cues when reasoning about evidence, potentially stemming from different assumptions about what information is important. The authors propose that contrasting beliefs on topics like climate change and vaccines may arise from downplaying high-quality information. Educating people about normative epistemic indicators and tailoring information to fit Fringe believers’ frameworks are highlighted as potential interventions.
Prike et al. (2024) examined the relationship between intellectual humility and the ability to discern between true and false news headlines, as well as metacognitive discernment (the ability to distinguish between one’s own correct and incorrect truthfulness judgments). Participants rated the truthfulness of news headlines and then decided whether to report or withhold each truthfulness judgment. They also completed three intellectual humility scales. The authors used signal detection theory (Banks, 1970; McNicol, 2004) to disentangle discernment and response bias for both the initial truthfulness judgments (misinformation discernment) and the decision to report/withhold those judgments (metacognitive discernment). Consistent with their hypotheses, intellectual humility was associated with greater misinformation discernment (ability to distinguish true from false headlines) but not response bias. Intellectual humility was also generally associated with greater metacognitive discernment, although this relationship was less robust. The findings suggest intellectual humility is associated with reduced misinformation susceptibility due to improved discernment rather than response bias. The positive relationship between self-reported intellectual humility and metacognitive discernment also supports the validity of the intellectual humility scales. The authors propose that interventions to increase intellectual humility may be a promising approach to counter misinformation.
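For readers less familiar with this framework, the minimal sketch below shows how signal detection theory separates discernment (d′) from response bias (criterion c) using hit and false-alarm rates; the rates are invented for illustration and are not taken from Prike et al. (2024).

```python
# Minimal sketch of how signal detection theory separates discernment (d')
# from response bias (criterion c). The rates below are invented for
# illustration and are not taken from Prike et al. (2024).
from scipy.stats import norm

# Treat "true headline judged true" as a hit and
# "false headline judged true" as a false alarm.
hit_rate = 0.75          # P("true" response | headline is true)
false_alarm_rate = 0.30  # P("true" response | headline is false)

z_hit = norm.ppf(hit_rate)
z_fa = norm.ppf(false_alarm_rate)

d_prime = z_hit - z_fa             # discernment: ability to tell true from false
criterion = -0.5 * (z_hit + z_fa)  # response bias: overall tendency to say "true"

print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")
# Higher d' reflects better discernment; c near 0 means no bias, positive c a
# conservative ("false") bias, and negative c a liberal ("true") bias.
```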
O’Mahony et al. (2024) compared the effectiveness of four conspiracy belief interventions (Priming, Inoculation, Active Inoculation, and Discernment) across two studies. They found that the inoculation-based interventions, but not Priming and Discernment conditions, reduced susceptibility to novel implausible conspiracy theories but did not improve critical appraisal of novel plausible conspiracy theories. Importantly, only the Discernment condition, which discouraged blind skepticism of conspiracy theories, significantly improved critical appraisal of both plausible and implausible conspiracy theories. The purely inoculation-based interventions were moderately successful at reducing epistemically unwarranted beliefs, but no intervention significantly reduced general conspiracy ideation or likelihood judgments for hypothetical conspiracy theories. In terms of response biases, the authors found that many well-established interventions designed to reduce belief in conspiracy theories may actually encourage a blind skepticism response bias, causing participants to dismiss conspiracy theories regardless of their plausibility. However, the Discernment condition improved discernment between plausible and implausible theories without promoting this blind skepticism response bias. Signal detection analyses on the outcome measures also suggested the Discernment intervention improved participants’ ability to distinguish plausible and implausible theories rather than simply encouraging an accept or reject response bias.
Importantly, the aforementioned article by Lawson and Kakkar (2024) offers a thought-provoking and nuanced perspective on the issue of discernment. The authors use simulations to show that even if an intervention variable like conscientiousness has the same effect on reducing sharing of both real and fake news, it can still substantially affect key outcomes such as the proportion of shared stories that are fake. Because false stories are shared at a much lower base rate than true stories, equivalent reductions in absolute terms translate into larger reductions in relative terms for false stories. In other words, the authors argue against the notion that, for an intervention to be relevant to misinformation, it must reduce belief in false news significantly more than it reduces belief in true news. They suggest a more holistic view is needed, using simulations to examine the net impact of interventions on ecosystem-level outcomes rather than only examining interaction coefficients. The mere presence of a response bias (i.e., an overall tendency to believe news less, regardless of veracity) may not negate an intervention’s importance.
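A toy calculation can make this base-rate argument concrete; the sketch below uses invented figures and does not reproduce Lawson and Kakkar’s (2024) simulations, but it shows how an identical absolute reduction in sharing shrinks the proportion of shared stories that are fake.

```python
# Toy illustration of the base-rate argument: an intervention that reduces
# sharing of true and false news by the same absolute amount can still shrink
# the proportion of shared stories that are fake. Numbers are invented and do
# not reproduce Lawson and Kakkar's (2024) simulations.

true_shared, false_shared = 1000.0, 100.0   # false news shared at a lower base rate
reduction = 50.0                            # identical absolute reduction for both

def fake_share(true_n: float, false_n: float) -> float:
    """Proportion of all shared stories that are fake."""
    return false_n / (true_n + false_n)

before = fake_share(true_shared, false_shared)
after = fake_share(true_shared - reduction, false_shared - reduction)

print(f"Fake share before: {before:.1%}")   # ~9.1%
print(f"Fake share after:  {after:.1%}")    # ~5.0%
# The same absolute cut is a 5% relative cut for true news but a 50% relative
# cut for false news, so the ecosystem-level fake share falls substantially.
```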
Conclusion
The eleven articles in this special issue collectively advance our understanding of misinformation susceptibility and resistance while highlighting critical directions for future research. The works demonstrate the importance of rigorous methodology, with several papers reconciling conflicting findings and proposing improved research frameworks and interventions. As the field moves forward, researchers must continue to balance methodological rigor with real-world applicability, while considering how interventions can be scaled effectively across different cultural contexts and platforms. This special issue provides a foundation for future work that bridges theoretical insights with practical solutions to combat the persistent challenge of misinformation in our interconnected world.
Conflicts of Interest
The author declares no competing interests.
Acknowledgements
The language in this editorial was improved using AI tools. The author takes full responsibility for its content.
References
Banks, W. P. (1970). Signal detection theory and human memory. Psychological Bulletin, 74(2), 81-99. https://doi.org/10.1037/h0029531
Biddlestone, M., Azevedo, F., & van der Linden, S. (2022). Climate of conspiracy: A meta-analysis of the consequences of belief in conspiracy theories about climate change. Current Opinion in Psychology, 46, 101390. https://doi.org/10.1016/j.copsyc.2022.101390
Bierwiaczonek, K., Gundersen, A. B., & Kunst, J. R. (2022). The role of conspiracy beliefs for COVID-19 health responses: A meta-analysis. Current Opinion in Psychology, 46, 101346. https://doi.org/10.1016/j.copsyc.2022.101346
Bierwiaczonek, K., van Prooijen, J.-W., van der Linden, S., & Rottweiler, B. (2024). Conspiracy theories and violent extremism. In M. Obaidi & J. R. Kunst (Eds.), The Cambridge handbook of the psychology of violent extremism. Cambridge University Press.
Borges do Nascimento, I. J., Pizarro, A. B., Almeida, J. M., Azzopardi-Muscat, N., Gonçalves, M. A., Björklund, M., & Novillo-Ortiz, D. (2022). Infodemics and health misinformation: A systematic review of reviews. Bulletin of the World Health Organization, 100(9), 544-561. https://doi.org/10.2471/blt.21.287654
Bowes, S. M., & Fazio, L. K. (2024). Intellectual humility and misinformation receptivity: A meta-analytic review. advances.in/psychology, 2, e940422. https://doi.org/10.56296/aip00026
Hornsey, M. J., Bierwiaczonek, K., Sassenberg, K., & Douglas, K. M. (2023). Individual, intergroup and nation-level influences on belief in conspiracy theories. Nature Reviews Psychology, 2(2), 85-97. https://doi.org/10.1038/s44159-022-00133-0
Kunst, J. R., Gundersen, A. B., Krysińska, I., Piasecki, J., Wójtowicz, T., Rygula, R., van der Linden, S., & Morzy, M. (2024). Leveraging artificial intelligence to identify the psychological factors associated with conspiracy theory beliefs online. Nature Communications, 15(1), 7497. https://doi.org/10.1038/s41467-024-51740-9
Lawson, M. A., & Kakkar, H. (2024). Resolving conflicting findings in misinformation research: A methodological perspective. advances.in/psychology, 2, e235462. https://doi.org/10.56296/aip00031
Mang, V., Fennis, B. M., & Epstude, K. (2024). Source credibility effects in misinformation research: A review and primer. advances.in/psychology, 2, e443610. https://doi.org/10.56296/aip00028
McNicol, D. (2004). A primer of Signal Detection Theory (1st ed.). Psychology Press. https://doi.org/10.4324/9781410611949
O’Mahony, C., Murphy, G., & Linehan, C. (2024). True discernment or blind scepticism? Comparing the effectiveness of four conspiracy belief interventions. advances.in/psychology, 2, e215691. https://doi.org/10.56296/aip00030
Pretus, C., Gil-Buitrago, H., Cisma, I., Hendricks, R. C., & Lizarazo-Villarreal, D. (2024). Scaling crowdsourcing interventions to combat partisan misinformation. advances.in/psychology, 2, e85592. https://doi.org/10.56296/aip00018
Prike, T., Holloway, J., & Ecker, U. K. H. (2024). Intellectual humility is associated with greater misinformation discernment and metacognitive insight but not response bias. advances.in/psychology, 2, e020433. https://doi.org/10.56296/aip00025
Robson, S. G., Faasse, K., Gordon, E.-R., Jones, S. P., Drew, M., & Martire, K. A. (2024). Lazy or different? A quantitative content analysis of how believers and nonbelievers of misinformation reason. advances.in/psychology, 2, e003511. https://doi.org/10.56296/aip00027
Roozenbeek, J., Remshard, M., & Kyrychenko, Y. (2024). Beyond the headlines: On the efficacy and effectiveness of misinformation interventions. advances.in/psychology, 2, e24569. https://doi.org/10.56296/aip00019
Stasielowicz, L. (2022). Who believes in conspiracy theories? A meta-analysis on personality correlates. Journal of Research in Personality, 98, 104229. https://doi.org/10.1016/j.jrp.2022.104229
Swire-Thompson, B., & Lazer, D. (2020). Public health and online misinformation: Challenges and recommendations. Annual Review of Public Health, 41, 433-451. https://doi.org/10.1146/annurev-publhealth-040119-094127
Tay, L. Q., Hurlstone, M., Jiang, Y., Platow, M. J., Kurz, T., & Ecker, U. K. H. (2024). Causal inference in misinformation and conspiracy research. advances.in/psychology, 2, e69941. https://doi.org/10.56296/aip00023
Traberg, C., Morton, T., & van der Linden, S. (2024). Counteracting socially endorsed misinformation through an emotion-fallacy inoculation. advances.in/psychology, 2, e765332. https://doi.org/10.56296/aip00017
Treen, K. M. d. I., Williams, H. T. P., & O’Neill, S. J. (2020). Online misinformation about climate change. WIREs Climate Change, 11(5), e665. https://doi.org/10.1002/wcc.665
Ziemer, C.-T., Schmid, P., Betsch, C., & Rothmund, T. (2024). Identity is key, but Inoculation helps – how to empower Germans of Russian descent against pro-Kremlin disinformation. advances.in/psychology, 2, e628359. https://doi.org/10.56296/aip00015