Introduction
The Russian war against Ukraine is not only fought on the battlegrounds. It is accompanied by an information war in the mass media over the credibility of Ukraine and the culpability of Russia. As Ukraine’s resistance against the Russian invasion heavily depends on external military supplies, Russia has been targeting international solidarity with Ukraine by spreading disinformation in many countries worldwide, including Germany (Atlantic Council, 2023a; Yablokov, 2022). Russia’s history of employing strategic disinformation narratives through governmental and public media channels to influence political events at home and abroad can be traced back to the Soviet era (Rid, 2021; Yablokov, 2022). Especially in the years leading up to the full-scale invasion of Ukraine, Russia tried to undermine the autonomy of Ukraine on all fronts, with Facebook being a key platform of disinformation operations (Atlantic Council, 2023a; Hanlon, 2018). Even though the main communication channels of Russian disinformation, such as RT and Sputnik, were banned in the EU shortly after the full-scale invasion began, Russia still targets foreign audiences through social media and alternative media channels – also by exploiting regional disinformation vulnerabilities (Pierri et al., 2023). Societal and individual vulnerabilities to Russian disinformation can arise, for example, from family ties to Russia or from socialization in a country with historically strong anti-Western sentiment (Atlantic Council, 2023a). In line with the Identity-Based Model of Political Belief (van Bavel & Pereira, 2018), such features are central to the construction of one’s social identity and might therefore heighten individuals’ susceptibility to Russian disinformation narratives.
In this study, we analyse whether a Russian identity is associated with higher susceptibility to Russian disinformation (recognition and credibility perceptions) and with resulting attitudes regarding the Russian war in Ukraine (ascription of responsibility for the war and solidarity with Ukraine). Moreover, we test whether inoculation, an effective psychological intervention against disinformation (Lu et al., 2023), can empower individuals to be less susceptible to pro-Russian disinformation narratives and protect against attitudinal shifts. Finally, we explore whether and how the effect of inoculation is affected by Russian identity. With an exploratory analysis, we illuminate the role of exposure to Russian media in disinformation susceptibility and resulting attitudes. With this study, we contribute to a better understanding of risk and protective factors when individuals are confronted with identity-related political disinformation.
Pro-Russian Disinformation, Its Characteristics and Scope
We define pro-Russian disinformation as misleading, inaccurate or entirely false information that strategically tries to sway public opinion in favour of the Russian narrative, while its adversaries, Ukraine and its Western allies, are delegitimized and demonized in their intentions and actions (Atlantic Council, 2023b; Ziemer & Rothmund, 2024). In contrast to misinformation, disinformation is spread intentionally. When intent is not detectable, scholars usually use the term misinformation to describe a broad range of information disorders (Pennycook & Rand, 2021; Wardle & Derakhshan, 2017). In the context of hybrid warfare, however, it is reasonable to assume that the conflicting parties spread disinformation with intent. In the case of the Russian war against Ukraine, Russia aims to justify military action and tries to maintain support for the war within the Russian population. Moreover, Russia denies any responsibility for the war, demoralizes Ukrainian audiences, undermines the belief of Ukrainians and third parties in Ukraine’s power to resist, and tries to derail international humanitarian and military support (Atlantic Council, 2023a).
Different organizations and investigative journalists (e.g., EUvsDisinfo, Atlantic Council, OECD) have been monitoring pro-Russian disinformation narratives for many years. Three prominent pro-Russian disinformation narratives in the context of the Russian war against Ukraine are: (1) “It did not occur”, i.e. military operations and war crimes against Ukrainian citizens are denied and disguised as Ukrainian fabrications; (2) “It was the West”, i.e. the responsibility for the escalation of the conflict is attributed to Western countries and NATO; and (3) “They are doing the same thing”, i.e. military operations and war crimes are legitimized by pointing to war crimes committed by other nations in the past, e.g. the invasion of Iraq by the USA and its allies (EUvsDisinfo, 2023; OECD, 2022; ARTE, n.d.). In contrast to many other disinformation campaigns, the Russian state itself is the main spreader of disinformation, using not only governmental communication but also public media channels to consolidate its authority and narratives (Paul & Matthews, 2016; Ryzhova, 2024). This means that people consuming Kremlin-friendly Russian media are constantly confronted with disinformation about the war in Ukraine.
The goal of disinformation is not only to confuse recipients’ perception of which information is correct and which is incorrect, but ultimately to achieve a shift in attitudes and resulting behaviour such as public support or voting preferences. The globally operating Russian propaganda machine is highly elaborate in this pursuit. Pro-Russian troll farms especially target non-Western regions such as West Africa and Latin America with information that aims to create anti-Western sentiment and distrust of the West among their citizens. They post in local languages, tap into local agendas, and share anti-US narratives that present Russia and its role in the war in a favourable light (Atlantic Council, 2023a). Meanwhile, pro-Russian trolls hide any connection to the Kremlin (Yablokov, 2022). Western countries that are more directly involved in supporting Ukraine are targeted as well. For example, in Poland, which hosts a large number of Ukrainian refugees, Russian trolls spread disinformation narratives that depict Ukrainian refugees as ungrateful and criminal in order to lower hospitality and support within Polish society (Atlantic Council, 2023a). Germany has also frequently been subjected to Russian disinformation campaigns strategically designed to magnify internal conflicts. Prominent instances include the so-called “Lisa case”, a fabricated story in 2016 about a Russian-German[1] girl allegedly raped by migrants, which aimed to intensify racist sentiments (Meister, 2016). Another example is an orchestrated disinformation campaign on the social media platform X in early 2024, involving more than 50,000 fake accounts of Russian origin that tried to leverage negative sentiments towards the German government and its support for Ukraine (Vojta, 2024).
Although it is hard to estimate the exact impact of these orchestrated attacks, there is experimental evidence that disinformation generally has the potential to influence perceptions, attitudes and behaviour (Roozenbeek & van der Linden, 2024). Evidence about the real-world consequences of these campaigns is mixed. Russia’s effort to influence the 2016 US presidential election by spreading disinformation and fueling polarization on Twitter is now considered to have had no measurable effect on election outcomes (Eady et al., 2023). However, analysts believe that Russian disinformation has strongly influenced recent political developments in Mali (BBC, 2023).
Social Identity and pro-Russian Disinformation
In their Identity-based Model of Political Belief, van Bavel and Pereira (2018) highlight the importance of social identity for the perception and evaluation of factual information and evidence. According to social identity theory (Tajfel & Turner, 1986), people define themselves not only based on individual characteristics but also on their membership in various social groups, for example of a national, ethnic or partisan nature. Social identity can serve important needs and goals such as self-esteem and meaning (Tajfel & Turner, 1986). Therefore, aligning one’s own perception and judgement with the beliefs and attitudes that are prevalent in one’s social group can be more beneficial than being accurate.
A plethora of research demonstrates that strong partisans are ideologically skewed when interpreting factual evidence (Kahan et al., 2017; Washburn & Skitka, 2018) and are more likely to fall for politically congruent disinformation (Pereira et al., 2021). In a recent study using a sample from Ukraine, Erlich and Garner (2023) showed that individuals with partisan and ethnolinguistic ties to Russia are more likely to believe pro-Russian disinformation. Russian state propaganda has exploited this vulnerability for many years, e.g. by establishing Russian-language TV channels that spread pro-Kremlin narratives in Ukraine (Atlantic Council, 2023a).
Are Germans with a Russian Migration Background Especially Vulnerable?
Germany provides an interesting context for examining identity-based vulnerabilities and resilience against Russian disinformation. Between 1990 and 2020, approximately 600,000 individuals with a German heritage immigrated from Russia to Germany, many of Jewish descent (Bundeszentrale für politische Bildung, 2022). After Turkey and Poland, Russia is the third most common country of origin for Germans with a migration background (Destatis, 2023). Germans with a Russian migration background are a heterogeneous group. They are slightly better educated than Germans without a Russian migration background; however, they tend to work in jobs below their qualification level. Although right-wing populist parties such as the AfD court them heavily, polls indicate that Germans with and without a Russian migration background do not differ much in terms of voting preferences (Kiefer et al., 2021; Panagiotidis, 2017a). Germans with a Russian background consume both German and Russian media outlets; however, they tend to distrust German media outlets and feel misrepresented by them (Decker, 2020; Ryzhova, 2024; WDR, 2024). Especially for the first generation of migrants, having both a Russian and a German heritage posed an integration difficulty both in Russia (where Russians with a German heritage were perceived as German) and in Germany (where they were perceived as Russian) and was often described as a feeling of “double foreignness” (Panagiotidis, 2017b). Although research has shown that holding more than one identity can also be psychologically beneficial, dual identities can be challenging, especially when their identity goals seem incompatible (Simon et al., 2013). Almost thirty years after the majority of Russians with German heritage immigrated to Germany, the Russian war against Ukraine could again lead to feelings of “double foreignness”, since the majority of Germans support Ukraine and condemn Russia, while the majority of Russians do the opposite (Infratest Dimap, 2023; Volkov & Kolesnikov, 2023). The additional distrust in German mainstream media, paired with still substantial exposure to state-sponsored Russian media outlets, could contribute to a greater vulnerability to pro-Russian disinformation.
We test the following preregistered identity hypotheses[2]:
- H1: The ability to detect aspects of propaganda will be lower for Germans of Russian descent than for Germans without Russian descent.
- H2: The perceived credibility of disinformation will be higher for Germans of Russian descent than for Germans without Russian descent.
- H3: The belief in the responsibility of Russia for the war in Ukraine will be lower for Germans of Russian descent than for Germans without Russian descent.
- H4: Solidarity with Ukraine will be lower for Germans of Russian descent than for Germans without Russian descent.
Inoculation as an Effective Means to Counter pro-Russian Disinformation
Policymakers, think tanks and other initiatives such as EUvsDisinfo (https://euvsdisinfo.eu/) aim to effectively address pro-Russian disinformation narratives and empower citizens against their persuasive nature. One of the most established theories on resistance to disinformation is inoculation theory (Lewandowsky & van der Linden, 2021; McGuire, 1964). Inoculation builds mental resistance against the content or strategies of disinformation by combining two mechanisms. First, individuals are warned about upcoming persuasive attempts (threat induction). Second, individuals are educated about the arguments or strategies used in these attempts and how to counter them (pre-emptive refutation). Over the years, inoculation has been administered in a number of ways, ranging from text-based and gamified interventions (where the intervention is embedded in a fun online game, e.g. https://www.getbadnews.com/en) to video-based interventions (e.g. https://www.youtube.com/watch?v=p3OPIT1xR5c&ab_channel=InfoInterventions; Basol et al., 2020; Roozenbeek et al., 2022). It has proven to be a powerful tool against Russian astroturfing attacks in the German context (Zerback et al., 2021). Inoculation videos on disinformation about Ukrainian refugees implemented on YouTube in Poland, Czechia and Slovakia led to an increase in correctly identified disinformation strategies among the audience (Jigsaw, 2023). In this study, we test a text-based, strategic inoculation that first warns participants about pro-Russian disinformation and then unmasks three prominent disinformation narratives by explaining which misleading strategies are applied and why they are misleading. Unlike many other text-based approaches, ours uses a graphically appealing inoculation sheet to ease accessibility.
We test the following preregistered inoculation hypotheses:
- H5: The ability to detect aspects of propaganda will be higher for inoculated individuals compared to the control group (no-inoculation).
- H6: The perceived credibility of disinformation will be lower for inoculated individuals compared to the control group (no-inoculation).
- H7: The belief in the responsibility of Russia for the war in Ukraine will be higher for inoculated individuals compared to the control group (no-inoculation).
- H8: Solidarity with Ukraine will be higher for inoculated individuals compared to the control group (no-inoculation).
Inoculation does not seem to work equally effectively across different audiences. While experiments with Western samples have mostly proven effective, replications with samples from non-Western countries such as India have been less successful (Harjani et al., 2023). If Germans of Russian descent identify with Russia, an inoculation intervention against pro-Russian disinformation could be expected to be less effective for this group. However, there is also evidence that interventions against disinformation are equally (Traberg et al., 2022) or even more effective among vulnerable groups (Schmid & Betsch, 2019). Given the mixed pattern across studies, we pose no directional hypothesis but explore whether inoculation works equally well for participants with and without a Russian migration background using the following research question:
- RQ1: Is the effectiveness of inoculation a function of individuals’ identity?
The Present Research
In this preregistered experimental study (https://aspredicted.org/FLT_HKL), we investigate whether having a Russian migration background (social identity) is associated with higher susceptibility to pro-Russian disinformation. Specifically, we examine (1) participants’ ability to recognize pro-Russian disinformation as such, (2) how credible they perceive it to be, (3) the extent to which they attribute responsibility for the outbreak of the war to Russia, and (4) their solidarity with Ukraine. Furthermore, we seek to determine whether inoculation can minimize susceptibility to pro-Russian disinformation across the aforementioned dependent measures (1-4). Finally, we explore whether the inoculation effect is less pronounced in participants with a Russian migration background compared to those without. In our exploratory analyses, we investigate (a) the influence of high exposure to Russian media on our four dependent variables and (b) whether high exposure moderates the relationship between inoculation and these variables.
Method
Ethics Information
Ethics approval for the study was obtained from the institutional review board at Friedrich Schiller University Jena, Germany.
Sample
We recruited two samples, one quota-based (age × sex, education, voting behaviour) German sample with no Russian migration background (n = 294) and one non-quota-based German sample with a Russian migration background (n = 303).[3] Both samples were recruited via a panel company (bilendi.com). In total, 2,728 participants gave their consent to take part in the survey. Of these, 2,068 individuals were excluded due to full quotas. Five participants were excluded because they were minors, and another 58 participants failed an attention check. The final sample of N = 597 participants completed the survey (Mage = 42.9 years, SDage = 16.1, 61% female) with a median time of 12.4 minutes. The survey language was German. Participants who indicated either no migration background or a migration background other than Russian were considered part of the non-Russian German sample. Participants who indicated a Russian migration background in the first, second or third generation were considered part of the Germans of Russian descent sample. Of the latter, 51.2% indicated being first-generation migrants, 33.7% second-generation and 15.2% third-generation migrants from Russia (see Table 1 for a demographic comparison between non-Russian Germans and Germans of Russian descent).
Table 1
Demographic Comparison Between Non-Russian Germans and Germans of Russian Descent

| | Non-Russian Germans | Germans of Russian descent |
| Age* | M = 49.68 years, SD = 17.03 | M = 36.41 years, SD = 12.07 |
| Sex* | 51.7% female | 70% female |
| Education* | 28.2% low, 32.3% intermediate, 39.5% high | 10.2% low, 25.7% intermediate, 64% high |
| Political orientation | M = 5.89, SD = 2.02 | M = 5.98, SD = 1.84 |
| ID with Europe | M = 4.32, SD = 2.08 | M = 4.35, SD = 2.05 |
| ID with Ukraine | M = 2.51, SD = 1.73 | M = 2.24, SD = 1.59 |
| ID with Russia* | M = 1.93, SD = 1.68 | M = 3.54, SD = 2.04 |

Note. * Significant difference between the subsamples.
Design, Intervention and Disinformation Postings
The study used a 2 (sample: non-Russian Germans vs. Germans of Russian descent) × 2 (intervention: inoculation vs. control) experimental design. Participants from both samples were randomly assigned to either the inoculation condition or the control condition. In both conditions, participants were presented with an informational sheet. The sheet either provided information about three Russian disinformation narratives (inoculation condition) or about healthy living (control condition). The sheets were designed specifically for this study and were presented to the participants for at least 60 seconds. In line with inoculation theory, the inoculation sheet first warned about Russian disinformation and then informed about three prominent pro-Russian disinformation narratives used in the context of the full-scale invasion of Ukraine (“It is not happening.” / “We only defend ourselves.” / “Others are doing the same.”) and why they are misleading. We collected and curated the most prominent pro-Russian narratives based on existing prebunking work by EUvsDisinfo (EUvsDisinfo, 2023), the OECD (OECD, 2022) and the media company ARTE (ARTE, n.d.), and in consultation with experts from disinformation monitoring companies. Participants in the control group were instead presented with a text about three strategies for healthy living. This topic was chosen because it is completely unrelated to the war against Ukraine. The control sheet is comparable in length, structure and design to the inoculation sheet. The inoculation sheet is depicted in Figure 1; the control sheet can be found in the supplementary materials.
Figure 1
Inoculation Sheet
After the intervention, participants were confronted with pro-Russian disinformation and true information regarding the Russian war in Ukraine. We designed ten fictitious Facebook postings, of which seven were true. The remaining three posts each contained one of the disinformation narratives introduced in the inoculation sheet. For the true postings, we tried to balance the ideological framing so that participants with a pro-Ukraine attitude would not be more likely to identify the correct posts. Participants viewed all ten postings in randomised order and answered the disinformation recognition and credibility perception items for each posting. Figure 2 shows one disinformation posting and one posting containing true information. All postings can be found in the supplementary materials.
Figure 2
Example Posting of Pro-Russian Disinformation (Left) and True Information (Right)
Measures
The order of scales in the survey was fixed, whereas the order of most items within scales was randomized. Descriptives of all scales included in the survey, as well as intercorrelations per subsample are depicted in Table 2.
Dependent Variables
Disinformation Recognition
Participants were asked to assess the presence or absence of disinformation narratives in ten fictitious Facebook posts. For each post, participants chose one of four response options (denial of events, denial of aggressor role, blame-shifting, no disinformation narrative), which reflected the disinformation strategies in the inoculation sheet. Correct responses were coded as 1 and incorrect responses as 0. A sum score across the three disinformation postings served as an indicator of pro-Russian disinformation recognition.[4]
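For illustration, this scoring logic can be sketched in a few lines of R. This is a minimal sketch, not the shared analysis script; the data frame `d`, the column names and the answer key are hypothetical placeholders.

```r
# Hypothetical answer key mapping each disinformation post to the narrative
# it contains (column names and labels are illustrative, not from the study)
key <- c(disinfo_1 = "denial_of_events",
         disinfo_2 = "denial_of_aggressor_role",
         disinfo_3 = "blame_shifting")

# Recode responses: 1 if the chosen category matches the key, 0 otherwise
correct <- sapply(names(key), function(item) as.integer(d[[item]] == key[item]))

# Sum score over the three disinformation postings (possible range: 0-3)
d$disinfo_recognition <- rowSums(correct)
```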
Disinformation Credibility
Participants were asked to indicate the credibility of each of the ten Facebook posts on a verbally anchored 7-point scale (1 = not credible at all to 7 = very credible). The mean of the ratings for the three disinformation posts served as the credibility measure (Cronbach’s α = .71).
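A corresponding sketch of the credibility scoring, again with hypothetical column names; the reliability estimate relies on the psych package:

```r
library(psych)

# Ratings of the three disinformation posts (illustrative column names)
cred_items <- d[, c("cred_disinfo_1", "cred_disinfo_2", "cred_disinfo_3")]

# Internal consistency of the three ratings (reported: Cronbach's alpha = .71)
psych::alpha(cred_items)

# The mean rating serves as the disinformation credibility measure
d$disinfo_credibility <- rowMeans(cred_items, na.rm = TRUE)
```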
Russia’s Responsibility for the War
The belief in the responsibility of Russia for the war was measured using three self-constructed items (e.g. “Russia is responsible for the war in Ukraine.”) rated on a verbally anchored 6-point scale (1 = strongly disagree to 6 = strongly agree, Cronbach’s α = .84).
Solidarity with Ukraine
Solidarity with Ukraine was measured with five items (e.g., “Germany should continue to supply heavy weapons to Ukraine.”) rated on a verbally anchored 6-point scale (1 = strongly disagree to 6 = strongly agree). The scale was adapted from Bojarskich et al. (under review) with Cronbach’s α = .88.
Other Variables
Media Exposure
We assessed exposure to German and Russian media with two four-item scales adapted from Trebbe (2009). On a 9-point scale ranging from 1 = almost never to 9 = every day, participants indicated their media use for television, radio, newspapers and social media. One scale asked about German-language outlets (e.g. “On how many days per week do you watch German television?”, Cronbach’s α = .59), the other about Russian-language outlets (e.g. “On how many days per week do you watch Russian television?”, Cronbach’s α = .79).
Recognition of True Information
Recognition of true information was based on the same items as disinformation recognition; here, however, a sum score for the seven true postings served as the indicator.
Credibility of True Information
Participants were asked to indicate the credibility of each of the ten Facebook posts on a verbally anchored 7-point scale (1 = not credible at all to 7 = very credible). The mean of the ratings of the seven true items not containing any disinformation served as the credibility measure (Cronbach’s α = .75).
Identification
Identification with Russia, Ukraine and Europe was assessed via a pictorial measure of self–group overlap (Schubert & Otten, 2002), asking participants to choose one of seven pictures showing two increasingly proximate and overlapping circles representing the “self” and “Russia”, “Ukraine” or “Europe” (1 = largest distance and no overlap between the two circles, 7 = completely overlapping circles).
Attention Check
The attention check was integrated into the Russian media exposure scale. Participants were prompted to choose the option “almost never” and were excluded from finishing the survey if they clicked on another response or chose none.
Demographics
We assessed age, sex, education status, political orientation (1 = very left-leaning, 11 = very right-leaning), migration background (none, Russian, Ukrainian, other) and migration generation (first, second, third), each with a single item.
Procedure
After assessing demographic information, participants from both samples were randomly allocated to the experimental or control condition and viewed the informational sheet (intervention: inoculation or control). Then, participants were presented with fictitious Facebook posts containing true and false information about Ukraine, Russia and the war. We asked participants to indicate whether each post contained one of the taught disinformation narratives (disinformation recognition) and if they believed the post to be accurate or not (disinformation credibility). After evaluating the ten posts, we asked participants about their perception of Russia being responsible for starting the war and their feelings of solidarity with Ukraine. As control variables, we then assessed exposure to Russian and German media as well as identification with Europe, Russia and Ukraine. In the debriefing, participants in the control condition were presented with the inoculation sheet.
Preregistered Analysis Plan and Power Analysis
To test our eight preregistered hypotheses, we performed four 2 × 2 between-subjects ANOVAs, one for each outcome variable (disinformation recognition, disinformation credibility, Russia’s responsibility for the war and solidarity with Ukraine), with sample condition (non-Russian vs. Russian migration background) as one between-factor and intervention condition (inoculation vs. control) as the other. As a prerequisite for the ANOVAs, we tested for normality and homogeneity of variance. The variance ratio criterion indicated no evidence of a violation of the homogeneity of variances (all ratios below two). The distributions indicated that the residuals of the ANOVAs do not follow a normal distribution. Because F-tests have proven robust to violations of normality (Blanca et al., 2017), we continued with our planned analysis. The sample size was determined with an a priori simulation-based power analysis (Lakens & Caldwell, 2021) for a 2 × 2 between-subjects ANOVA, targeting the hypothesized main effects (d = .39). The chosen effect size aligns with the lower end of the 95% confidence interval for the average effectiveness of inoculation interventions reported in a meta-analysis (Banas & Rains, 2010). For an anticipated power of approximately .95, a total sample size of N = 600 (n = 150 per cell) was determined.
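The following base-R sketch illustrates the logic of such a simulation-based power analysis under simplified assumptions (a single standardized main effect of d = .39, equal cell sizes, normal errors); the preregistered analysis used the approach of Lakens and Caldwell (2021) and may have differed in its exact settings.

```r
set.seed(42)
n_cell <- 150   # participants per cell of the 2 x 2 design
d_eff  <- 0.39  # assumed standardized main effect of the intervention
nsim   <- 2000  # number of simulated experiments

pvals <- replicate(nsim, {
  dat <- expand.grid(id           = seq_len(n_cell),
                     sample       = c("non_russian", "russian_descent"),
                     intervention = c("control", "inoculation"))
  # Outcome: standard normal noise plus the intervention main effect
  dat$y <- rnorm(nrow(dat)) + ifelse(dat$intervention == "inoculation", d_eff, 0)
  fit <- aov(y ~ sample * intervention, data = dat)
  summary(fit)[[1]][["Pr(>F)"]][2]  # p-value of the intervention main effect
})

mean(pvals < .05)  # proportion of significant runs = estimated power
```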
Table 2
Means, Standard Deviations and Intercorrelations of All Analyzed Scales for Both Subsamples
M | SD | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | |
1. C | 0.54 | 0.50 | .28** | .04 | -.15** | .07 | .07 | .01 | -.09 | -.11* | -.07 | -.08 | -.02 | .02 | .02 | |
0.55 | 0.50 | [.17, .38] | [-.07, .15] | [-.26, -.04] | [-.04, .18] | [-.05, .18] | [-.10, .13] | [-.20, .03] | [-.22, -.00] | [-.19, .04] | [-.19, .03] | [-.13, .10] | [-.09, .14] | [-.09, .13] | ||
2. DR | 1.60 | 1.11 | .30** | .08 | -.43** | .19** | .45** | .38** | -.19** | -.27** | -.09 | -.09 | .10 | -.07 | -.08 | |
1.38 | 1.11 | [.19, .40] | [-.03, .19] | [-.52, -.33] | [.08, .30] | [.35, .54] | [.27, .47] | [-.30, -.08] | [-.37, -.16] | [-.20, .02] | [-.20, .02] | [-.02, .21] | [-.18, .04] | [-.19, .03] | ||
3. RT | 4.06 | 2.13 | .02 | -.02 | -.00 | .36** | .09 | .03 | -.01 | -.06 | .06 | -.03 | .20** | .06 | .14* | |
4.38 | 2.11 | [-.10, .13] | [-.13, .10] | [-.12, .11] | [.26, .45] | [-.02, .20] | [-.08, .14] | [-.13, .10] | [-.18, .05] | [-.05, .17] | [-.15, .08] | [.09, .31] | [-.06, .17] | [.03, .25] | ||
4. DC | 3.59 | 1.62 | -.18** | -.27** | -.07 | .12* | -.54** | -.56** | .25** | .19** | .03 | .13* | -.13* | .14* | .00 | |
4.13 | 1.47 | [-.29, -.07] | [-.37, -.16] | [-.19, .04] | [.01, .23] | [-.62, -.46] | [-.63, -.48] | [.14, .35] | [.08, .30] | [-.08, .14] | [.01, .24] | [-.24, -.02] | [.03, .25] | [-.11, .12] | ||
5. CT | 4.53 | 1.23 | -.03 | .15** | .26** | .38** | .30** | .16** | -.13* | -.03 | .04 | -.07 | .03 | .16** | .16** | |
4.48 | 1.04 | [-.14, .09] | [.04, .26] | [.15, .36] | [.28, .48] | [.19, .40] | [.05, .27] | [-.24, -.02] | [-.14, .08] | [-.08, .15] | [-.18, .04] | [-.08, .14] | [.05, .27] | [.05, .27] | ||
6. RR | 4.58 | 1.36 | .17** | .33** | .10 | -.38** | .18** | .78** | -.40** | -.25** | -.01 | -.18** | .16** | -.06 | -.02 | |
3.44 | 1.55 | [.06, .28] | [.23, .43] | [-.01, .21] | [-.48, -.28] | [.07, .29] | [.73, .82] | [-.49, -.30] | [-.35, -.14] | [-.13, .10] | [-.29, -.07] | [.05, .27] | [-.17, .06] | [-.14, .09] | ||
7. SU | 3.89 | 1.45 | .15** | .29** | .02 | -.38** | .14* | .72** | -.28** | -.16** | -.02 | -.18** | .17** | -.06 | .02 | |
2.69 | 1.43 | [.04, .26] | [.18, .39] | [-.10, .13] | [-.47, -.27] | [.03, .25] | [.65, .77] | [-.39, -.17] | [-.27, -.05] | [-.13, .10] | [-.29, -.07] | [.06, .28] | [-.17, .05] | [-.10, .13] | ||
8. IR | 1.93 | 1.68 | -.12* | -.06 | -.02 | .23** | .07 | -.36** | -.37** | .39** | -.09 | .07 | -.05 | .08 | -.08 | |
3.54 | 2.04 | [-.23, -.00] | [-.18, .06] | [-.14, .10] | [.12, .34] | [-.04, .19] | [-.45, -.25] | [-.47, -.26] | [.29, .49] | [-.21, .02] | [-.05, .18] | [-.16, .06] | [-.04, .19] | [-.20, .03] | ||
9. RM | 1.26 | 0.99 | -.02 | -.11 | -.11 | .17** | .03 | -.12* | -.07 | .17** | .11 | .10 | -.03 | .05 | .06 | |
2.08 | 1.58 | [-.13, .09] | [-.22, .01] | [-.22, .00] | [.06, .28] | [-.08, .15] | [-.23, -.00] | [-.18, .05] | [.05, .28] | [-.00, .22] | [-.01, .21] | [-.14, .09] | [-.06, .17] | [-.05, .17] | ||
10. GM | 6.19 | 2.01 | .13* | .04 | -.06 | .01 | .05 | .15* | .15** | -.05 | -.02 | .08 | .04 | -.10 | .16** | |
5.27 | 2.18 | [.02, .24] | [-.07, .16] | [-.17, .06] | [-.10, .13] | [-.06, .16] | [.03, .26] | [.04, .26] | [-.17, .06] | [-.13, .10] | [-.04, .19] | [-.08, .15] | [-.22, .01] | [.05, .27] | ||
11. P | 5.89 | 2.02 | -.09 | -.18** | -.12* | .26** | .03 | -.16** | -.25** | .09 | .20** | -.01 | -.06 | .08 | .18** | |
5.98 | 1.84 | [-.20, .02] | [-.28, -.06] | [-.23, -.00] | [.15, .37] | [-.09, .14] | [-.27, -.05] | [-.35, -.14] | [-.03, .21] | [.09, .31] | [-.12, .11] | [-.17, .06] | [-.03, .19] | [.07, .29] | ||
12. E | 0.39 | 0.49 | .15* | .25** | .14* | -.09 | .12* | .05 | .07 | .01 | .14* | -.03 | -.07 | -.08 | -.02 | |
0.64 | 0.48 | [.03, .26] | [.13, .35] | [.03, .25] | [-.21, .02] | [.01, .23] | [-.06, .17] | [-.04, .18] | [-.11, .13] | [.03, .25] | [-.15, .08] | [-.18, .05] | [-.20, .03] | [-.13, .09] | ||
13. Sex | 0.48 | 0.50 | .02 | -.06 | -.01 | .07 | .08 | .05 | .10 | .01 | .02 | .06 | .01 | .06 | .16** | |
0.30 | 0.46 | [-.10, .13] | [-.17, .06] | [-.12, .10] | [-.04, .19] | [-.03, .20] | [-.06, .16] | [-.01, .22] | [-.11, .13] | [-.09, .14] | [-.06, .17] | [-.10, .12] | [-.05, .17] | [.05, .27] | ||
14. Age | 49.68 | 17.03 | .07 | .02 | .18** | -.05 | .08 | .16** | .15* | .05 | -.19** | .15** | -.02 | .02 | .03 | |
36.41 | 12.07 | [-.04, .19] | [-.10, .13] | [.06, .28] | [-.17, .06] | [-.04, .19] | [.05, .27] | [.03, .26] | [-.07, .16] | [-.30, -.08] | [.04, .26] | [-.14, .09] | [-.09, .14] | [-.08, .14] |

Note. For each variable, the first row shows values for non-Russian Germans and the second row for Germans of Russian descent. Values in brackets are 95% confidence intervals. C = intervention condition; DR = disinformation recognition; RT = recognition of true information; DC = disinformation credibility; CT = credibility of true information; RR = Russia’s responsibility for the war; SU = solidarity with Ukraine; IR = identification with Russia; RM = Russian media exposure; GM = German media exposure; P = political orientation; E = education.
Results
The results of the preregistered hypotheses are depicted in Figure 3.
Figure 3
Influence of Condition (Inoculation vs. Control) and Identity (Russian Migration Background vs. no Russian Migration Background) on a) Disinformation Recognition, b) Disinformation Credibility, c) Russia’s Responsibility for the War, and d) Solidarity with Ukraine.
Test of Social Identity Hypotheses
We found significant associations of the sample condition with all four outcome measures. Participants with a Russian migration background (M = 1.38, SD = 1.11) were less skilled in recognizing pro-Russian disinformation than participants without (M = 1.60, SD = 1.11), F(1, 593) = 6.33, p = .012, η² = .011. Participants with a Russian migration background (M = 4.13, SD = 1.47) deemed pro-Russian disinformation significantly more credible than participants without a Russian migration background (M = 3.59, SD = 1.62), F(1, 588) = 19.22, p < .001, η² = .032. Participants with a Russian migration background (M = 3.44, SD = 1.55) also showed lower perceptions of Russia being responsible for the war than participants without a Russian migration background (M = 4.58, SD = 1.36), F(1, 588) = 91.68, p < .001, η² = .135. Finally, participants with a Russian migration background (M = 2.69, SD = 1.43) indicated less solidarity with Ukraine than participants without a Russian migration background (M = 3.89, SD = 1.45), F(1, 587) = 103.36, p < .001, η² = .150.
Test of Inoculation Hypotheses
Inoculation (intervention condition) significantly affected all four outcome measures. Inoculated participants (M = 1.78, SD = 1.07) were better at recognizing pro-Russian disinformation than participants in the control condition (M = 1.14, SD = 1.06), confirming H5 (F(1, 593) = 53.22, p < .001, η² = .082, Cohen’s d = 0.60). Inoculated participants (M = 3.63, SD = 1.59) perceived pro-Russian disinformation as less credible than non-inoculated participants (M = 4.15, SD = 1.50), confirming H6 (F(1, 588) = 17.08, p < .001, η² = .028, d = 0.34). Inoculated participants (M = 4.15, SD = 1.58) had higher perceptions of Russia’s responsibility for the war than participants in the control group (M = 3.82, SD = 1.53), confirming H7 (F(1, 588) = 7.81, p = .005, η² = .013, d = 0.21). Finally, participants in the inoculation condition (M = 3.39, SD = 1.61) showed more solidarity with Ukraine than participants in the control condition (M = 3.16, SD = 1.49), confirming H8 (F(1, 587) = 4.00, p = .046, η² = .007, d = 0.15).
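For transparency, the effect-size arithmetic behind these values can be reproduced from the reported descriptives; for example, for H5 (assuming roughly equal group sizes, so the pooled variance is approximately the mean of the two group variances):

```r
# Cohen's d for H5 from the reported group means and standard deviations
m_inoc <- 1.78; sd_inoc <- 1.07  # inoculation condition
m_ctrl <- 1.14; sd_ctrl <- 1.06  # control condition

sd_pooled <- sqrt((sd_inoc^2 + sd_ctrl^2) / 2)
(m_inoc - m_ctrl) / sd_pooled    # approx. 0.60, matching the reported d
```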
Identity-as-Moderator Research Question
There was no significant interaction effect between the sample conditions and the intervention conditions on disinformation recognition (F(1, 593) = 0.05, p = .831, η² < .001), disinformation credibility (F(1, 588) = 0.27, p = .607, η² < .001), Russia’s responsibility for the war (F(1, 588) = 1.20, p = .275, η² = .002) or solidarity with Ukraine (F(1, 587) = 2.92, p = .088, η² = .005).
Robustness Checks
Recognition of True Information
To make sure that the intervention specifically targeted disinformation rather than making participants more sceptical overall, we ran an ANOVA with (a) intervention condition and (b) sample condition as independent variables and recognition of true information as the dependent variable. Recognition of true information was not significantly affected by the intervention (F(1, 593) = 0.56, p = .456, η² < .001).
Credibility of True Information
We repeated this robustness check for the credibility of true information and ran an ANOVA with (a) intervention condition and (b) sample condition as independent variables and credibility of true information as the dependent variable. The credibility of true information was not significantly affected by the intervention (F(1, 593) = 0.22, p = .638, η² < .001).
Replacing Migration Background with Identification with Russia
As an additional robustness check, we replaced the dichotomous migration background variable with a continuous measure of identification with Russia and investigated H1–H8 with four moderated regressions, with intervention condition and the z-standardized identification with Russia as predictors and disinformation recognition, disinformation credibility, Russia’s responsibility for the war and solidarity with Ukraine as dependent variables. Replacing migration background with identification with Russia changed the results only slightly: identification with Russia was not a significant predictor of disinformation recognition, but it did predict disinformation credibility, Russia’s responsibility for the war and solidarity with Ukraine. A table with the moderated regression effects can be found in the supplementary materials.
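A minimal sketch of one of these moderated regressions in R, reusing the hypothetical variables from the sketches above plus an assumed 0/1 inoculation indicator; the other three outcomes were modelled analogously:

```r
# z-standardize the continuous moderator (identification with Russia)
d$id_russia_z <- as.numeric(scale(d$id_russia))

# Intervention condition, moderator and their interaction as predictors
fit <- lm(disinfo_credibility ~ inoculation * id_russia_z, data = d)
summary(fit)  # coefficients incl. the inoculation x identification term
```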
Controlling for Demographics
Because our subsamples significantly diverged in terms of age, sex (male, female) and education (low, middle, high), we ran ANCOVAs to test our preregistered hypotheses while controlling for these variables. The pattern of results changed only slightly: all effects were replicated except H8. Controlling for age, sex and education, there was no significant effect of the intervention on solidarity with Ukraine (F(1, 574) = 2.19, p = .139, η² = .004). We also controlled for migration generation (1 = first generation, 0 = later generations); it did not significantly affect the dependent variables. The F-statistics for all other hypotheses can be found in the supplementary materials.
Further Exploratory Analyses: The Influence of Russian Media
We further examined Russian media exposure as an influencing factor and a potential moderator of the effects of the experimental manipulation on the dependent variables. To ensure that media exposure was not affected by the intervention conditions, we performed two t-tests for independent samples. We found no significant difference between the inoculation and control conditions for Russian media exposure, t(551) = 1.70, p = .091, or German media exposure, t(568) = 0.52, p = .605. We then conducted moderated regression analyses for each outcome variable with the intervention condition, mean-centered Russian media exposure and their interaction as predictors, using German media exposure as a control variable (see Table 3).
The higher the exposure to Russian media, the less likely participants were to spot Russian disinformation, the higher their disinformation credibility perceptions, the lower their perceptions of Russia’s responsibility for the war, and the lower their solidarity with Ukraine. We found no significant interaction effects between Russian media exposure and the intervention conditions on disinformation recognition, perceived credibility of pro-Russian disinformation, Russia’s responsibility for the war or solidarity with Ukraine. Controlling for German media consumption did not change these patterns. However, German media consumption itself was a significant predictor of perceptions of Russia’s responsibility and solidarity with Ukraine. A figure illustrating the moderation effect of Russian media exposure on the intervention for all DVs can be found in the supplementary materials.
Table 3
Moderated Regression Analysis
Disinformation recognition | Disinformation credibility | Russia’s responsibility | Solidarity with Ukraine | |||||||||||||
b [95% CI] | SE | β | p | b [95% CI] | SE | β | p | b [95% CI] | SE | β | p | b [95% CI] | SE | β | p | |
Inoculation | .59 [.42, .76] | .09 | .53 | <.001 | -.46 [-.70, -.21] | .13 | -.29 | <.001 | .24 [.00, .49] | .12 | .16 | .050 | .17 [-.08, .41] | .13 | .11 | .181 |
RU media exposure | -.14 [-.23, -.05] | .04 | -.17 | .002 | .15 [.03, .28] | .06 | .14 | .016 | -.31 [-.44, -.19] | .06 | -.28 | <.001 | -.25 [-.37, -.12] | .06 | -.22 | <.001 |
GER media exposure | -.01 [-.04, .03] | .02 | -.01 | .779 | -.01 [-.07, .05] | .03 | -.01 | .728 | .09 [ .04, .15] | .03 | .13 | .001 | .10 [.04, .16] | .03 | .13 | <.001 |
Inoculation × RU media | -.05 [-.18, .07] | .06 | -.07 | .390 | .17 [-.01, .35] | .09 | .15 | .065 | -.01 [-.19, .16] | .09 | -.01 | .878 | -.02 [-.20, .16] | .09 | -.02 | .849 |
F | F(4, 587) = 20.36, p < .001 | F(4, 582) = 11.71, p < .001 | F(4, 582) = 17.15, p < .001 | F(4, 581) = 11.59, p < .001 | ||||||||||||
Adj. R² | .12 | .07 | .10 | .07 |
Discussion
Identity is key, but inoculation can help. The first part of this conclusion reflects our finding that a Russian social identity is a potential risk factor for individuals’ susceptibility to Russian disinformation. Participants with a Russian migration background were less skilled in recognizing Russian disinformation, more likely to perceive Russian disinformation as credible, believed Russia to be less responsible for the war and showed less support for Ukraine compared to participants without a Russian migration background (H1–H4). These findings were robust even when we (a) replaced our sample condition with self-reported identification with Russia and (b) controlled for age, sex, education and migration generation. We interpret this as robust evidence that social identity can be a relevant predictor of susceptibility to group-based disinformation or political propaganda. This finding aligns well with Erlich and Garner (2023), who showed that ethnolinguistic ties to Russia heighten susceptibility to pro-Russian disinformation.
Additionally, our exploratory analyses showed significant associations of Russian media exposure with disinformation perceptions and political attitudes. The more participants are exposed to Russian media, the less skilled they are in recognizing pro-Russian disinformation, the more credible they perceive pro-Russian disinformation to be, the less responsible they believe Russia to be for the war, and the less solidarity they show with Ukraine. These associations remained significant when controlling for German media exposure as a proxy for the general amount of media exposure.
But inoculation does its job. Our study shows that inoculation is an effective tool against pro-Russian disinformation that works similarly for groups with different degrees of vulnerability. Inoculating participants against typical pro-Russian disinformation narratives about the war in Ukraine (a) improves their skills in detecting disinformation, (b) lowers credibility perceptions of pro-Russian disinformation, (c) heightens perceptions of Russia as being responsible for the war against Ukraine and (d) even increases reported feelings of solidarity with Ukraine. The intervention does not make participants significantly more sceptical of true information, neither in terms of falsely recognizing true information as disinformation nor in terms of credibility perceptions. Finally, we find no evidence that the effect of the inoculation is significantly impaired by participants’ identity. We interpret the lack of this interaction effect as evidence that inoculation can reduce disinformation susceptibility even when people have a motivational affinity to believe the information. This result underlines the finding by Traberg et al. (2022) that inoculation works equally well for vulnerable and non-vulnerable groups. By introducing a text-based, strategic inoculation approach with a visually appealing design, we extend the existing implementations of various inoculation procedures (e.g., Basol et al., 2020; Biddlestone et al., 2023).
Limitations
Our robustness checks showed that some of the inoculation hypotheses became statistically non-significant when controlling for demographics or when replacing the sample condition with identification with Russia. Using identification with Russia instead of the sample factor, the effect of the inoculation treatment became statistically non-significant for Russia’s responsibility for the war and solidarity with Ukraine. Using the sample condition as an experimental factor but controlling for age, sex and education, there was no significant intervention effect on solidarity with Ukraine. These findings might indicate that, while inoculation consistently affects recognition and credibility perceptions, its potential to change more deeply rooted attitudes is limited, at least for a short-term intervention.
Another limitation of our results is the non-representative sample of Germans with a Russian migration background, which was the only such sample our panel company could provide. For example, these participants are on average about 13 years younger than the non-Russian sample (see Table 1). While Germans of Russian descent identified significantly more strongly with Russia than Germans without Russian descent, we could not detect significant differences between the samples in identification with Ukraine and Europe. We therefore assume that the effects of identity might be even more pronounced for Germans of Russian descent who identify significantly less with Ukraine and Europe. In a future study, we would also like to assess more demographic information on the subsamples, such as religious background or socio-economic status, to better capture the heterogeneity of this group.
Also, not all of our dependent measures yielded the effect size assumed in our power analysis (d = .39). Given that our inoculation approach was novel and that the effect sizes of our inoculation treatment (Cohen’s d = 0.15–0.60) are in line with the effect sizes of inoculation treatments in the meta-analysis by Lu et al. (2023), we do not consider this a severe limitation. The effect sizes in our study can help make future power calculations for similar inoculation approaches more accurate.
Another limiting factor could be that our intervention was perceived as ideologically skewed by some participants, although this potential reactance does not show in the overall results. Six participants commented on this matter at the end of the survey (e.g. Participant One: “It is nice that you want to fight propaganda with this survey or show how to recognize it, but if you use propaganda yourself in the survey, you are no better than the other side”). As a more general limitation, the transferability of our results to real-world settings, where recipients are neither financially compensated for engaging with such material nor prevented from skipping the inoculation intervention before 60 seconds have elapsed, is at least questionable. Therefore, it is imperative to conduct future inoculation studies without paid samples and to examine other incentives that might motivate engagement. Finally, we measured only immediate, short-term inoculation effects. A replication of our study should also test the longevity of the effects (e.g., see Capewell et al., 2023; Maertens et al., 2021).
Implications
Inoculation has proven to be a powerful tool against disinformation in many contexts. With this study, we show that it also helps protect audiences with different identity ties to Russia from pro-Russian disinformation about the war in Ukraine. Thus, we recommend that stakeholders in public communication, education specialists, journalists and governments invest more effort and resources in inoculation against pro-Russian disinformation – not only for the German non-Russian public but also for Germans with a Russian migration background. We also believe that a strength of our approach lies in its scalability. By combining a text-based, strategic inoculation with an appealing design and layout, we might heighten the motivation of potential recipients to engage with this material without compensatory payment. Such inoculation material could easily be published online and offline, e.g. via billboard campaigns. Our findings also show that having a Russian migration background is associated with higher susceptibility to pro-Russian disinformation. Future research could aim to design tailored interventions for these vulnerable individuals by, inter alia, acknowledging migrants’ challenges of holding a dual identity in this context or by supporting community leaders with a Russian background in Germany who offer a nuanced, non-Kremlin perspective on the war. Our findings further show that Russian media exposure is also associated with higher susceptibility to Russian disinformation. Thus, further identifying popular and available sources of Russian disinformation in Germany and restricting their outreach could be a complementary measure to inoculation. However, it might also be important to fill the gap in Russian-language quality journalism and leverage Russian-speaking media outlets that do not spread pro-Kremlin disinformation, such as Nowaja Gazeta Europe (https://novayagazeta.eu/). Through targeted inoculation strategies and community-based interventions, we can pave the way for a more informed and resilient society.
Reproducibility Statement
The data, the R analysis script and the codebook can be found in the supplementary materials (https://osf.io/e7sqb/?view_only=07de35a7cdec4272ac9b9b99cff4988f).
Conflicts of Interest
The authors declare no competing interests.
Acknowledgements
We express our gratitude to Viktoria Stojan, Jan-Niklas Bauscher, and Luca-Philipp Grumbach for their contributions to the material development and manuscript preparation. Additionally, we extend our thanks to Andriy Kusyy (LetsData) for providing consultancy on disinformation narratives, to Iliane Kiefer (Zentrum Liberale Moderne) for sharing her knowledge on Germans of Russian descent, and to Jon Roozenbeek for discussing ideas on the intervention design.
We acknowledge support by the German Research Foundation project number 512648189 and the Open Access Publication Fund of the Thueringer Universitaets- und Landesbibliothek Jena.
Endnotes
[1] “Russian Germans” is a common term for people with a German heritage who immigrated to Germany from Russia, but also from Belarus, Ukraine, Kazakhstan and other post-Soviet countries.
[2] Compared to the preregistration (https://aspredicted.org/FLT_HKL), the order of hypotheses one to eight was changed. Also, we changed the terms “Russian-Germans” to “Germans of Russian descent” and “German sample” to “Germans without Russian descent” to describe our subsamples more accurately, as well as “steps of propaganda” to “aspects of propaganda”.
[3] Our panel provider was not able to offer a quota-based sample for Germans with a Russian migration background.
[4] We slightly deviated from the preregistration. Instead of calculating a recognition sum score that combines true and false information, we followed the editor’s and reviewers’ suggestion to analyze recognition of disinformation and of true information separately.
References
ARTE. (n.d.). Fake news – Kultur und Pop. https://www.arte.tv/de/videos/RC-022858/fake-news
Atlantic Council. (2023a). Undermining Ukraine. How the Kremlin employs information operations to erode global confidence in Ukraine. https://www.atlanticcouncil.org/wp-content/uploads/2023/02/Undermining-Ukraine-Final.pdf
Atlantic Council. (2023b). Narrative warfare. How the Kremlin and Russian news outlets justified a war of aggression against Ukraine. https://www.atlanticcouncil.org/wp-content/uploads/2023/02/Narrative-Warfare-Final.pdf
Banas, J. A., & Rains, S. A. (2010). A meta-analysis of research on inoculation theory. Communication Monographs, 77(3), 281–311. https://doi.org/10.1080/03637751003758193
Basol, M., Roozenbeek, J., & van der Linden, S. (2020). Good news about bad news: Gamified inoculation boosts confidence and cognitive immunity against fake news. Journal of Cognition, 3(1), 2, 1-9. https://doi.org/10.5334/joc.91
Van Bavel, J. J., & Pereira, A. (2018). The partisan brain: An identity-based model of political belief. Trends in Cognitive Sciences, 22(3), 213–224. https://doi.org/10.1016/j.tics.2018.01.004
BBC. (2023, February 1). Russia in Africa: How disinformation operations target the continent. https://www.bbc.com/news/world-africa-64451376
Biddlestone, M., Roozenbeek, J., & van der Linden, S. (2023). Once (but not twice) upon a time: Narrative inoculation against conjunction errors indirectly reduces conspiracy beliefs and improves truth discernment. Applied Cognitive Psychology, 37(2), 304–318. https://doi.org/10.1002/acp.4025
Blanca, M. J., Alarcón, R., Arnau, J., Bono, R., & Bendayan, R. (2017). Non-normal data: Is ANOVA still a valid option? Psicothema, 29(4), 552–557. https://doi.org/10.7334/psicothema2016.383
Bojarskich, V., Grosche, C., Ziemer, C.-T., & Rothmund, T. (under review). Solidarity with Ukraine? How fear- and moral-based motivations influence the psychological tug of war in the German public.
Bundeszentrale für politische Bildung. (2022, January 1). (Spät-)Aussiedler. https://www.bpb.de/kurz-knapp/zahlen-und-fakten/soziale-situation-in-deutschland/61643/spaet-aussiedler/
Capewell, G., Maertens, R., van der Linden, S., & Roozenbeek, J. (2023). Misinformation interventions decay rapidly without an immediate post-test [Preprint]. https://doi.org/10.31234/osf.io/93ujx
Decker, P. (2020). “We show what is concealed”: Russian soft power in Germany. Problems of Post-Communism, 68(3), 216–230. https://doi.org/10.1080/10758216.2020.1753082
Destatis. (2023). Bevölkerung in Privathaushalten nach Migrationshintergrund im weiteren Sinn nach ausgewählten Geburtsstaaten. https://www.destatis.de/DE/Themen/Gesellschaft-Umwelt/Bevoelkerung/Migration-Integration/Tabellen/migrationshintergrund-staatsangehoerigkeit-staaten.html
Eady, G., Paskhalis, T., Zilinsky, J., Bonneau, R., Nagler, J., & Tucker, J. A. (2023). Exposure to the Russian internet research agency foreign influence campaign on Twitter in the 2016 US election and its relationship to attitudes and voting behavior. Nature Communications, 14(1), 62. https://doi.org/10.1038/s41467-022-35576-9
Erlich, A., & Garner, C. (2023). Is pro-Kremlin disinformation effective? Evidence from Ukraine. The International Journal of Press/Politics, 28(1), 5–28. https://doi.org/10.1177/19401612211045221
EUvsDisinfo. (2023, August 10). Deny, deflect, distract, confuse. Repeat. https://euvsdisinfo.eu/deny-deflect-distract-confuse-repeat/
Hanlon, B. (2018, April 1). It’s not just Facebook: Countering Russia’s social media offensive. German Marshall Fund of the United States. http://www.jstor.org/stable/resrep18880
Harjani, T., Roozenbeek, J., & van der Linden, S. (2023). Gamified inoculation against misinformation in India: A randomized control trial. Journal of Trial and Error, 3(1), 14–56. https://doi.org/10.36850/e12
Infratest Dimap (2023). ARD-DeutschlandTREND. https://www.infratest-dimap.de/fileadmin/user_upload/DT_2302_Bericht.pdf
Jigsaw. (2023). Defanging disinformation’s threat to Ukrainian refugees. https://medium.com/jigsaw/defanging-disinformations-threat-to-ukrainian-refugees-b164dbbc1c60
Kahan, D. M., Peters, E., Dawson, E., & Slovic, P. (2017). Motivated numeracy and enlightened self-government. Behavioural Public Policy, 1(1), 54–86. https://doi.org/10.1017/bpp.2016.2
Kiefer, L., Mangold, P., & Prokopkin, S. (2021, March 24). Wie Rechtspopulisten versuchen, russlanddeutsche (Spät-)Aussiedler/innen in sozialen Medien für ihre Sache zu gewinnen. Zentrum Liberale Moderne. https://libmod.de/rechtspopulisten_versuchen_russlanddeutsche_soziale_medien_gewinnen/
Lakens, D., & Caldwell, A. R. (2021). Simulation-based power analysis for factorial analysis of variance designs. Advances in Methods and Practices in Psychological Science, 4(1). https://doi.org/10.1177/2515245920951503
Lewandowsky, S., & van der Linden, S. (2021). Countering misinformation and fake news through inoculation and prebunking. European Review of Social Psychology, 1–38. https://doi.org/10.1080/10463283.2021.1876983
Lu, C., Hu, B., Li, Q., Bi, C., & Ju, X.-D. (2023). Psychological inoculation for credibility assessment, sharing intention, and discernment of misinformation: Systematic review and meta-analysis. Journal of Medical Internet Research, 25, e49255. https://doi.org/10.2196/49255
Maertens, R., Roozenbeek, J., Basol, M., & van der Linden, S. (2021). Long-term effectiveness of inoculation against misinformation: Three longitudinal experiments. Journal of Experimental Psychology: Applied, 27(1), 1-16. https://doi.org/10.1037/xap0000315
McGuire, W. J. (1964). Some contemporary approaches. In L. Berkowitz (Ed.), Advances in experimental social psychology (pp. 191–229). https://doi.org/10.1016/s0065-2601(08)60052-0
Meister, S. (2016, July 25). The “Lisa case”: Germany as a target of Russian disinformation. NATO Review. https://www.nato.int/docu/review/articles/2016/07/25/the-lisa-case-germany-as-a-target-of-russian-disinformation/index.html
OECD. (2022, November 3). Disinformation and Russia’s war of aggression against Ukraine: Threats and governance responses. https://www.oecd.org/ukraine-hub/policy-responses/disinformation-and-russia-s-war-of-aggression-against-ukraine-37186bde/#biblio-d1e2201
Panagiotidis, J. (2017a). Postsowjetische Migranten in Deutschland. Perspektiven auf eine heterogene „Diaspora“. Aus Politik und Zeitgeschichte, 67(11-12/2017). Bonn.
Panagiotidis, J. (2017b). Wer sind die Russlanddeutschen? Bundeszentrale für politische Bildung. https://www.bpb.de/themen/migration-integration/kurzdossiers/252535/wer-sind-die-russlanddeutschen/#node-content-title-1
Paul, C., & Matthews, M. (2016). The Russian “firehose of falsehood” propaganda model. RAND Corporation. https://doi.org/10.7249/PE198
Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in Cognitive Sciences, 25(5), 388–402. https://doi.org/10.1016/j.tics.2021.02.007
Pereira, A., Harris, E., & Van Bavel, J. J. (2021). Identity concerns drive belief: The impact of partisan identity on the belief and dissemination of true and false news. Group Processes & Intergroup Relations, 26(1), 24–47. https://doi.org/10.1177/13684302211030004
Pierri, F., Luceri, L., Jindal, N., & Ferrara, E. (2023). Propaganda and misinformation on Facebook and Twitter during the Russian invasion of Ukraine. WebSci ’23: Proceedings of the 15th ACM Web Science Conference 2023. https://doi.org/10.1145/3578503.3583597
Rid, T. (2021). Active measures. The secret history of disinformation and political warfare. International Affairs, 97(1), 244-245. https://doi.org/10.1093/ia/iiaa211
Roozenbeek, J., van der Linden, S., Goldberg, B., Rathje, S., & Lewandowsky, S. (2022). Psychological inoculation improves resilience against misinformation on social media. Science Advances, 8(34), eabo6254. https://doi.org/10.1126/sciadv.abo6254
Roozenbeek, J., & van der Linden, S. (2024). The psychology of misinformation. Cambridge University Press. https://www.cambridge.org/core/books/psychology-of-misinformation/2FF48C2E201E138959A7CF0D01F22D84/listing
Ryzhova, A. (2024). Motivated by political beliefs, not only by language: How Russian speakers in Germany compose their transnational news repertoires. Journalism, 25(1), 218-237. https://doi.org/10.1177/14648849221130557
Schmid, P., & Betsch, C. (2019). Effective strategies for rebutting science denialism in public discussions. Nature Human Behaviour, 3(9), 931-939. https://doi.org/10.1038/s41562-019-0632-4
Schubert, T. W., & Otten, S. (2002). Overlap of self, ingroup, and outgroup: Pictorial measures of self-categorization. Self and Identity, 1(4), 353–376. https://doi.org/10.1080/152988602760328012
Simon, B., Reichert, F., & Grabow, O. (2013). When dual identity becomes a liability: identity and political radicalism among migrants. Psychological Science, 24(3), 251–257. https://doi.org/10.1177/0956797612450889
Tajfel, H., & Turner, J. (1986). The social identity theory of intergroup behavior. In S. Worchel, & W. Austin (Eds.), Psychology of intergroup relations (Vol. 2, pp. 7–24). Nelson-Hall Publishers. https://doi.org/10.4324/9780203505984-16
Traberg, C. S., Roozenbeek, J., & van der Linden, S. (2022). Psychological inoculation against misinformation: Current evidence and future directions. The ANNALS of the American Academy of Political and Social Science, 700(1), 136-151. https://doi.org/10.1177/00027162221087936
Trebbe, J. (2009). Mediennutzung. In J. Trebbe (Ed.), Ethnische Minderheiten, Massenmedien und Integration (pp. 207 – 210). VS Verlag für Sozialwissenschaften. https://doi.org/10.1007/978-3-531-91696-5
Vojta, S. (2024, January 26). Auswärtiges Amt deckt russische Desinformationskampagne auf. ZEIT ONLINE. https://www.zeit.de/politik/2024-01/russland-desinformationskampagne-auswaertiges-amt
Volkov, D., & Kolesnikov, A. (2023, November). Alternate reality: How Russian society learned to stop worrying about the war. Carnegie Endowment for International Peace. https://carnegieendowment.org/files/Kolesnikov_Volkov_Russians_and_Wars5.pdf
Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policymaking. Council of Europe. https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html
Washburn, A. N., & Skitka, L. J. (2018). Science denial across the political divide: Liberals and conservatives are similarly motivated to deny attitude-inconsistent science. Social Psychological and Personality Science, 9(8), 972–980. https://doi.org/10.1177/1948550617731500
WDR. (2024, February 22). Russischsprachige Menschen – wie sie sind und wie sie leben. https://www1.wdr.de/nachrichten/russlanddeutsche-russland-deutschland-100.html
Yablokov, I. (2022). Russian disinformation finds fertile ground in the West. Nature Human Behaviour, 6(6), 766–767. https://doi.org/10.1038/s41562-022-01399-3
Zerback, T., Töpfl, F., & Knöpfle, M. (2021). The disconcerting potential of online disinformation: Persuasive effects of astroturfing comments and three strategies for inoculation against them. New Media & Society, 23(5), 1080–1098. https://doi.org/10.1177/1461444820908530
Ziemer, C.-T., & Rothmund, T. (2024). Psychological underpinnings of misinformation countermeasures. Journal of Media Psychology. https://doi.org/10.1027/1864-1105/a000407