Page:What Are Conspiracy Theories? A Definitional Approach to Their Correlates, Consequences, and Communication.pdf/18

In general, studies of conspiracy belief have not sought to differentiate conspiracy theories from other beliefs in this systematic or controlled way. In correlational research, for example, it is rare for correlations between predictor variables and conspiracy beliefs to be adjusted for other (e.g., mainstream or official) beliefs. To our knowledge, not one study has systematically manipulated the presence versus absence of essential features of conspiracy theories. Nor have studies sought to identify which of the key features of conspiracy theories are the active ingredients, or mediators, of their functional differences from other beliefs—for example, whether the malevolence ascribed by conspiracy theories explains why they are more likely than other explanations for social phenomena to foment intergroup hostility. This kind of work is crucial in identifying the unique social-psychological functions of conspiracy theories, and it needs to be informed by an explicit account of their essential causal properties.

A definitional approach can capitalize on the fact that many of the essential features of conspiracy theories—for example, their malevolence and epistemic riskiness—can be conceived not only as binary but also as continuous variables. That is, conspiracy theories can differ from each other by degree, according to how strongly each of these features is present. By measuring or manipulating these variations, we can therefore assess the role of the essential ontological features of conspiracy theories in determining how these beliefs come to be adopted and shared and how they affect people's attitudes and behavior. In this sense, we can say that the definitional approach contains a general theory—that the causal properties of conspiracy theories depend on their basic ontological properties—from which more specific theories can be generated and tested.

To illustrate, malevolence can vary by degree—for example, in the scale of the harm the alleged plot is intended to cause. The more malevolent the plot is, the more the acceptance of the conspiracy theory is likely to depend on individual differences associated with paranoia and misanthropy, and the more the theory is likely to be shared by speakers who want to create a heightened sense of threat from an outgroup—for example, to normalize prejudice, exploit xenophobia for personal political gain, or galvanize support for intergroup violence (e.g., Kofta et al. 2020). Further, the more malevolent the plot is, the more likely it is to increase perceptions of threat, leading to prejudice, discrimination, and intergroup hostility (e.g., Bilewicz et al. 2013, Jolley et al. 2019). The malevolence of the alleged plot can be manipulated easily, for example, by varying the intensity of the harm being sought by the plotters (e.g., from subtle mind control or moderate disease to infertility and death) and the scale of that harm (e.g., affecting a small community or an entire national or ethnic population).

Another illustrative example of this advantage is in examining the epistemic riskiness of conspiracy theories. The riskier a conspiracy theory is, the more its acceptance should be associated with variables connected to the endorsement of epistemically risky ideas in general. These variables include intuitive thinking, lower cognitive ability, and bullshit receptivity (e.g., van Prooijen et al. 2018). Regarding the sharing of conspiracy theories, human communication dynamics such as audience tuning (e.g., Echterhoff et al. 2008, Higgins & Rholes 1978) suggest that people are more likely to share riskier conspiracy theories when communicating with audiences whom they perceive to be less willing or able to use rational, critical thought to interpret the message (Petrocelli 2018). Further, sharing epistemically riskier conspiracy theories is more likely to be stigmatized (Lantian et al. 2018). A hypothetical consequence of accepting epistemically riskier conspiracy theories is that, when endorsed, they are likely to contribute to a more radical distrust of officials and rejection of their narratives (Einstein & Glick 2015, Jolley & Douglas 2014a).

The degree of epistemic risk can also be experimentally manipulated. All else being equal, for example, conspiracy theories become more epistemically risky as the number of alleged