
Creating Conspiracy Theories: What Information Warriors Need to Know

Conspiracy theories play a growing role in modern conflict by shaping how audiences interpret threat, trust, and authority before overt action occurs. This essay examines conspiracy theories as cognitive environments rather than collections of false claims. Drawing on the Existential Threat Model and a political-psychological model of conspiracy belief formation, it explains how such beliefs are cultivated, why counter-messaging often fails, and what strategic risks weaponized conspiracy narratives pose for information warriors. 

Introduction 

During the Algerian War of Independence, French counterinsurgency forces exploited a psychological vulnerability within the ranks of the National Liberation Front (FLN) by creating a conspiracy theory. Through a deception operation known as La Bleuite, the French generated a fear of betrayal and heightened risk among the Algerian revolutionaries. The conspiracy held that French intelligence had deeply infiltrated the FLN. Suspicion spread through the ranks, and in some cases trust and cohesion collapsed. The perceived threat was existential: if traitors were everywhere, the movement's identity and moral authority were at risk. The result was a purge of the ranks in an effort to sift out traitors, and many otherwise loyal revolutionaries were persecuted and murdered. This weakened the FLN more effectively than direct military action. The operation succeeded because it leveraged existing fears, redefined uncertainty as hostile intent, and imposed social and operational costs on disbelief.

For information warriors, the lesson is that conspiracy theories can be weaponized if properly constructed. They emerge when uncertainty is left unresolved, and uncertainty is pervasive in war. Understanding this matters because conspiracy theories can condition how people interpret events before any visible action takes place. They shape what feels plausible. They determine which sources are trusted and which are dismissed. They define enemies in advance and assign moral blame before facts are known. By the time violence, protest, or mobilization occurs, the interpretive work has already been done. Actions then feel defensive rather than aggressive. Decisions feel necessary rather than chosen. For information warriors, this means influence is often decided upstream, at the level of belief formation, not at the moment of crisis.

This essay does not offer tactical guidance on deploying conspiracy narratives; instead, it explains how such belief systems function, why they are effective, and what risks they pose when treated as tools of influence in modern conflict. It does, however, offer practitioners general guidelines. For example, conspiracy theories are not injected into audiences at moments of decision. They are cultivated in advance by shaping how threat, trust, and identity are interpreted over time. This requires keen attention to the operational environment and to the existing vulnerabilities ripe for exploitation. Effective influence operates upstream, before claims are stabilized and before sides are fixed. To explain how this works, the discussion turns to two complementary theoretical frameworks. The first is the Existential Threat Model (ETM); the second is what can be referred to as the conspiracy belief formation model. Taken together, they show how conspiracy theories can be deliberately cultivated without relying on spectacle, mass deception, or centralized propaganda.

Existential Threat Model 

ETM specifies five structural elements, derived from evolutionary psychology, that must be present for a conspiracy theory to form and persist. Together, they explain why certain narratives harden into conspiracy beliefs while others remain ordinary propaganda or rumor. The model has been applied directly to strategic communication analysis and is well suited to information warfare contexts. To demonstrate how the elements of ETM apply in practice, we will rely on a real-world example: the Russian cyberattack on the American technology company SolarWinds, an incident that morphed into a conspiracy theory. In that operation, the Russians inserted malicious code into a routine SolarWinds software update, granting covert access to government agencies and private companies for months before detection. The technical complexity of the breach, the delayed discovery, and the limited public visibility into attribution created prolonged uncertainty. As a result, the incident became the basis for conspiratorial interpretations, with routine detection failures, incomplete disclosures, and interagency coordination recast as evidence of hidden cooperation, tolerance, or deeper infiltration.

First, a conspiracy theory must present a pattern that appears to causally connect people and events. The pattern does not need to be true; it just needs to feel coherent. Discrete events are arranged to suggest coordination rather than coincidence. In information warfare, this often appears as timelines, repeated “signals,” or selective sequencing that implies design. The pattern gives the audience a sense that chaos has an underlying order. In modern information environments, unresolved cyber incidents often become the raw material for pattern construction. The SolarWinds breach linked routine software updates, multiple government agencies and private firms into a single narrative of coordination. Discrete technical events were interpreted as evidence of an overarching design rather than isolated compromise. 

Second is agency attribution: the pattern must be driven by some kind of agent acting intentionally. In an information warfare context, this takes the form of exaggerated intent attribution even when intent is absent. Audiences are encouraged to assume that negative outcomes are deliberate rather than accidental. These actors are depicted as highly capable and tightly coordinated. When policies fail or plans break down, those outcomes are treated as intentional signals rather than mistakes. Setbacks are explained as deception or misdirection. Random error disappears from the explanation and is replaced by assumed hostile intent. Cyber intrusions like SolarWinds encourage exaggerated intent attribution. Because the operation appeared sophisticated and persistent, failures in detection or response were reinterpreted as intentional tolerance or hidden cooperation. Mistakes were treated as signals. Delay became evidence of complicity.

Third, the conspiracy must involve a meaningful threat to the audience or their group. This threat can be physical, economic, cultural, or moral. ETM shows that threat perception is central to belief adoption. In information warfare, chronic and unresolved threats, like pervasive cyber-attacks, are more effective than a single large attack; they keep anxiety active and attention focused. The SolarWinds breach created a chronic and unresolved sense of threat rather than a single crisis moment. Access to government networks, private infrastructure, and sensitive data suggested long-term exposure. This sustained uncertainty aligns with ETM's finding that persistent, low-resolution threats are more destabilizing than isolated attacks.

Fourth is the perception of a coalition of conspirators, as conspiracies require more than one actor. ETM specifies that audiences expect alliances working together toward a malevolent goal. These coalitions can be states, corporations, institutions, or social groups. The size of the alliance signals power. The diversity of actors signals reach. Together, they make resistance feel difficult and vigilance necessary. SolarWinds also enabled coalition narratives. Technical complexity and scale made it easy to frame the breach as involving intelligence agencies, corporations, and political actors acting in concert. The breadth of affected institutions reinforced perceptions of coordinated power and reach.  

The final element is secrecy. The conspirators must be attempting to hide their actions. Secrecy explains gaps in evidence and neutralizes counterarguments. Absence of proof becomes proof of concealment. In information warfare, secrecy also discredits official denials in advance: if the truth were visible, the theory argues, the conspiracy would have failed already. In the SolarWinds case, the technical opacity of the breach reinforced secrecy assumptions. Limited public attribution, classified assessments, and incomplete disclosure made absence of evidence appear expected rather than reassuring. Secrecy did not weaken conspiratorial interpretation; it sustained it.

Together, these five elements explain why conspiracy theories are resilient. They impose order, assign blame, elevate threat, and close off alternative interpretations. For information warriors, ETM provides a clear diagnostic framework. If all five elements are present, the narrative is operating as a conspiracy theory. If one is missing, it is not. ETM explains what a conspiracy theory must look like to function as a belief system. However, it does not explain why some audiences adopt these narratives while others do not. That vulnerability is explained by the conspiracy belief formation model, which accounts for the psychological and social conditions that make individuals receptive to conspiratorial explanations.  

Conspiracy Belief Formation Model 

The conspiracy belief formation model emerged from the intersection of social and political psychology. It specifies five conditions that, when present in a person's mind, generate conspiracy belief adoption. In information warfare, they define when belief systems are easiest to cultivate. The first condition is anxiety, which is the primary entry point. People become vulnerable when uncertainty feels persistent and unresolved. The anxiety does not need to be extreme, just ongoing. In information warfare, this is often produced by ambiguity around institutions, competence, or fairness. Anxiety creates a demand for explanation. Conspiracy beliefs supply it.

The second condition is social influence because belief adoption is socially mediated. People often look to peers to judge credibility under uncertainty. Endorsement by trusted others reduces perceived risk. In contested environments, messages carried through personal networks carry more weight than statements from authorities. Belief spreads through relationships before it spreads through media. Third is synergy, which refers to the way multiple cues reinforce each other. Anxiety, social validation, and repetition interact. No single signal is decisive. Together, they create momentum. In practice, partial claims, anecdotes, and moral cues combine to produce a coherent belief environment. The whole is stronger than any part. 

The fourth condition is perceived plausibility. Claims must feel plausible within the audience’s existing worldview. They do not need to be proven. They must align with prior grievances, lived experience, or known facts. Plausibility is contextual. In information warfare, this often involves blending true elements with interpretation rather than fabrication. The final condition is unfalsifiability, where conspiracy beliefs are extremely difficult to disprove. Contradictory evidence is reframed as manipulation or concealment. Lack of evidence is treated as proof of secrecy. This makes the belief system resilient. Counter-arguments may strengthen rather than weaken commitment. 

How the Two Models Work Together in Practice 

Taken together, the Existential Threat Model and the conspiracy belief formation model explain both how conspiracy theories are constructed and how they are adopted. ETM operates at the narrative level. It explains how environments are shaped so that threat feels persistent, intentional, and patterned. Events are framed as coordinated rather than accidental. Agency is attributed to hostile actors. Coalitions are identified, and secrecy is implied. This narrative structure makes conspiratorial explanations feel reasonable rather than extreme. The conspiracy belief formation model explains how individuals respond to that environment. Anxiety increases under uncertainty. Social cues signal shared concern. Repetition builds familiarity. Plausible elements anchor belief to lived experience. Unfalsifiability prevents disengagement. Belief forms gradually through interaction and reinforcement, not persuasion. Together, the models show that conspiracy theories stabilize when narrative structure aligns with psychological vulnerability. For information warriors, influence is achieved by shaping conditions, not by inserting claims.

Why Counter-Messaging and Debunking Can Fail 

Counter-messaging against conspiracy theories can fail when it targets the specific claims of a specific theory rather than the conditions that sustain it. Conspiracy beliefs are sustained by an interpretive environment shaped by threat, identity, and distrust. Disputing individual facts does not dismantle that environment. Counter-messaging can also backfire by signaling that the contested claims represent something important. Debunking reintroduces uncertainty as well: by removing intentional explanations and replacing them with ambiguity, corrective messaging restores the anxiety that conspiracy beliefs were adopted to manage. Debunking further fails because it challenges identity rather than evidence. Once belief is tied to group belonging, disagreement becomes a social threat. Corrective messages signal elite authority and out-group judgment, activating defensiveness rather than reflection. Efforts to disprove claims are then interpreted as proof of secrecy and coordination. Absence of evidence becomes evidence of concealment. In this way, counter-messaging hardens belief instead of weakening it. This does not mean that tactical debunking has no role; in time-sensitive situations, it can be necessary to protect specific systems, personnel, or decision-making processes from immediate harm.

Loss of Control and Narrative Drift 

Weaponized conspiracy theories are difficult to control once they take hold. They do not remain fixed to their original framing or intent. Because they operate through interpretation rather than instruction, meaning is constantly renegotiated by participants. New actors add explanations. Peripheral claims become central. The narrative expands and absorbs unrelated grievances. Attempts to narrow or redirect the conspiracy theory may fail because they can be interpreted as evidence of manipulation. This drift creates strategic risk. Narratives designed to undermine trust in a specific institution can generalize into distrust of all authority. Theories aimed at foreign adversaries can be repurposed against domestic actors. As beliefs adapt to local identities, reach increases but predictability declines. Over time, the conspiracy can persist even after the initiating conditions disappear. Once a belief environment is established, no single actor controls its trajectory. 

Blowback and Domestic Contamination 

Conspiracy theories do not respect audience boundaries. Narratives introduced for foreign influence often circulate back into domestic information environments through social media, diaspora networks, and open information channels. Once internalized, these beliefs can target domestic institutions, officials, or media ecosystems that were not part of the original objective. This contamination is difficult to contain because conspiracy narratives travel through peer networks rather than formal channels. Blowback is especially acute when conspiracy theories erode generalized trust. Distrust rarely remains selective. Suspicion aimed at one authority spreads to others. Over time, this weakens institutional legitimacy and complicates governance. For information warriors, the risk is cumulative. Short-term gains against an adversary can produce long-term instability at home. Once domestic belief systems are affected, reversal is slow and often incomplete. 

Institutional Self-Damage 

The use of conspiracy narratives carries long-term costs for institutional credibility. When influence operations rely on framing authorities as deceptive or corrupt, those frames do not remain isolated. Over time, cynicism generalizes. Audiences become skeptical not only of targeted institutions but of institutional authority as a whole. This erodes the shared assumptions required for effective governance, coordination, and crisis response. Institutional self-damage also limits strategic flexibility. Actors that routinely rely on conspiratorial framing reduce their own ability to communicate credibly in future operations. Messages intended to reassure, de-escalate, or mobilize may be dismissed as manipulative. Once trust is degraded, it cannot be selectively restored. For information warriors, this creates a paradox. The tools that weaken an adversary’s legitimacy can also undermine the legitimacy of the user. 

Escalation Without Off-Ramps 

Conspiracy theories framed around existential threats tend to escalate without clear stopping points. Once narratives portray adversaries as hidden, coordinated, and morally corrupt, compromise becomes suspect. De-escalation is reframed as capitulation. Restraint appears dangerous. Each new development is interpreted as confirmation of hostile intent, reinforcing the cycle of suspicion. This dynamic reduces strategic flexibility. Because conspiracy beliefs transform uncertainty into moral certainty, they eliminate ambiguity as a resource for negotiation or adjustment. Positions harden. Reversal becomes betrayal. For information warriors, this creates a serious risk. Influence campaigns that rely on existential framing can generate momentum that outpaces control, locking audiences into confrontational postures with no credible exit. 

Implications for Strategic Communication and Resilience 

While this essay does not offer prescriptive guidance, the analysis suggests several implications for practice. Efforts to build cognitive resilience should focus on reducing chronic uncertainty rather than correcting individual claims, as persistent ambiguity creates the conditions for conspiratorial interpretation. Strategic communication should prioritize trust maintenance over message optimization, since conspiracy beliefs spread through social relationships rather than authoritative channels. Consistent institutional behavior, transparency about limits, and credible acknowledgment of uncertainty matter more than reactive rebuttal. Practitioners should also distinguish between tactical containment and strategic prevention. Debunking may be necessary in time-sensitive situations to protect specific systems or decisions, but it does not address the upstream conditions that sustain conspiratorial belief environments over time. 

Conclusion: Conspiracy Theories as Cognitive Terrain in Modern Conflict

Conspiracy theories are best understood not as collections of false claims, but as cognitive environments that manage meaning through perceived existential threat. They impose order on uncertainty, assign agency to ambiguous outcomes, and supply moral clarity when trust erodes. The Existential Threat Model explains the narrative architecture that makes these systems compelling, while the conspiracy belief formation model explains why individuals become receptive to them. Together, they show that conspiracy theories stabilize through threat perception, social reinforcement, plausibility, and unfalsifiability. By the time violence, protest, or mobilization becomes visible, the cognitive terrain has already been shaped.

For information warriors, the implication is that influence is decided upstream, at the level of interpretation rather than claims. Tactical interventions such as debunking may still be required for immediate operational security, but they do not substitute for shaping the broader belief environment in which conspiratorial interpretations take root. Conspiracy theories function as environments, not messages. They define what feels reasonable, whom to trust, and how to assign blame. At the same time, they carry serious risks. Once established, they are difficult to control. Narratives drift, blowback contaminates domestic audiences, institutional legitimacy erodes, and escalation hardens without off-ramps. Understanding conspiracy theories, therefore, matters less as a tool for use than as a feature of modern conflict. Information warfare is ultimately a struggle over meaning, identity, and legitimacy. Conspiracy theories reveal how that struggle unfolds when threat dominates perception. 

Summing Up 

Conspiracy theories are not anomalies at the margins of modern conflict. They are predictable outcomes of environments marked by uncertainty, distrust, and perceived existential threat. When these conditions are present, conspiratorial belief systems emerge as tools for restoring meaning, identity, and moral order. This essay has shown that such systems are neither spontaneous nor irrational. They are structured narratives that align with psychological vulnerabilities and social dynamics long before overt action occurs.  

For information warriors, the central insight is that conspiracy theories operate as cognitive terrain. They shape interpretation upstream, defining what audiences consider plausible, legitimate, and necessary. Attempts to counter them through fact correction alone routinely fail because they target claims rather than the conditions that sustain belief. At the same time, the strategic risks of weaponized conspiracy theories are substantial. Once released, these narratives drift, generate blowback, erode institutional trust, and escalate without clear off-ramps. Understanding conspiracy theories, then, is not about learning how to deploy them. It is about recognizing how belief environments form, persist, and constrain strategic choice. In contemporary conflict, influence is rarely decided at the moment of crisis. It is decided earlier, in the quiet shaping of meaning, threat perception, and trust. That is where modern information warfare is won or lost.

The post Creating Conspiracy Theories: What Information Warriors Need to Know appeared first on Small Wars Journal by Arizona State University.
