Paper #28. To Defeat Disinformation, First Arm the Mind

  • Thomas A. Drohan, Ph.D., Brig Gen USAF ret.
  • Asia-Pacific, Cyber, Eurasia, Middle East & North Africa, Strategy

Disinformation is a global threat. Pervasive digitized technology and social media provide rich opportunities to distort public perceptions at scale. Authoritarians assail democracies incessantly. Comparitech recently discovered a Facebook bot farm that controls nearly 14,000 fake accounts and produces 200,000 posts per month.

Our knowledge of circumstances has increased, but our uncertainty, instead of having diminished, has only increased. The reason of this is, that we do not gain all our experience at once, but by degrees; so our determinations continue to be assailed incessantly by fresh experience; and the mind, if we may use the expression, must always be “under arms.”

Carl von Clausewitz, On War, Book 1 Chapter 3.

Official Russia’s disinformation has long used kompromat and active measures to confuse, divide and amplify perceptions that attack independent thought (see Distorting the Loop). Official China’s disinformation similarly employs faulty cause and effect, selective evidence and false analogies to peddle a domestically oppressive ideology that foreign investors are careful not to offend (see Collapsing the Loop). Both authoritarian regimes seek to weaken democracies from within, where openness becomes a vulnerability when consent of the governed is weak. To defeat disinformation, we must first arm our minds.

This paper suggests how. Our focus is on exploitable errors in critical thinking. Recognizing such errors is essential to preserving independent thought, an inherent strength of democracy.

Critical Thinking

We examine three types of faulty reasoning—heuristics, logic errors, and cognitive biases.

Heuristics are mental shortcuts that skip details so we can move on to the next thought. Like Daniel Kahneman’s System 1 thinking, such generalizations enable us to make sense of what is going on, quickly.

Logic errors are mistakes in arguments that render their conclusions invalid. An argument is a claim based on a premise with supporting ideas backed up by evidence. The logic of the argument can be culturally interpreted in many ways, but it basically connects premises with conclusions.

Cognitive biases, as presented by Richard Heuer, are “mental errors caused by our simplified information processing strategies” (111). Like optical illusions, our predispositions affect how we evaluate evidence, perceive cause and effect, and attribute intent.

Failing to recognize fallacies creates opportunities for disinfo-influencers. It’s on us. With our unwitting cooperation, threats cause effects that otherwise would require confrontation. Then, when confrontation is employed, a synergy of coop-frontation is more effective. The arsenal of combined effects is inexhaustible, but may be framed as eight-fold: cooperatively, persuading-dissuading and inducing-securing; confrontationally, compelling-deterring and coercing-defending. Look for these cooperative & confrontational, causative & preventive, psychological & physical (complex!) combined effects in our examples.

With practice, we can learn to spot critical-thinking fallacies, and so expose and hunt down disinformation attacks. Superior awareness is a necessary step toward defeating disinformation.

So strap in for our sortie. The flight plan consists of two routes: (a) a short definition of each fallacy and what it can lead to; and (b) examples of disinformation that can exploit that mental vulnerability. We’ll limit the primary examples to three sub-types, but there are dozens more to study. For a comprehensive list, see The Decision Lab (heuristics and biases), Kreativ Copywriting (logic errors), and Human How (cognitive biases).


Heuristics

  1. Availability of Information, Leading to Thinking that Recalled Information is True: what we are able to recall from available information is what we think is generally true, such as with respect to: (A) threats to national security; and (B) the representativeness of one’s own government.
  • Exploitation: Chinese and Russian narratives repeat themes that carve out this heuristic in receptive minds. This strategically significant tactic creates a macro among captured audiences to the point that fed information becomes unexamined truth. Russian disinformation, for instance, perpetuates two ideas: (A) all governments need to counter the threat of “color revolutions” (such as the West’s purported Orange Revolution in Belarus); and (B) Russia and China are democracies. Passive acceptance of these well-hammered themes dissuades support for a democratic opposition and secures the position of authoritarian elites.
    • Combined effect: dissuasion and security.

  2. Affect, Leading to Feelings in Control of Rational Thought: allowing or using evocative information to create powerful emotional reactions that, even if disproven, persist (see Ch 2).
  • Exploitation: Russian narratives use multi-layered, emotionally charged language to claim that the European Union abandoned Italy during the coronavirus pandemic; Italians died without any help (EU aid, investment, and treatment of patients falsify this claim). The poignant storyline is contrasted to President Putin’s noble provision of doctors and equipment (Russia did send a medical team to Italy). Like a matryoshka doll, this affective argument contains nested truths marketed to reinforce any confirmation bias that “likes” to believe the claim. Together, the disinformation persuades and secures conservative audiences that the West is not as unified as Russia.
    • Combined effect: persuasion and security.
  • In a failed attempt at exploitation, Chinese artist Ai Weiwei circulated an emotion-charged meme (shown below) on Instagram, presumably to induce attendance at his upcoming production of Turandot in Rome. Many Italians were not persuaded, perceiving the joke as insensitive, insulting and ungrateful. The opera performance was suspended anyway due to the pandemic.

  3. Anchoring, Leading to Use of the Same Information as a Basis for Conclusions: becoming attached to an initial piece of information, against which comparisons are made and irrelevant conclusions are drawn.
  • Exploitation: In the Tweet below, Supreme Leader Ali Khamenei deflects attention away from the heavy costs of Iran’s domestically unpopular foreign interventions by dropping an alternative anchor: the American military presence in the region. The Ayatollah labels the US military killing of Qasem Soleimani in Iraq an “assassination” (Soleimani was a uniformed combatant, a commander, or sardar, of the Quds Force, killing coalition forces and civilians). The set theme blames the US presence in Iraq as the cause of this “American Crime” (rather than the consequence of Soleimani’s armed attacks). Given Khamenei’s authoritative religious and political status in Iran, the message persuades and induces the Iranian Revolutionary Guard Corps and regional proxies into action.
    • Combined effect: persuasion and inducement.

Logic Errors

  1. Non-sequitur, Leading to Illogical Conclusions: a conclusion that does not logically follow from its assumptions, evidence, or a preceding argument. An unexpected disconnect that can be humorous.
  • Exploitation: Russian disinformation tactics incorporate non-sequiturs as jokes to deflect attention away from damning arguments and inconvenient evidence (see Claire Wardle on “SATIRE OR PARODY”). Being comedic can disarm and alarm audiences by dismissing threats and creating uncertainty. Knowing an audience’s clickbait predisposition is key, as in this CNN interview with President Vladimir Putin: “When asked about concerns that Russia might interfere in the 2020 US elections, he replied: ‘I’ll tell you a secret: Yes, we’ll definitely do it,’ Putin said. ‘Just don’t tell anyone,’ he added, in a stage whisper.” This act(ing) induces polarized American political opinions and deters unity on the issue of election interference from Russia.
    • Combined effect: inducement and deterrence.

  2. Over-simplification, Leading to Unexamined Arguments: a claim that omits relevant alternative arguments, assumptions and evidence.

  • Exploitation: Hu Xijin, editor of the Party-controlled Global Times, Tweeted a simplistic description (below) of the US ordering the closure of China’s consulate in Houston, Texas. Hu failed to mention why the US ordered the closure: charges of espionage and intellectual property theft. This gloss-over of the issue, and the labeling of the move as “crazy,” might persuade anti-Trumpers, but more importantly it secures Chinese authorities from having to defend Beijing’s operations in the US as benign. Twitter and other venues with limited character counts incentivize such over-simplification.
    • Combined effect: persuasion and security.

  3. Circular Reasoning, Leading to Failure to Disprove an Argument: a circular argument repeats its claim using different words to “prove” its validity (authority-based), rather than disprove the claim’s assumptions or evidence (science-based).

  • Exploitation: official China repeats righteous principles for domestic legitimacy and international prestige while systematically violating those principles. This language insulates the Party’s authority against falsification. How?
    • Note the two Tweets on the right (below) by China Foreign Ministry Information Department spokesperson Hua Chunying. The message exhorts the world to combat disinformation and ideological bias even as the Party propagates a huge wave of falsehoods and spoils another generation’s education by inculcating Xi Jinping Thought.
    • Messaging at the level of principles reiterates Beijing’s fake values in generalities—solidarity, coordination, global opinion, rejecting double standards and interference, opposing slander, and…get this…condemning ideological bias. This reasoning (a) persuades ingratiated or intimidated influencers to parrot the Party line, and (b) compels domestic compliance.
    • Combined effect: persuasion and compellence.
  • The Tweet on the far left takes the reader to an article in the Party-controlled Global Times that ridicules US closure of China’s consulate in Houston. The author employs faulty cause and effect, selective evidence and false analogies.
    • By exercising a freedom of expression that is banned in China, Chinese officials asymmetrically weaponize information.
  • Recognizing that the two examples above are tautological due to their vagueness is a fine point. It’s better to disprove the generalities with evidence:
    • China’s support of the Palestinian people is minimal, rhetorical and based on non-intervention even as Beijing calls for wars of national liberation.
    • China’s support for self-determination excludes the expansionist domains of imperial China itself. Witness Beijing’s extermination of Uighur culture in Xinjiang Province, feigned ignorance of the “different interpretations” half of the 1992 Consensus with Taiwan (“one China, different interpretations”) while threatening war, human rights violations in Tibet, militarization of maritime territories while rejecting international law, and broken promises to respect Hong Kong’s domestic autonomy until 2047.
  • Beijing’s objectives are clear: induce Palestinian support of China’s foreign policy, which helps persuade developing countries to accept Chinese investments and influence.
    • Combined effect: induced persuasion.

Cognitive Bias

  1. Confirmation Bias, Leading to Manipulation by Tailored Narratives: a tendency to believe information that reinforces one’s pre-existing values and beliefs. Narratives shape perceptions of reality by filtering and echoing “new” information to fit and shape what people want to read, hear and see.
  • Exploitation: the front page of Iran News (privately owned, government-censored) is always filled with headlines that reinforce what the government wants citizens to think. Take the seven headlines of July 26, 2020. All are pro-government, anti-US stories except, it seems, the COVID-19 death count (a set-up for page 7’s critique of Presidents Bolsonaro and Trump).
    • Front Page Stories
      • Era of Judicial Hit-And-Run Is Also Over: responding to a standard interception of an Iranian airliner in Syria by two US fighters, the article promises to protect Iranian airline passengers’ right to safety. The event induces anger by describing US behavior as bullying, harassment, a threat to international security and humanitarian law, and another example of US terrorism.
      • Iranians Never Succumb to Pressure: President Rouhani cites US sanctions and the coronavirus as pressure, exhorting citizens to not despair. Emphasizing that the government is doing everything it can without foreign assistance persuades compliance: “If there are 10 more pressures that tend to divert people from the right path, people will make every big problem easy with cooperation.”
      • No End to U.S. Criminal Acts: Iranian ambassador to the International Atomic Energy Agency attempts to persuade key officials that the US is guilty of aviation terrorism as well as economic terrorism (US sanctions against Iran for nuclear non-compliance), and a threat to international peace and security.
      • Washington Fails to Isolate Iran: a former Iranian representative to the United Nations cites the 25-Year Iran-China Cooperation Plan as strategic cooperation in sectors that include energy, finance, currency and banks. Partnership with China may induce more European defections from US-led sanctions and persuade international opinion that the US opposes cooperation with China.
      • All Must Respect Safety of Civilian Flights: US fighter aircraft are accused of illegally being in Syrian airspace and aggressively maneuvering near an Iranian airliner. By requesting the International Civil Aviation Organization to investigate the incident, Iran seeks to persuade and induce legal claims against the US.
      • Company to Make Purchases on Iran’s Behalf to Settle Iraq’s $5b Debt: because of US sanctions, the $5b Iraq owes to Iran for gas and electric purchases cannot be transferred to Iranian banks. An Iran-Iraq agreement establishes a company in Iraq to secure $5b worth of goods, commodities and raw materials to be transported to Iran.
      • 195 More Iranians Die of COVID-19 in 24 Hours: a Health Ministry spokesperson reported 195 deaths, bringing Iran’s self-count to 15,484 deaths. The number of infections reported is 2,316 for a total of 288,839 to date. The Ministry is desperate to persuade Iranians that the government is doing all it can to combat the pandemic.
    • Combined effect: inducement, persuasion and security.

  2. Using “Self-certainty” to Cope with Complex Uncertainty, Leading to Full Acceptance or Full Rejection of Alternative Solutions: this popular approach to leadership during uncertain times advises bosses to assert their values and be decisive.

  • Exploitation: the priority of “self-certain” leadership is to reassure others that everything will be ok. Let’s apply this to the US political context: one federal government sharing decision-making with 50 state governments.
    • Given the pandemic, we define “Full Acceptance or Full Rejection of Alternative Solutions” with respect to enforcing or not enforcing mask-wearing, social distancing and the lockdown of public spaces. What do we see? At the level of governorship, coping with pandemic uncertainty has generated some extreme solutions. Contrast California’s aggressive lockdown of citizens from public beaches (while infection rates decreased) with Florida’s re-opening of public beaches (while infection rates increased). Even though most states have enacted moderate policies based on best practices (see National Governors Association), disinformation exploits the extremes.
    • The lack of a standard national policy as US infection rates rise (even though death rates decrease) is roundly exploited in social media by Chinese and Russian officials as “proof” of inferior US governance. For a sampling, see recent Hamilton 2.0 analyses that point to:
      • Russian confidence in developing a vaccine, a claim placed in sharp relief against fragmented US and EU efforts.
      • Chinese criticism of US coronavirus responses as incompetent.
      • Chinese promotion of a conspiracy theory about the origins of the virus (not China).
    • Russia induces political polarization while China persuades customers of its benevolence despite its star role in propagating the coronavirus.
    • Combined effect: inducement and persuasion.

  3. Overconfidence in a Pattern, Leading to Vulnerabilities: habituated behavior and expectations can be reinforced to continue a pattern, or broken to create surprise.

  • Exploitation: reinforcing or breaking a pattern to elicit an anticipated response is easier when there is a reliable overarching pattern: polarized politics during an election year. The killing of George Floyd by a police officer in Minneapolis ignited a firestorm of cued-up actors craving to respond to such tragedies for reward and advantage.
    • In addition to peaceful protestors seeking justice and reforms, there are: provocateurs seeking anarchy; petty and organized criminals seeking wealth; and politicians seeking votes. Conglomerates of groups and individuals continue to emerge. Some actions are intentional and many are not. As a result, complex aggregates are inducing, compelling and coercing escalatory violence.
    • Disinformation thrives in this environment. Without having to plan specific effects, any persona can wreak information havoc from each domestic protest, demonstration and crime. How? By anticipating violence. Knowing that opportunistic groups, proxies and fundraising platforms will compete for attention, all a semi-competent disinformationist has to do is add propellant to existing fires. Motives vary. For instance, sensation-seeking spammers with anonymous accounts fake live streams to produce top search results for click-paid page owners.
    • Disinformation does not have to be fake information. Any actor can weaponize information into disinformation by using real events and genuine content. Here are two common ways:
      • Information can be put into false context, such as the murder of George Floyd wrapped in false statistics. The polarized opinion on the extent of police brutality compared to the extent of neighborhood brutality is a contentious case in point.
      • Information can be manipulated to deceive, such as “breaking news” duels among liberal and conservative media outlets. The former tend to show police using force against peaceful protestors, while the latter tend to show activists using force against dutiful police.
    • Equipped with such genuine content and real events, disinfo artists can paint any picture they want by adding context and impact.
    • Combined effect: inducement, compellence and coercion.

Sortie complete, but we are far from mission accomplished. A debrief critique: I over-simplified the combinations of effects that disinformation can produce, knowing that eight interacting effects are more complexity than most readers can bear. Let’s go there anyway.

Our sample of combined effects yielded dissuasion & security, persuasion & security, persuasion & inducement, inducement & deterrence, persuasion & compellence, induced persuasion, inducement & persuasion & security, and inducement & compellence & coercion.

Each of these disinfo exploits has its own context and particular details. That means our eight-part framework only models the reality of what is happening. Still, eight types of effects can be sequenced in 40,320 different orders (8x7x6x5x4x3x2x1). That result assumes that the word order of effects matters. Word order should reflect two things: (1) standard grammatical interactions, such as adjective-noun, which represent how the effects interact among one another to produce something different; and (2) relative importance.
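The arithmetic above can be checked with a minimal Python sketch. The effect names come from the eight-fold framework in this paper; the code itself is illustrative, not part of the original analysis:

```python
# Illustrative check of the eight-effect combinatorics described above.
import math
from itertools import permutations

effects = ["persuade", "dissuade", "induce", "secure",
           "compel", "deter", "coerce", "defend"]

# Orderings of all eight effects: 8! = 40,320, the figure cited in the text.
print(math.factorial(len(effects)))  # 40320

# Because order matters, two-effect combinations such as "induced persuasion"
# and "persuaded inducement" count separately: 8 x 7 = 56 ordered pairs.
print(len(list(permutations(effects, 2))))  # 56
```

The second count shows why ordered pairs like “induced persuasion” and “persuaded inducement” are distinct entries in the framework rather than one combination counted twice.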

For instance, “induced persuasion” as described in our sample (China’s economic-to-political effect on Palestine and the developing world) would be different from “persuaded inducement” (this could be a political-to-economic effect, such as China’s politics promoting economic alignment).

The implication of all of that simplified complexity is that we should anticipate combined effects, not just one or the other. The problem is, our national security leaders routinely refer to deterring conflict first, then defending ourselves once “war” breaks out. The reality is, our adversaries are confronting and cooperating all at the same time. Indeed we face a more complex challenge—deterring, compelling, defending, coercing, dissuading, persuading, securing and inducing.

Critical Public-Private Partnerships

The first step to defeating disinformation is to arm the mind. Yet there is a relative lack of public familiarity with disinformation tactics and purposes. Private sector businesses can play a critical role: educate citizens about this pervasive threat and create flexibility that complements government programs.

The National Cyber Center’s (NCC) Student Alliance focuses on cyber innovation and awareness in K through 12 education. Certified courses, summer camp, and student-run chapters develop skills and practice leadership to meet the growing demand for cyber talent. NCC’s broad portfolio also includes a Secure the Vote initiative to increase voter confidence in vote-counting and expand awareness of shortfalls. Potential solutions for particular jurisdictions include a secure, auditable mobile voting option.

JMark Services Inc.’s curriculum in cyber training and information environment education blends essential problem-solving with strategic understanding. Webinars and a think tank develop topics for discussing locally relevant, globally significant issues.

We also need to hunt for and expose the sources of disinformation. Here, too, private sector entrepreneurship fills important gaps in national capability.

The Cyber Resilience Institute conducts c-Watch cyber intelligence training, currently a three-week immersion in intelligence, social media, and international cyberspace conflict. Open source platforms enable access. All citizens need are the analytic skills to leverage platforms and resources.

Across the space industry, the Space Information Sharing and Analysis Center (ISAC) facilitates information-sharing to address vulnerabilities, incidents and threats. The ISAC in Colorado Springs is the first space ISAC in the US and recently became a member of the National Council of ISACs. The focus on all-threat analysis and mitigation is applied to supply chains, business systems and mission sets.

Beyond these efforts, we need a superior US grand strategy. Everything depends upon information, digitized or not. Information venues are highly competitive spaces where adversaries seek advantageous combinations of effects.

Whether we are developing human or machine learning, our first capability should be in reasoning: not only to recognize fallacies and flaws in human reasoning and artificial software, but also to guard against weaponized information. Even with our all-domain operations and all-threats analyses, we confront authoritarians’ all-effects strategies. There is great work to do.

