The need for a comprehensive approach to strategy relevant to the global information environment is made clear by recent cyber and information attacks. The SolarWinds and Hafnium attacks, launched from US data centers, occurred in a context of persistent Russian and Chinese disinformation campaigns. Yet the US cyber, information operations, and law enforcement communities have…
This paper applies a narrative weaponization model to decision making (Observe, Orient, Decide, and Act), using Iranian disinformation as the case study. Papers 23 and 24 did the same with disinformation from China and Russia. Understanding how narrative strategy works in the information environment is key to detecting and countering disinformation.
Militaries must be prepared to conduct so-called “great power competition” as well as wars large and small that are just as complex. In all cases, we need to implement superior strategy to defeat clever competitors.
This sortie is a follow-on to ICSL Paper #28, which showed how critical-thinking errors lead to exploitation. Our focus here is on freely available platforms and programs that can track and destroy disinformation.
Disinformation is a global threat. Pervasive digital technology and social media provide rich opportunities to distort public perceptions at scale. Authoritarians assail democracies incessantly.
Authoritarian states are weaponizing supply chains into all-effects warfare while democratic states compete with inferior strategies. We can be more competitive and wage superior complex warfare in kind.
Supply chains are vital to socio-economic well-being and military success. They have become arenas where authoritarians wage complex warfare while democracies compete with inferior strategies.
As a follow-on to our paper on China’s strategy, we show how Russia’s use of narrative reorients decisions in the Observe-Orient-Decide-Act (OODA) Loop. The distortion of information is not just divisive; it envelops the “when-deterrence-fails” US approach to warfare.
Colonel John Boyd’s OODA Loop—Observe, Orient, Decide, and Act—is a powerful model for making decisions in contested environments. Strategic use of information can defeat it. Understanding narrative strategies can protect it.
This Note paraphrases today’s webinar from the Alliance for Securing Democracy on Hamilton 2.0, a dashboard tracking Russian, and now Chinese, disinformation.
Warfare has become all-domain, all-effects, and all-information. This reality thrives in a comfort zone outside our entrenched concept of a “threshold of armed conflict.”
State-sponsored cyber attacks against critical infrastructure are increasingly pervasive. Their global presence and effective methods are asymmetric, coercive, and debilitating.
We must also seek solutions that limit the effects of disinformation. This effort starts with leaders recognizing and publishing Russian exploits as they are discovered. Overt exposure of Russian methodology goes a long way in limiting the effectiveness of false narratives. Investigations should identify who is targeted in hacks, why they were chosen as targets, what information has been stolen, and the extent of related penetration.
The question of what and whom to trust applies to all situations because uncertainty is pervasive. In the information environment (IE), the overriding context of trust is that it’s contested. Actors fight for the kind of information and people they need to compete and prevail. Four types of competition become apparent when we consider four contested purposes of strategic and anticipatory analysis: