We must make realistic assumptions within our policy, legal, and ethical restrictions, and proactively plan to win against, and win over, competitors in the AI Age.
All the techniques used in the JMark Services Information Environment Advanced Analysis (IEAA) course apply to supply chain networks. I’ve selected three of those techniques and added a new fourth based on Project Socrates (see endnote xi). Our first step is to characterize the IE to understand the battlespace. We begin with baselines used in IEAA practical…
If democracies are to compete with savvy authoritarians, we need to up our game in the artificial intelligence (AI) information environment (IE), where the out-thought are outfought. Beyond a profession of arms with all-domain military strategy, we need a profession that integrates all effects. Such a profession of effects begins with strategy.
Competitive Strategy
In an…
The Authoritarian Threat
Democracies do not recognize the all-domain, all-effects warfare that authoritarians wage unless there is direct state-sponsored violence. Instead, we wage “when-deterrence-fails” lethal warfare. Authoritarians exploit this blind spot with informatized operations that seize territory, disrupt adversary control, centralize their own control, keep opponents divided, and set the terms for peace. The strategy is competitive, effective…
Net Assessment
The purpose of net assessment is to gain an asymmetric advantage over competitors. US goals generally center on technological superiority. The US Office of Net Assessment, in a rare run of leadership continuity (Andy Marshall, 1973-2015), analyzed strategic competitions and recommended offsets against adversary strengths. Some offsets threatened the mutual vulnerability of Mutual Assured…
This article completes our series on AI-assisted strategy, but with a stronger emphasis on combined effects. I use the language of combined effects strategy, a broader alternative to the prevailing combined-arms paradigm that dominates failed US security strategy. Unlike Papers #42 and #43, which focused on either cooperative or…
Continuing our march through the eight basic combinations of strategy introduced in Paper #39 (The Strategy Cuboid), we focus on confrontational-physical competitions (preventive and causative). We’ll use Savant X Seeker’s hyper-dimensional relationship analysis as a research assistant. The text corpus continues to expand as I add more curated reports and articles. The sample, however, is…
The Strategy Cuboid introduced in Paper #39 offers eight basic combinations of strategy in three dimensions: cooperative-confrontational; psychological-physical; and preventive-causative. We focus here on the two combinations that are cooperative-physical (preventive and causative), such as defense and economic infrastructure. As an exploration of competitive strategies, we’ll use the Savant X Seeker hyper-dimensional relationship analysis platform introduced in…
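To make the Cuboid's combinatorics concrete, here is a minimal sketch that enumerates the eight basic combinations from the three dimensions named above; the dimension labels (posture, effect, timing) are my own shorthand, not terminology from Paper #39.

```python
from itertools import product

# Three dimensions of the Strategy Cuboid (Paper #39).
# The dictionary keys are illustrative labels, not terms from the paper.
DIMENSIONS = {
    "posture": ("cooperative", "confrontational"),
    "effect":  ("psychological", "physical"),
    "timing":  ("preventive", "causative"),
}

# Enumerate all eight basic combinations (2 x 2 x 2).
combinations = list(product(*DIMENSIONS.values()))
for posture, effect, timing in combinations:
    print(f"{posture}-{effect}-{timing}")

# This paper's focus: the cooperative-physical cell, both preventive and causative.
focus = [c for c in combinations if c[:2] == ("cooperative", "physical")]
```

The `focus` filter picks out the two cooperative-physical combinations examined here; the confrontational-physical pair is the subject of the companion paper above.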
Strategy for dynamic end-states must be multi-dimensional to be competitive in the information environment (ICSL Note #22). If operations are not informing and influencing, they become existential rather than instrumental. They become self-justifying, which makes for poor strategy. Yet strategy is the competition that matters most for relevant operations. As we consider the three basic…
Information is foundational to competitive strategy because it permeates technology and cognition in all dimensions. We need to integrate information as operations to win all-effects warfare.
The Problem
Current joint military planning focuses on ways and means in the operational environment for integrating with other national instruments of power.[1] In the Information Age, this wins kinetic engagements but…
The need for a comprehensive approach to strategy that’s relevant to the global information environment is made clear by recent cyber and information attacks. The SolarWinds and Hafnium attacks, launched from US data centers, occurred in a context of persistent disinformation campaigns (Russia, China). Yet the US cyber, info ops, and law enforcement communities have…
This paper applies a narrative weaponization model to decision making (Observe, Orient, Decide, and Act), using Iranian disinformation. Papers #23 and #24 did the same with disinformation from China and Russia. Understanding how narrative strategy works in the information environment is key to detecting and countering disinformation.
Militaries must be prepared to conduct so-called “great power competition” as well as big and small wars that are just as complex. In all cases, we need to implement superior strategy to defeat clever competitors.
This sortie is a follow-on to ICSL Paper #28, which showed how critical thinking errors lead to exploitation. Our focus here is on freely available platforms and programs that can track and destroy disinformation.
Disinformation is a global threat. Pervasive digital technology and social media provide rich opportunities to distort public perceptions at scale. Authoritarians assail democracies incessantly. Comparitech recently discovered a Facebook bot farm that controls nearly 14,000 fake accounts and produces 200,000 posts per month.
Authoritarian states are weaponizing supply chains as part of all-effects warfare while democratic states compete with inferior strategies. We can be more competitive and wage superior complex warfare in kind.
Supply chains are vital to socio-economic well-being and military success. They have become arenas where authoritarians wage complex warfare while democracies compete with inferior strategies.
As a follow-on to our paper on China’s strategy, we show how Russia’s use of narrative reorients decisions in the Observe-Orient-Decide-Act (OODA) Loop. The distortion of information is not just divisive. It envelops the “when-deterrence-fails” US approach to warfare.
Colonel John Boyd’s OODA Loop (Observe, Orient, Decide, and Act) is a powerful model for making decisions in contested environments. Strategic use of information can defeat it. Understanding narrative strategies can protect it.
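As a rough illustration of that vulnerability, the sketch below runs one pass through the loop in which an injected narrative, if it is granted more credibility than ground truth during Orient, ends up driving Decide and Act. The Narrative class and the credibility weighting are illustrative assumptions, not part of Boyd's model.

```python
from dataclasses import dataclass, field

@dataclass
class Narrative:
    # An input competing for trust; credibility is the weight it is granted.
    claim: str
    credibility: float  # 0.0 (rejected) to 1.0 (fully trusted)

@dataclass
class OODAState:
    observations: list = field(default_factory=list)
    orientation: dict = field(default_factory=dict)

def observe(state: OODAState, feeds: list) -> None:
    # Observe: collect inputs, including adversary-injected narratives.
    state.observations.extend(feeds)

def orient(state: OODAState) -> None:
    # Orient: weight each observation; distorted inputs skew the picture
    # in proportion to the credibility they are granted.
    for n in state.observations:
        state.orientation[n.claim] = n.credibility

def decide_and_act(state: OODAState) -> str:
    # Decide and Act: commit to the most credible framing.
    best = max(state.orientation, key=state.orientation.get)
    return f"act on: {best}"

state = OODAState()
observe(state, [Narrative("adversary framing", 0.8),
                Narrative("ground truth", 0.6)])
orient(state)
print(decide_and_act(state))  # the better-trusted framing wins, true or not
```

On this toy reading, defending the loop means scrutinizing the credibility assigned during Orient, which is where narrative strategies aim to do their work.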
This Note paraphrases today’s webinar from the Alliance for Securing Democracy on Hamilton 2.0, a dashboard that tracks Russian, and now Chinese, disinformation.
Warfare has become all-domain, all-effects, and all-information. This reality thrives in a comfort zone outside our entrenched concept of a “threshold of armed conflict.”
State-sponsored cyber attacks against critical infrastructure are increasingly pervasive. Their global presence and effective methods are asymmetric, coercive, and debilitating.
We must also seek solutions that limit the effects of disinformation. This effort starts with leaders recognizing and publicizing Russian exploits as they are discovered. Overt exposure of Russian methodology goes a long way toward limiting the effectiveness of false narratives. Investigations should identify who was targeted in hacks, why they were chosen, what information was stolen, and the extent of related penetration.
The question of what and whom to trust applies to all situations because uncertainty is pervasive. In the information environment (IE), the overriding condition is that trust is contested. Actors fight for the kinds of information and people they need to compete and prevail. Four types of competition become apparent when we consider four contested purposes of strategic and anticipatory analysis: