Prevailing in an operational environment does not matter if one loses in the information environment. Vietnam Workers' Party nationalists understood this. Taliban religious fundamentalists understand this. Why do we not, and what should we do about it?
First, we need to face four facts:
From those facts, it’s clear that winning wars in any environment requires completing the other half of the job: informatizing operations. That is, operations need to create superior information effects to win more than military engagements.
A fundamental reason for failing to win “forever wars” is that we don’t recognize them as information wars. And it doesn’t help that our Department of Defense definitions of “information” (pp. 104-105) are logical fallacies. The definitions are tautologies that use information to define information:
This conceptual failing is an operational failing with strategic significance. Why?
The way we define information needs to be testable so decision makers can assess the effectiveness of operations and campaigns, which is the whole point of waging them. As in any scientific theory, hypotheses have to be disprovable, not merely confirmable. Without a falsifiable definition of information, all of the above definitions related to information justify themselves. This contributes to confusion over the value of information and reinforces the ill-conceived notion that information is distinctly separate from operations.
For instance, prominent military leaders have pointed out the limits to killing or capturing your way to victory, while others contend that “it takes killing with speed and sustained effects to win wars.” Advocates of the former tend to recognize the continuity of conflict and argue for effective rules of engagement or more comprehensive engagement. Advocates of the latter are inclined to make sharper distinctions between war and peace.[1]
Assessment itself tends to be politicized into generalities that selectively ignore what works and what doesn’t work in the battle-space. This timeless tendency has more impact today due to the scope and scale of the global information environment. Despite the persistent Clausewitzian characterization that “everything in war is very simple, but the simplest thing is difficult,”[2] today’s wars are not simple and are difficult. Winning them requires superior combinations of diplomatic, informational, military, economic, and social effects. Strategists at least need an assessable definition of information.
In ICSL Paper #9, we used Robert Losee’s definition of information because it can be assessed: “the values of characteristics in the output of processes.” This paper expands that definition to include more than the outputs of processes. The inputs of processes, and how change occurs, are also important to a competitive strategy.
So we define “process” generally in terms of three concepts: inputs, change, and outputs:
Our broadened definition of information becomes:
Information: the values of characteristics in the input, change, or output of processes.
Any process has characteristics to which various actors assign values. A physical targeting process typically values location, time, speed, lethality, and duration. Psychological targeting tends to value how processes are perceived and acquire meaning. Both types of targeting are needed to achieve operational priorities that last over time.
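To make this concrete, here is a minimal sketch in Python of how an analyst might represent a process and the values that different actors place on its characteristics. Everything in it (the class, the function, the characteristic names, and the weights) is a hypothetical illustration of the definition, not a doctrinal model.

```python
from dataclasses import dataclass

@dataclass
class Process:
    """A process in this paper's sense: inputs, a change, and outputs."""
    inputs: dict    # characteristics of what goes in, e.g. {"time_on_target": "0300Z"}
    change: str     # how energy transforms inputs into outputs
    outputs: dict   # characteristics of what comes out, e.g. {"duration_min": 12}

def information(process: Process, valuation: dict) -> dict:
    """Information: the values an actor assigns to characteristics found in
    the input, change, or output of a process."""
    characteristics = {**process.inputs, "change": process.change, **process.outputs}
    return {name: (characteristics[name], value)
            for name, value in valuation.items() if name in characteristics}

# Hypothetical strike process observed by two different actors.
strike = Process(
    inputs={"target_location": (37.5, 127.0), "time_on_target": "0300Z"},
    change="kinetic strike",
    outputs={"duration_min": 12, "perceived_as": "indiscriminate"},
)
physical_actor = {"target_location": 0.9, "time_on_target": 0.8, "duration_min": 0.6}
psychological_actor = {"perceived_as": 1.0}

print(information(strike, physical_actor))       # values physical characteristics
print(information(strike, psychological_actor))  # values how the process is perceived
```

Running it shows the same process yielding different information for the physical-targeting actor and the psychological-targeting actor, because each values different characteristics.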
So, what’s not information in this definition? If a phenomenon is not related to a process (input, change, output), then for our purposes it’s not information; it’s data that lacks useful context. Examples of non-processes include data that is arbitrary or without order, and a permanently static condition. A chaotic condition or state of nature with no discernible patterns lacks information, since we have defined information in terms of a process. Closed systems that exclude outside inputs are a mix of controlled data with limited information, such as programmed behavior that follows set rules or principles, or belief systems that resist changes to fixed interpretations. Why define information this way?
A process-oriented definition of information is important because we observe that global information is dynamic, uncertain, and subject to multiple meanings. Decision makers need information on all sorts of processes to discern the effects of operations. Measures of performance and effectiveness are particularly challenging in diverse contexts. Consider one of the most sought-after effects that many associate with security: “stability.”
US joint military doctrine defines stability in terms of economic and political instability. Instability has been largely interpreted to mean violent conflict (p. ix). The predominant approach is to generate activities that combine “defeat mechanisms” (violence against combatants) with civ-mil operations to compel, control, influence and support non-violence. The current doctrinal frames of reference for stability are two-fold and too narrow.
First, doctrine prescribes offensive, defensive and stability operations/actions (stability “operations” were recently renamed stability “actions” to clarify missions, activities and tasks).
The offensive has long been a principle of war. Its application typically involves movement to contact, attack, exploitation and pursuit. The purpose of the offensive is to destroy a threat. The problem is that despite this narrow effect, offensive operations are expected to be broadly decisive.
Defensive operations are expected to protect the ability of a commander to conduct offensive operations. The problem is that defensive operations are not meant to prevent many of the undesired effects related to offensive operations.
Stability actions are essentially armed nation-building until host-nation civilian agencies can assume that role. The problem is that their broad effects across governance, the economy and society exceed military capability and authorizations.
The overall problem is that the use of offense, defense and nation-building to destroy threats and construct a non-violent and therefore “stable” end-state is limited to operational environments where changing the physical state is most important. This narrow assumption does not fit much of warfare today.
The reason for this misfit is that the information environment permeates all operational environments. The information environment is highly contested and produces psychological effects. Any physical effect in any operational environment generates, and is influenced by, psychological effects.
To appreciate how physical effects cannot be separated from psychological effects, consider an extreme example—thermonuclear mass destruction. Nuclear deterrence is based on mutual assured destruction among nuclear powers. While some see this predicament as stable nuclear non-violence, others see an unstable nuclear balance of terror. Why does this difference matter operationally?
Let’s apply our definition of information. Stating the condition of nuclear deterrence in terms of two key characteristics, we get:
Together, the two characteristics are valued as information and exploited by operations in different ways by different actors in different contexts. The inability of a superior nuclear or conventional offense to win “small,” disinformation, or hybrid wars indicates the importance of the information environment. If operations do not create desirable psychological effects in targeted audiences, how can they be strategically effective?
Because information power is not simply physical, strategic actors manipulate psychological information to gain advantage. Therefore our assumptions about offensive, defensive and stability operations/actions need to be broadened to achieve rather than concede information advantage. A narrow offense is not decisive in information-rich environments where destruction is more than physical, such as the financial and psychological battle-space, where supply chain and narrative warfare out-flank military firepower. An effective defense must do more than secure the offense’s freedom of maneuver or enable a counter-offensive; information “defense,” for instance, acts as an active offense by proactively deflecting, containing and destroying malware and narratives. Stability requires more than the absence of violence. In most circumstances, stability requires setting conditions for non-violent effects, such as preventing foreign interference in domestic elections.
The second doctrinal shortcoming with respect to stability is the “competition continuum” below armed conflict. This concept frames competition as stable non-violence, between cooperation and conflict. Conflict, like instability, presumes the use of violence. The doctrine that introduces the competition continuum calls for an integrated campaign of cooperation, competition and armed conflict. Without explicitly saying so, this integration implies a shared understanding that cooperation and armed conflict are competitive. Both seek to achieve desired outcomes in contested environments. Even a cooperative “blue ocean” business strategy that seeks uncontested market space competes with limited resources and eventual entrants into that space.
There are many types of cooperation and conflict, with and without violence. By broadening the term “conflict” into “confrontation,” we can account for interactions and relationships that are not cooperative, yet do not necessarily use violence. In this larger sense, conflict is a violent form of confrontation. Why does this matter?
There are two strategic advantages in using cooperation and confrontation as endpoints of a competitive continuum.
First, strategists can be charged with anticipating ways that adversaries blend cooperation and confrontation in “armed” and “unarmed” conflict. To the extent we have defined armed conflict in terms of arms (means) rather than effects (ends), we fail to account for threats posed by “unarmed” weapons, such as cyber and information tools that create destructive economic, political and social effects in support of a commander’s intent (an authoritarian one with broad authorities, for instance).
Second, a cooperation-confrontation continuum does not regard conflict as a redline of last resort. Indeed, within the Law of Armed Conflict (LOAC) itself, the use of lethal force is neither a last resort (surprise attacks against combatants) nor unaccompanied by non-lethal confrontation (psyops).
*Note: the law of armed conflict permits the use of force against combatants as a first resort, while most domestic law enforcement and international human rights law restrict the lethal use of force to a last resort (on this distinction, see Yoram Dinstein, p. 705).
Outside of LOAC, non-violent confrontation waged by “non-combatants” includes cyber theft and disinformation. Even if domestic laws do not recognize these as warfare, strategists need to recognize how these activities can shape the battle-space for what is legally recognized as warfare. This brings up the bigger problem.
Determining the effectiveness of publicly accountable operations depends upon their purposes. The purposes need to be specified in clear terms: what to prevent, and what to cause?
Unless the effects are specified, our concepts of offense, defense and stability promote circular justifications of operations. Here are two reasons. First, vague purposes are easily construed differently across and within organizations. Second, operations whose purposes are specified are likely to be more restrained to avoid conflict than adversaries are. These tend to be policy decisions, so what is a strategist to do?
Strategists can specify vague purposes, such as stability, by refining the process of strategy (ends, ways, means) across three levels of effects: the tactical level of engagements, the operational level of campaigns, and the strategic level of policy priorities.
The approach involves three steps.
To organize a strategic process, we align responsibilities across functions to collaborate with otherwise vertically inclined bureaucracies. Examples of collaborative networking include operations centers, virtual groups and meetings that bring expertise together. We need to align a variety of analytical approaches and assumptions. Align a variety? The benefits of aligning diversity become clear when we consider two basic sources of diverse assessments.
Assessments of threat intentions and capabilities vary among thoughtful participants. Commanders, intelligence communities and individual analysts assess threats differently depending on the relative weight they assign to historical and possible patterns, anomalies and innovation. Consider the following information related to the Korean peninsula:
Another source of different threat assessments, one that is doctrinally overlooked, arises from multiple threat actors exchanging different interests. This can happen with or without assumed common threats. How?
Regional military exercises conducted by China and Russia in the South China Sea and in the Baltic Sea may not indicate a Russian commitment against Japan-US or a Chinese commitment against NATO-US. Rather, the collaboration may be a cooperative and confrontational relationship among competing Chinese and Russian military, economic and political interests, such as increased Russian arms sales to China, which are in decline; higher Chinese prices for Russian oil, which has not happened; and political support for territorial claims, which has happened.
What is the relevance of this complexity for the strategist? Ends, ways and means are subject to security bargains among stakeholders, rather than aligned to a common threat.
Our second step in organizing strategy is to value the inputs, changes and outputs of operations as tactical, operational, and strategic according to the relative significance of effects. That is, tactical effects are important to the extent that they help produce operational or strategic effects. This assumption is different from valuing tactical, operational and strategic functions as important for their own expertise (“it’s what we do because it’s what we train for”), or to create unit identity (“it’s what we do because it’s who we are”). Our overriding assumption that strategy should be organized to desired effects rests on the quality of strategic policy. Depending on our leadership, this can be a strength or a vulnerability.
The quality of information is contested and is directly relevant to operations because the processes of operations are ultimately information. Information may be physical, as in the chemical properties of matter that determine whether we are working with a solid, liquid or gas, or a combination such as plasma that involves less visible electro-magnetic forces. The information may be psychological, as in perceptions, ideas and expressions of will. These interact with physical matter such as neural networks to produce cognition, skills, and behavior.
Psychological effects tend to be more persistent than physical effects.
Third, taking strategy as a process that’s valued according to tactical, operational, and strategic levels of significance, we can implement a hierarchy of effort. This effort is not simply a top-down process of aligning strategic priorities, end state, objectives, effects, and activities. The connectedness of information matters.
Discerning the strategic intentions of relevant actors and recognizing outputs as information are key to orchestrating a hierarchy of effort. The hierarchy is a network, which requires setting the parameters of ends, ways and means so that compliant strategies develop at multiple levels. The strategies compete with adversary strategies in terms of trying to achieve holistic synergy, agile change, and asymmetric advantage. If the adversary’s strategy is more holistic than ours, we can be enveloped. If the adversary’s strategy is more agile, we become reactive at best. If the adversary’s strategy is more asymmetric, we become less efficient and ultimately less effective over time.
To achieve the causes and effects in the hierarchy of effort, a strategist needs to set desired conditions. Conditions are the outcomes of operations, and are more than military. If we are to have a competitive strategy, we need a broad concept of the environment that includes all operations in any domain.
The operational environment is also an information environment because operations generate information that influences behavior. The information is interactive, largely via non-linear processes where small inputs create disproportionately large outputs. Higher-level officials should be made aware of potential nth-order effects. Strategy has to be anticipated and orchestrated.
So we can refer to the operational and information environment as a strategic system. The analogy of an ecosystem explains the value of this perspective. An ecosystem refers to interactions among the inhabitants of a surrounding environment. The complexity of the environment, in terms of self-organizing agents that interact, includes randomness with non-linear effects. Similarly, a strategic system characterizes the interactions of ends-ways-means among entities in operational and information environments.
A strategic system in an information environment involves complex interactions among causes and effects. Current doctrinal concepts describe the information environment as a system consisting of physical, informational and cognitive dimensions. By referring to the information environment as strategic, we can update these concepts for all-domain all-effects warfare.
The first key concept is the information environment. The reform is to eliminate the redundant, non-falsifiable concept of an “information environment” consisting of an informational dimension and replace it with the following proposed concept:
The current conceptualization of the information environment can be improved by eliminating the information dimension and overlapping the psychological (includes cognitive) and physical dimensions. Why?
“Informational” refers to how any sentient actor senses inputs, makes changes, and produces outputs, as well as places value on that process. That activity describes psychology or cognitive science, which in contemporary warfare is not necessarily human-centric. These processes involve any sentient actor and any data. Like the physical dimension, the psychological dimension is broadly tangible as determined by sensors of matter. In that sense the psychological dimension overlaps with the physical dimension in having matter, in addition to being data- and sentient actor-centric. Beyond that, psychological cognition and physical matter also overlap because both include human and non-human actors.
To emphasize the importance of any actor placing values on inputs, changes and outputs, it seems more useful to characterize the psychological dimension as objectively data-centric and subjectively sentient actor-centric.
Once values have been attributed for each relevant actor, data becomes information. The above dual-centric psychological dimension reflects a need to understand how both machine-based weapons and bio-weapons process information.
The second key concept is the prevailing perspective of the “operational environment,” which emphasizes systems that affect capabilities and decisions. This view promotes collaboration with partners beyond a joint force commander’s authority, so it is a strategic process. The reform needed is a review of conceptual gaps in civil-military authorities. These gaps need to be filled to ensure both democratic accountability and competitiveness with authoritarian actors.
Internal to the Department of Defense, there is a gap between the competition continuum short of armed conflict and the conflict continuum that equates conflict with war. In both continua, what we really need to do is specify the strategic value of information as it relates to acceptable and unacceptable competition.
Therefore to make sense of this strategic environment, we will treat the hierarchy of effort as two layers: information that gets operationalized into processes; and operations that get informatized with meaning.
When we regard the hierarchy of effort as operations, information becomes subordinate to running operations. We get the culture and the doctrine that we currently have: courses of action consisting of lines of operations (military) and lines of effort (more-than-military) that converge at a “military” end state. But even at the highest level of desired outcomes, strategic priorities, information becomes operationalized. Decision makers want to know what “stability” looks like in terms of operations.
Examples: markets are open; freedom of navigation is unimpeded; elections are freely conducted; violence is reduced. What do we do next? We keep on operationalizing information, running operations to seize the initiative for operational advantage. Informatized operations are a complementary opposite of this.
If we see the hierarchy of effort as a hierarchy of information, then operations are defined by, and subordinate to, the information they input, change, and output. At the highest level of strategic priorities, “stability” for instance has to be clarified by information—informatized. Decision makers want to know what stability looks like in terms of the conditions produced by operations. Political decision makers also want to know about the information consumed and processed by operations. Unlike any single type of operation, information is ubiquitous and an ever-combining effect.
Examples: open markets and freedom of navigation and free elections and reduced violence are at once, disseminated information. An important strategic question is, how will this combined effect interact and what will it cause? If we wait to react to whatever happens, we lose initiative and costs rise. Without action, authoritarians close markets and control navigation and rig elections and systematically use violence, and distort the facts with disinformation.
We have to gain operational advantage and win the information war. The latter requires recognizing that operations is information in a strategic system.
The information environment includes physical and psychological dimensions in which processes—operations—consist of information. There is no meaningful operational environment without information. The information environment acquires meaning with operations. This perspective has the benefit of specifying information as inputs, changes and outputs of processes (operations) which are more or less physical, and more or less psychological. Let’s break this down into elemental components.
Physically, digitized information such as particles and waves can both create new materials such as radioactive isotopes, and destroy them by rearranging their chemical compounds. The value of a creative, destructive, or transformative effect is what strategically matters in an operation. We’ve defined that value as information.
Psychologically, actors’ intentions inform behavior. Humans, animals, plants and AI estimate intent in the environment with sensors. The physics of uncertainty prevents prediction, so there is unexpected behavior. “You can measure the speed of a particle, or you can measure its position, but you can’t measure both…the more precisely you know the position, the less precisely you know its speed” (David Lindley, Uncertainty: Einstein, Heisenberg, Bohr and the Struggle for the Soul of Science, Anchor Publishing, 2008, p. 4).
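In conventional notation, the trade-off Lindley describes is the Heisenberg uncertainty relation, where \( \Delta x \) is the uncertainty in position, \( \Delta p \) the uncertainty in momentum (mass times speed), and \( \hbar \) the reduced Planck constant:

\[ \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2} \]

The smaller one uncertainty becomes, the larger the other must be, which is why fully precise prediction of behavior from sensed inputs is unavailable even in principle.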
Operations use, change and create information. As a physically destructive effect, information might rearrange the matter of a building into vapor and smaller particles. As a psychologically constructive effect, operations create meaningful information in the minds of targeted audiences that influence behavior.
Decision makers need information processes to control operations. Joint All-Domain Command and Control, for instance, seeks to develop a common architecture for putting data into context. How? By collecting, storing and processing data into information. Controlling operations is highly dependent upon information but more importantly, the results of operations are information. When we design a purposeful and accountable strategy that owns what it creates, operations are information.
Decision makers need to judge the relative importance of information according to a strategy of desired outcomes.
Operations depend on, create, and are information. However, our prevailing combined arms approach to warfare focuses on operationalizing information rather than informatizing operations.
Informatizing ops requires a broad, testable, and therefore assessable, definition of information. This is a process-centric definition that includes how energy changes different inputs of matter and ideas into outputs, which are then interpreted. Information is not separate from operations. Information consists of physical and psychological processes; that is, operations. What information is not: process-devoid chaos, data without context, or a static or closed condition.
As illustrated by our mainstay concepts of stability and nuclear deterrence, our Department of Defense tends to ignore psychological effects in favor of physical effects. This is a half-strategy at best. While we operationalize information via exquisite combined arms warfare, we create a huge vulnerability. Adversaries that informatize their operations create psychological effects that override our physical effects.
Who regularly informatizes operations? Ideologues do, such as single-party partisans, religious fundamentalists, left-wing anarchists, and right-wing reactionaries. Practitioners of “talk-fight”[3] or “peace-war”[4] strategies informatize their operations via propaganda and social media. China’s all-encompassing “unrestricted warfare” includes wei shi (protect—deter and attack-coerce).[5] Russian deterrence includes sderzhivanie (restraint) and ustrashenie (intimidation).[6] “Free” media often amplify one side or the other with selective coverage of what is divisive to their business rivals. Such extremism in reporting is an information process that is not confined to one side of the political spectrum.
Instead of using information to define information, the Department of Defense should emphasize information as a definable process: the values of characteristics in the input, change, or output of processes. This definition includes physical and psychological processes that focus on inputs, how energy changes those into outputs, and how the outputs are perceived. This is a falsifiable definition of information because we can look at the evidence of an input-change-output process and determine the extent to which inputs result in different outputs, or not.
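As a minimal sketch of what such a test might look like, assume we can log paired observations of one input characteristic and one output characteristic of a process; the function, data, and threshold below are hypothetical illustrations, not an assessment doctrine.

```python
from itertools import combinations

def inputs_make_a_difference(observations, tolerance=0.0):
    """Given (input, output) observations of one process, return True if at
    least one pair of differing inputs produced outputs that differ by more
    than `tolerance`; return False if varying the input left the output
    unchanged, which would disconfirm the claimed information content."""
    for (in_a, out_a), (in_b, out_b) in combinations(observations, 2):
        if in_a != in_b and abs(out_a - out_b) > tolerance:
            return True
    return False

# Hypothetical example: leaflet drops (thousands) vs. observed defections.
observed = [(0, 3), (10, 4), (50, 21)]
print(inputs_make_a_difference(observed))  # True: different inputs, different outputs
```

If varying the input never changes the output, the claimed information content is disconfirmed; if it does, the claim survives the test.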
Applying this definition to the information environment, joint doctrine should reflect two dimensions of information: the physical and the psychological. The physical dimension is matter-centric. This definitional specificity and broadening allow for various sensors to determine what’s “tangible” and what’s not. If it’s sensed, it’s tangible. The psychological dimension is sentient actor and data-centric. This broadening accounts for AI and biological actors, and hybrids thereof. An information environment consisting of matter, sentient actors, and data is also falsifiable. We can specify inputs, change and outputs and look to disprove as well as confirm their existence.
Therefore we can test and assess the effects of that information. For instance we can sense: (1) if there is matter, or not (we do this well, as we focus on the physical); (2) if there are sentient actors, or not (we can do this, but we ignore relevant actors and how they process information); (3) if there is data, or not (we can do this well, but we ignore psychological data that other actors do not). In this conception of the information environment, it’s possible that matter, sentient actors and data can behave as inputs, change or outputs.
We can apply these definitions and concepts to a strategic systems perspective of the operational environment, using a continuum of cooperation and confrontation. This continuum replaces the separate continua of competition and conflict. While political leaders determine the policy restraints, strategists can informatize operations by specifying desired effects in terms of inputs, changes and outputs. In any given snapshot, effects are outputs; however, they become inputs to other processes in time, whether planned or not.
This strategic process of managing ends, ways and means at tactical, operational and strategic levels (values) of significance is a continuous cycle of cooperation and confrontation. The consideration of both physical and psychological causes and effects is what informatizes operations. This can be done to the extent leaders recognize that operations need to gain information advantage. The bottom line is, operations is information.
Every armed service has its mantra of identity expressed in its real mission statement, its motto. Except for the US Marine Corps, they all claim a domain. Boots on the Ground. Control of the Seas. Always Faithful (Semper Fidelis). Air Superiority. Always Above (Semper Supra). Service identities are quick to point out the dependence of other services on their prime functions as well: “If we lose the war in the air, we lose the war and we lose it quickly” (Field Marshal Bernard L. Montgomery). Despite the shared oaths to the US Constitution, none of these identities inculcate the primacy of information in warfare.
In the not so distant past, ideological Vietnamese nationalists combined physical and psychological effects to exhaust the US in the information environment. A generation later in a different cultural context, ideological Taliban fundamentalists began their strategy, which just achieved the same result.
Given the values of our republic, those charged with ensuring its security need to define and place information in the lead of operations.
If we lose the war of information, we lose every war over time.
[1] See Charles J. Dunlap, “Practitioners and the Law of War Manual,” in Michael A. Newton, ed., The United States Department of Defense Law of War Manual, Cambridge University Press, 2019, pp. 70-74.
[2] Carl Von Clausewitz, On War, Michael Howard and Peter Paret, eds., Princeton University Press, 1976, p. 119.
[3] See chapter 6, “Talking while Fighting,” in Lien-Hang T. Nguyen, Hanoi’s War: An International History of the War for Peace in Vietnam, University of North Carolina Press, 2012.
[4] For examples of friedenkampf, see chapter 19, “Peace-War,” in Thomas Rid, Active Measures: The Secret History of Disinformation and Political Warfare, Farrar, Straus and Giroux, 2020.
[5] Qiao Liang and Wang Xiangsui, Unrestricted Warfare, 1999, https://www.armyupress.army.mil/Journals/Military-Review/English-Edition-Archives/September-October-2019/Precis-Unrestricted-Warfare/.
[6] Kristin Ven Bruusgaard, “Russian Strategic Deterrence,” Survival, Vol. 58, No. 4, 2016, pp. 7-26.