As a follow-on to our analysis of China’s strategy, we show how Russia’s use of narrative reorients decisions in the Observe-Orient-Decide-Act (OODA) Loop. The distortion of information is not just divisive. It envelops the “when-deterrence-fails” US approach to warfare.
Official Russia’s purpose in weaponizing information is to divide audiences in multiple ways. The divisiveness works psychologically by Persuading (P), Compelling (Cp), Dissuading (Ds) and Deterring (Dt) all the time, at the same time. As we will demonstrate, P Cp Ds Dt envelops a narrower US conception of warfare that is oblivious to information effects. Russia’s narrative creates its combined effect by reorienting audiences’ decisions to attribute negative intent to what is observed (real and contrived).
Against physical strategies of precision destruction, such narratives can maintain many aspects of initiative, such as tempo, momentum, learning, decision, position, and freedom of maneuver. Narratives do this by influencing the will and capability of their targets.
Narratives’ success does not mean greater lethality does not matter. It does mean that the information effects of lethality matter. When information is being contested and lethal capabilities are not, information can be divisively effective. How?
The means of an influential narrative are both cooperative and confrontational (dual), not just one or the other (binary). This duality is effective across diverse contexts. Globally and locally, there are ample opportunities to assure as well as intimidate will, and to enhance as well as neutralize capability. This flexibility to divide creates relative weakness in targeted audiences. From the perspective of the current Russian Federation regime, that weakness is supposed to enhance Russia’s national status and respect.
Once again we use Colonel John Boyd’s OODA Loop—Observe, Orient, Decide, and Act—because it is an accepted decision-making process in both public and private sectors. Its strength is in making fast decisions in contested environments. Faster, however, is not necessarily smarter. Mis-, dis-, and mal-information distort perceptions.
Our focus is on narrative warfare that strategically arranges such information. In manipulative blends of cooperation and confrontation, Russia’s narratives work psychologically. They persuade and compel, and they dissuade and deter, behavior. After demonstrating Russian narrative strategy, we offer recommendations on how to defeat it.
To do this, we use a language of Cooperation and Confrontation drawn from combined effects strategy. As depicted in FIGURE 1, the framework encompasses Psychological Means and Physical Means to Prevent Action and to Cause Action. These actions are the strategy’s effects, ranging from “Dissuade” in the upper left to “Coerce” in the lower right.
In practice, the three distinctions (cooperation–confrontation; psychological–physical; preventive–causative) are spectra of differences. That is, in the real world, we see blends of both end points. We can use these simple distinctions to characterize and shape a much more complex information environment. As we shall see in our concluding section, psychological and physical means can create synergistic combinations of effects.
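To make the structure of this framework easier to see, here is a minimal sketch in Python (our own illustration, not part of the combined effects literature). Only the placement of “Dissuade” in the upper left and “Coerce” in the lower right comes from the description of FIGURE 1 above; every other cell assignment, and all identifiers in the code, are illustrative assumptions.

```python
# A minimal sketch of FIGURE 1's combined-effects space as a lookup table.
# Only "Dissuade" (upper left) and "Coerce" (lower right) are placed by the
# text itself; the remaining cell assignments are illustrative assumptions.
from itertools import product

MODES = ("cooperative", "confrontational")   # the cooperation-confrontation spectrum
MEANS = ("psychological", "physical")        # Psychological Means / Physical Means
ACTIONS = ("prevent", "cause")               # Prevent Action / Cause Action

# Hypothetical placement of the effects named in the paper.
EFFECTS = {
    ("cooperative", "psychological", "prevent"): "Dissuade",    # upper left (stated)
    ("cooperative", "psychological", "cause"): "Persuade",      # assumption
    ("confrontational", "psychological", "prevent"): "Deter",   # assumption
    ("confrontational", "psychological", "cause"): "Compel",    # assumption
    ("cooperative", "physical", "cause"): "Induce",             # assumption
    ("confrontational", "physical", "cause"): "Coerce",         # lower right (stated)
}

def effect(mode: str, means: str, action: str) -> str:
    """Return the effect named for a (mode, means, action) cell, if any."""
    return EFFECTS.get((mode, means, action), "not named in this excerpt")

for mode, means, action in product(MODES, MEANS, ACTIONS):
    print(f"{mode:>15} + {means:>13} means, {action:>7} action -> "
          f"{effect(mode, means, action)}")
```

Because the three distinctions are spectra rather than hard categories, a real characterization of the information environment would score a given message along all three axes rather than dropping it into a single cell.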
Russia’s narrative warfare tries to detonate more effects on rivals than rivals can inflict on Russia. In contrast to China’s single-Party narrative, Russia’s is fragmented and does not offer a better future. So instead of fantasizing a Putin version of Xi’s carefully arranged Chinese Dream, Russian narratives spew information to cultivate confusion. Why?
Internally, Moscow does not have the control over its population that Beijing manages to exert. To split authoritarian hairs, Russia is more of a pseudo-democracy than China. Corruption complicates the flow of information. Putin’s targeting of individuals for influence via organized crime parallels Xi’s “anti-corruption” probes used to purge rivals and leverage syndicates in state-owned enterprises. With less state filtering of information and mass surveillance than in China, Russians can voice more discontent. In response, Russia’s “digital authoritarianism” has become flexible enough to control a connected and engaged civil society. After anti-government protests in 2011, state-run information outlets joined the stream of non-attributed cyber attacks and troll-and-bot pollution.
Externally, pro-Russia criminals exploit grievances of marginalized Russian speakers in former Soviet republics. Russophones residing in free states after the USSR’s demise are more vulnerable than China’s diaspora that fled the PRC’s rise. The Putin regime views the states from the Baltics to Georgia as breakaway territory. From one extreme Eurasianist perspective, the rest of the Caucasus and Central Asian front is Russian space. That message loses purchase in the 80% of Russian territory east of the Urals, where only 25% of Russia’s population lives. Nordic states are resistant to Russian influence domestically but subject to Russian military activity in the Barents Sea.
That leaves Russian enclaves along Russia’s western front to Ukraine as the most receptive to pro-Russia narratives. Estonia, Latvia, Lithuania, Moldova, Belarus, and Ukraine have Russian-speaking minorities whose multiple identities can be exploited. Moscow’s divisive drive to regain control is a response to newly independent states joining NATO to be free of authoritarian domination. Because there are Russians in Russia who are active in domestic politics, the regime can stir up patriotism to deflect democratic reform and incite unrest abroad.
Russia’s production of distracting information is extensive. Kompromat, active measures, and external political interference combine with direct and proxy aggression, as in Georgia (2008) and Ukraine (2014). The COVID-19 pandemic is only the latest opportunity for Russia to deploy disinformation for strategic purposes.
To understand how arranging information can reorient decision-making, we turn to narrative warfare in an OODA Loop.
The following is a synopsis of the OODA Loop and Narrative Strategy discussion from our previous paper. Both OODA and narrative strategy seek to create advantage in an environment.
John Boyd’s OODA Loop began as a fighter pilot’s technique to shoot down enemy aircraft before they did the same to him. OODA had to be fast, accurate, and holistic enough to anticipate changes in the environment:
The Observe and Orient phases are key because they condition the Decide and Act phases—what to do and what not to do. Fixating on what’s observed or not having a focus at all distorts orientation, which is our mental perception of what’s going on. Many factors go into orientation to figure out what the observations mean to us. This creation of a context is where a narrative enters to shape meaning.
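To illustrate how orientation conditions decision and action, the following toy sketch in Python is our own construction under stated assumptions, not Boyd’s formalism and not a description of any real system. The hostile_narrative function is hypothetical; it stands in for a weaponized narrative that attributes negative intent to whatever is observed.

```python
# A toy OODA cycle showing how the Orient phase, when filtered through a
# narrative, conditions what is decided and acted upon downstream.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Orientation:
    meaning: str            # what the observation is taken to mean
    attributed_intent: str  # intent attributed to the actor being observed

def ooda_cycle(observations: List[str],
               orient: Callable[[str], Orientation]) -> List[str]:
    """Run one pass of Observe-Orient-Decide-Act over a list of observations."""
    actions = []
    for observed in observations:                     # Observe
        context = orient(observed)                    # Orient: narrative supplies context
        if context.attributed_intent == "hostile":    # Decide
            actions.append(f"counter '{observed}' ({context.meaning})")   # Act
        else:
            actions.append(f"monitor '{observed}' ({context.meaning})")
    return actions

# A hypothetical weaponized-narrative filter: every observation about a rival
# is assigned negative intent, reorienting decisions toward confrontation.
def hostile_narrative(observed: str) -> Orientation:
    return Orientation(meaning="evidence of elite incompetence and threat",
                       attributed_intent="hostile")

print(ooda_cycle(["curfew announced", "aid shipment delayed"], hostile_narrative))
```

The point of the sketch is the dependency structure: whatever filter is plugged in at the Orient step determines the downstream Decide and Act outputs, which is why a narrative that shapes orientation can redirect the whole loop without touching the observations themselves.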
Narrative strategy is “changing the way power works” via narrative warfare, strategic influence, terrorism and insurgency, violent extremism, radicalization, information warfare, and social media. Dr. Ajit Maan’s recent book, Plato’s Fear, includes a four-step model of how extremist narratives appropriate their audiences’ experiences to frame meaning. That in turn influences decisions and behavior. We apply Maan’s model to show how a narrative is weaponized.
Maan’s Narrative Weaponization Model
A useful way to categorize the selective evidence, faulty cause-and-effect, artificial dichotomies, and false analogies in the Narrative Weaponization Model is types of mis- and dis-information:
Note that the above distinctions are about content, context and connections. Therefore a systems approach to the information environment can account for these differences. The OODA Loop is an open system because Observation includes outside information and unfolding interaction with the environment.
Now we take a look at some data from Hamilton 2.0 to illustrate Russia’s narrative and test it as a weaponized narrative. Another analytical source with which to corroborate this information is EU vs Disinfo.
There is an important difference between what we are about to do as critical thinkers, and what the Prescription above seeks to do to its victims.
We are keeping our observations (the chronology of official Russia information) separate from our orientation (Maan’s Narrative Weaponization Model). With the exception of the chronology, the Narrative Weaponization Model’s components are part of our analytical orientation. Why do we do this?
We take the chronology as observations, not orientation, because our object of analysis is a narrative. Narratives that are not chronological would not fit this definition. This analytical separation allows us to test the fit of the model based on the evidence of what is observed. The italicized words provide the evidence of weaponization, according to this model.
We don’t have to infer Russia’s strategic narrative, thanks to ample studies and evidence. Russia’s narrative is a barrage of multiple messages that EUvsDisinfo describes in terms of five types: elites versus the people; threatened values; lost sovereignty or threatened national identity; threatened collapse; and “hahapropaganda.” The latter involves joking about evidence. This corresponds to the satire or parody type of mis- and dis-information identified by rcmediafreedom.eu.
From this credible description, we can present the overall narrative in the following terms. Any state, company, segment of a population, or individual may be inserted into the blank, depending on the desired target.
______ is an elitist, incompetent threat to stable Eurasian values and secure identities
Next, we look at this narrative for fit as a weapon in terms of Hamilton 2.0’s top 10 Tweets by Retweets (as of 15 April 2020) from official Russian sources, as well as follow-on activities.
Next, we checked the Top 10 Tweets by Likes, only one of which duplicates the Top 10 Tweets by Retweets:
Here are the Broadcasts uploaded to RT America and RT UK, 59 in all, from official Russia on 20 April 2020:
Russia’s narrative as chronology permits inexhaustible insertions of selective evidence and faulty cause-and-effect. Selective evidence promotes faulty cause-and-effect thinking. Our snapshot sample begins with observations that set up subsequent conclusions about Turkey, the US, and Ukraine.
First, the Istanbul Mayor’s criticism of President Erdogan’s surprise imposition of a curfew can be tied to later observations that “prove” the government is out of touch with its people: Kocaeli youth fleeing an apparent police presence and the Interior Minister’s offer of resignation rejected by the President. Adding that sea traffic is halted by the ongoing crisis of COVID-19 amplifies a message of incompetent elitism.
Second, graffiti in Chicago blaming capitalism for spreading the coronavirus can be connected to a subsequent observation that conveys selfish capitalism: President Trump announcing the halt of funding to the WHO.
Third, the fire extinguished in Chernobyl, Ukraine, provides Russia’s state-run media, such as RT and Sputnik, with opportunities to reveal contradictions and coverups by Ukrainian state officials.
Overall, a chronological sequence that excludes counter-evidence reduces cause and effect to circular reasoning. The result is believed truth.
Artificial dichotomies and false analogies reinforce the faulty cause-and-effect thinking above. There seldom is an example of Russia doing some good without a rival doing lots of bad. The openly debated ways that NATO and its partners deal with COVID-19 are manipulated to appear as incompetence and self-interest. By omitting any negative intent with respect to Russia, a misleading contrast implies that Russia does better. To sharpen the contrast, “the entire ecosystem of Russian disinformation” fabricates content. Furthermore, false context can use genuine information made available by cyber theft from Advanced Persistent Threat 28 (APT28).
Timely re-description of the narrative at memorable moments and in vulnerable locations can expand its reach. A calendar of events for each target may be constructed. In February, for instance, two days offer multiple opportunities to Russia.
February 23rd is Defender of the Fatherland Day. This year, reenactment of a World War II battle in the border town of Ivangorod physically demonstrated Russia’s commitment to Russian-speaking peoples. Psychologically, the event re-described that commitment to the city of Narva on Estonia’s side of the border, which is 80% ethnic Russian and only 48% Estonian by citizenship. Russia’s narratives consistently describe those who express non-Russian identities as Russophobes (see Disinfo Review for notorious examples each week).
February 18th was the last day of the 2015 Battle of Debaltseve, in which Ukrainian forces were forced to withdraw by larger and better-equipped Russian forces. Russia went on to assume control of over 7% of Ukraine’s territory. This year, Russian-backed proxies assaulted Ukrainian positions three days before the 5th anniversary of the Minsk II Agreement that called for a ceasefire. The periodic physical escalation maintains Russia’s foothold inside Ukraine to strengthen its voice against NATO or EU integration.
Assessing how well the narrative becomes internalized among various groups requires knowledge of local contexts. In Ivangorod and Narva, Russian media reportedly is consumed mostly by older people, while youth view local websites. In Ukraine, Russia’s re-description of its seizure of territory as something else — protecting Russians in eastern Ukraine — helps maintain an active armed conflict. That rules out NATO membership.
Taking responsibility to blame others is a straightforward matter in the preceding examples of Turkey, the US, and Ukraine. Keeping relative blame away from Russia is more nuanced and depends on the aforementioned local context. The importance of internalization to Russian narrative strategies depends on the usefulness of the targeted population for various issues.
For instance, the Turkish people’s current distrust of the Erdogan government is useful for making Erdogan more susceptible to Putin’s influence regarding Turkish and Russian operations in Syria. A longer-term internalization of pro-Russia feeling in Turkey is neither necessary nor realistic anyway.
Dividing US opinion via polarized narratives that highlight partisanship (e.g., Lee Camp on Sanders endorsing Biden) is enough to draw attention to a divided NATO. Internalizing domestic or international blame is useful for Russia even if Russia receives much of the blame. What is important for Russia is that the divisiveness proliferates Russia’s narrative.
The case of Ukraine, and the Baltics, is where internalization of blame can generate lasting effects using the same issue. Territorial disputes are not easily forgotten. So, getting local segments of the population to care about domestic corruption and incompetence is feasible in non-NATO countries where there are territorial issues. Believing that Russia is less corrupt and more competent is problematic, but irrelevant if enough blame can be heaped on the targeted state and if security assistance is not credible. Russian President Putin continues to blame Ukrainian President Zelensky for failure to implement Minsk II, even as Russia wages proxy war as the “serial violator” of that agreement.
The increased uncertainty of information during the COVID-19 pandemic can be exploited to increase targeted audiences’ identification with some of the many negative aspects of Russia’s narrative. Audiences that are less accustomed to faulty information may be more vulnerable to arguments of an anti-Russia conspiracy. Leaders of some NATO members, such as Latvia, pride themselves on educating the public about pro-Russia media bias.
The presence and absence of active journalism can be exploited. Journalism that scrutinizes government-provided information promotes more disclosure of relevant information. That information can be bent with false context, content and connections. Without active journalism, narratives have less competition in getting people to identify themselves as pro-Russia/anti-NATO-EU and to take action.
Finally, exaggerations can reinforce identification with the pro-Russia narrative, such as “Russia Bringing Masks and Gloves to Estonia, USA Bringing Javelin Missiles,” even though Estonia paid for the protective equipment from China. Add to this the blame that the pandemic will be persistent and that targeted actors are incompetent to handle the problem. That messaging expands opportunities to exploit subsequent mal-, mis-, and dis-information.
Overall, Russia’s information spewage fits the Narrative Weaponization Model, but not for the same purpose as in China’s case. China’s purpose is zero-sum, as in an OODA Loop applied to a life-or-death dogfight. You must buy into China’s narrative, or China’s narrative will buy you out. There is no particular reason to join China’s narrative, and no acceptable alternative to joining China’s relationships on China’s terms. The narrative justifies itself. As explained in Paper #23, this is the subversive purpose of China’s narrative: to replace critical orientation with ideological belief. Observe & Orient fit the narrative; Decide & Act are compliant.
In Russia’s case, the purpose of weaponizing a narrative is to distort the decision-making Loop, rather than to collapse it into a belief (China’s case). We discern this intent in each stage of the model and in the impact on OODA decision-making. The impact is to Persuade and Compel, and Dissuade and Deter. Our final section details how this combined effect works and recommends how to defeat it.
Our overall recommendation is to understand how actors combine superior effects in the information environment. This conclusion is based on how narratives influence decision cycles to produce psychological effects.
Russia’s narrative works on a target audience through a “distort and influence” mechanism:
From this information attack, official Russia seeks to persuade and compel, and dissuade and deter behavior. As mentioned in our introduction, the P Cp Ds Dt combined effect is powerful and flexible across many contexts. How does the narrative influence will and capability this way?
Let’s look at the logic of targeting will and capability. This basic logic applies to most audiences. Referring to FIGURE 1’s set of effects, the narrative can:
The following examples fit several of our samples. The first portion is Russia’s message to the target. The parenthetical portion is Russia’s desired effect, sometimes stated and sometimes not:
The above examples of P Cp Ds Dt, when combined, can be synergistic compared to a rival’s fragmented or partial efforts. Dissuade and Persuade are complementary. Deter is necessary to keep those two effects working below NATO’s threshold of significant response. Induce adds to all three preceding effects.
There are, of course, examples of P Cp Ds Dt that contradict one another, especially if these effects are not planned as a combination but are instead the consequence of combined arms focused on military effects. Add to that a self-inflicted weakness, an unfortunate US policy assumption: warfare starts when deterrence fails.
We hear this sound bite from senior leaders time after time after time. Like a re-description. Our synthesis of OODA Loop decision-making, narrative warfare, and combined effects strategy reveals the narrowness of this predominant assumption. The cost of such obliviousness to information warfare is strategic defeat.
To prevent that, we need to be in the arena.
We close with a practical list of recommendations to defeat Russia’s disinformation:
The fact that slaughter is a horrifying spectacle must make us take war more seriously, but not provide an excuse for gradually blunting our swords in the name of humanity. Sooner or later someone will come along with a sharp sword and hack off our arms.
Carl von Clausewitz, On War, Book IV, Chapter XI.
The fact that disinformation pervades society must make us take information warfare more seriously, but not provide an excuse for relegating the battle of ideas to kinetic victory in the illusion that war is only violence. Sooner or later someone will come along with a powerful narrative and erase our values.
Thomas A. Drohan, International Center for Security and Leadership Paper #24