An Insight into the Russian Propaganda Model and How to Counter It

A study by the RAND Corporation – Christopher Paul and Miriam Matthews

About this article

Since its 2008 incursion into Georgia (if not before), there has been a remarkable evolution in Russia’s approach to propaganda. The country has effectively employed new dissemination channels and messages in support of its 2014 annexation of the Crimean peninsula, its ongoing involvement in the conflicts in Ukraine and Syria, and its antagonism of NATO allies. Like a “firehose of falsehood,” the Russian propaganda model is high-volume and multichannel, and it disseminates messages without regard for the truth. It is also rapid, continuous, and repetitive, and it lacks commitment to consistency. These techniques would seem to run counter to the received wisdom for successful information campaigns, but research in psychology supports many of the most successful aspects of the model. Furthermore, the very factors that make the firehose of falsehood effective also make it difficult to counter. Traditional counterpropaganda approaches will likely be inadequate in this context. More effective solutions can be found in the same psychology literature that explains the surprising success of the Russian propaganda model and its messages.

This perspective was sponsored by the Combating Terrorism Technical Support Office and produced in the International Security and Defense Policy Center of the RAND National Defense Research Institute, a federally funded research and development center sponsored by the Office of the Secretary of Defense, the Joint Staff, the Unified Combatant Commands, the Navy, the Marine Corps, the defense agencies, and the defense Intelligence Community.


Since its 2008 incursion into Georgia (if not before), there has been a remarkable evolution in Russia’s approach to propaganda. This new approach was on full display during the country’s 2014 annexation of the Crimean peninsula. It continues to be demonstrated in support of ongoing conflicts in Ukraine and Syria and in pursuit of nefarious and long-term goals in Russia’s “near abroad” and against NATO allies.

In some ways, the current Russian approach to propaganda builds on Soviet Cold War–era techniques, with an emphasis on obfuscation and on getting targets to act in the interests of the propagandist without realizing that they have done so. In other ways, it is completely new and driven by the characteristics of the contemporary information environment. Russia has taken advantage of technology and available media in ways that would have been inconceivable during the Cold War. Its tools and channels now include the Internet, social media, and the evolving landscape of professional and amateur journalism and media outlets.

We characterize the contemporary Russian model for propaganda as “the firehose of falsehood” because of two of its distinctive features: high numbers of channels and messages and a shameless willingness to disseminate partial truths or outright fictions. In the words of one observer, “[N]ew Russian propaganda entertains, confuses and overwhelms the audience.”

Contemporary Russian propaganda has at least two other distinctive features. It is also rapid, continuous, and repetitive, and it lacks commitment to consistency.

Interestingly, several of these features run directly counter to the conventional wisdom on effective influence and communication from government or defense sources, which traditionally emphasize the importance of truth, credibility, and the avoidance of contradiction. Despite ignoring these traditional principles, Russia seems to have enjoyed some success under its contemporary propaganda model, either through more direct persuasion and influence or by engaging in obfuscation, confusion, and the disruption or diminution of truthful reporting and messaging.

We offer several possible explanations for the effectiveness of Russia’s firehose of falsehood. Our observations draw from a concise, but not exhaustive, review of the literature on influence and persuasion, as well as experimental research from the field of psychology. We explore the four identified features of the Russian propaganda model and show how and under what circumstances each might contribute to effectiveness. Many successful aspects of Russian propaganda have surprising foundations in the psychology literature, so we conclude with a brief discussion of possible approaches from the same field for responding to or competing with such an approach.

Russian Propaganda Is High-Volume and Multichannel

Russian propaganda is produced in incredibly large volumes and is broadcast or otherwise distributed via a large number of channels. This propaganda includes text, video, audio, and still imagery propagated via the Internet, social media, satellite television, and traditional radio and television broadcasting. The producers and disseminators include a substantial force of paid Internet “trolls” who also often attack or undermine views or information that runs counter to Russian themes, doing so through online chat rooms, discussion forums, and comments sections on news and other websites. Radio Free Europe/Radio Liberty reports that “there are thousands of fake accounts on Twitter, Facebook, LiveJournal, and VKontakte” maintained by Russian propagandists. According to a former paid Russian Internet troll, the trolls are on duty 24 hours a day, in 12-hour shifts, and each has a daily quota of 135 posted comments of at least 200 characters.

RT (formerly Russia Today) is one of Russia’s primary multimedia news providers. With a budget of more than $300 million per year, it broadcasts in English, French, German, Spanish, Russian, and several Eastern European languages. The channel is particularly popular online, where it claims more than a billion page views. If true, that would make it the most-watched news source on the Internet. In addition to acknowledged Russian sources like RT, there are dozens of proxy news sites presenting Russian propaganda, but with their affiliation with Russia disguised or downplayed. Experimental research shows that, to achieve success in disseminating propaganda, the variety of sources matters:

  • Multiple sources are more persuasive than a single source, especially if those sources contain different arguments that point to the same conclusion.
  • Receiving the same or similar message from multiple sources is more persuasive.
  • People assume that information from multiple sources is likely to be based on different perspectives and is thus worth greater consideration.

The number and volume of sources also matter:

  • Endorsement by a large number of users boosts consumer trust, reliance, and confidence in the information, often with little attention paid to the credibility of those making the endorsements.
  • When consumer interest is low, the persuasiveness of a message can depend more on the number of arguments supporting it than on the quality of those arguments.

Finally, the views of others matter, especially if the message comes from a source that shares characteristics with the recipient:

  • Communications from groups to which the recipient belongs are more likely to be perceived as credible. The same applies when the source is perceived as similar to the recipient. If a propaganda channel is (or purports to be) from a group the recipient identifies with, it is more likely to be persuasive.
  • Credibility can be social; that is, people are more likely to perceive a source as credible if others perceive the source as credible. This effect is even stronger when there is not enough information available to assess the trustworthiness of the source.
  • When information volume is low, recipients tend to favor experts, but when information volume is high, recipients tend to favor information from other users.
  • In online forums, comments attacking a proponent’s expertise or trustworthiness diminish credibility and decrease the likelihood that readers will take action based on what they have read.

The experimental psychology literature suggests that, all other things being equal, messages received in greater volume and from more sources will be more persuasive. Quantity does indeed have a quality all its own. High volume can deliver other benefits that are relevant in the Russian propaganda context. First, high volume can consume the attention and other available bandwidth of potential audiences, drowning out competing messages. Second, high volume can overwhelm competing messages in a flood of disagreement. Third, multiple channels increase the chances that target audiences are exposed to the message. Fourth, receiving a message via multiple modes and from multiple sources increases the message’s perceived credibility, especially if a disseminating source is one with which an audience member identifies.

Russian Propaganda Is Rapid, Continuous, and Repetitive

Contemporary Russian propaganda is continuous and very responsive to events. Due to their lack of commitment to objective reality (discussed later), Russian propagandists do not need to wait to check facts or verify claims; they just disseminate an interpretation of emergent events that appears to best favor their themes and objectives. This allows them to be remarkably responsive and nimble, often broadcasting the first “news” of events (and, with similar frequency, the first news of nonevents, or things that have not actually happened). They will also repeat and recycle disinformation. The January 14, 2016, edition of Weekly Disinformation Review reported the reemergence of several previously debunked Russian propaganda stories, including that Polish President Andrzej Duda was insisting that Ukraine return former Polish territory, that Islamic State fighters were joining pro-Ukrainian forces, and that there was a Western-backed coup in Kiev, Ukraine’s capital.

Sometimes, Russian propaganda is picked up and rebroadcast by legitimate news outlets; more frequently, social media repeats the themes, messages, or falsehoods introduced by one of Russia’s many dissemination channels. For example, German news sources rebroadcast Russian disinformation about atrocities in Ukraine in early 2014, and Russian disinformation about EU plans to deny visas to young Ukrainian men was repeated with such frequency in Ukrainian media that the Ukrainian general staff felt compelled to post a rebuttal.

The experimental psychology literature tells us that first impressions are very resilient: An individual is more likely to accept the first information received on a topic and then favor this information when faced with conflicting messages. Furthermore, repetition leads to familiarity, and familiarity leads to acceptance:

  • Repeated exposure to a statement has been shown to increase its acceptance as true.
  • The “illusory truth effect” is well documented, whereby people rate statements as more truthful, valid, and believable when they have encountered those statements previously than when they are new statements.
  • When people are less interested in a topic, they are more likely to accept familiarity brought about by repetition as an indicator that the information (repeated to the point of familiarity) is correct.
  • When processing information, consumers may save time and energy by using a frequency heuristic, that is, favoring information they have heard more frequently.
  • Even with preposterous stories and urban legends, those who have heard them multiple times are more likely to believe that they are true.
  • If an individual is already familiar with an argument or claim (has seen it before, for example), they process it less carefully, often failing to discriminate weak arguments from strong arguments.

Russian propaganda has the agility to be first, which affords propagandists the opportunity to create the first impression. Then, the combination of high-volume, multichannel, and continuous messaging makes Russian themes more likely to be familiar to their audiences, which gives them a boost in terms of perceived credibility, expertise, and trustworthiness.

Russian Propaganda Makes No Commitment to Objective Reality

It may come as little surprise that the psychology literature supports the persuasive potential of high-volume, diverse channels and sources, along with rapidity and repetition. These aspects of Russian propaganda make intuitive sense. One would expect any influence effort to enjoy greater success if it is backed by a willingness to invest in additional volume and channels and if its architects find ways to increase the frequency and responsiveness of messages. This next characteristic, however, flies in the face of intuition and conventional wisdom, which can be paraphrased as “The truth always wins.”

Contemporary Russian propaganda makes little or no commitment to the truth. This is not to say that all of it is false. Quite the contrary: It often contains a significant fraction of the truth. Sometimes, however, events reported in Russian propaganda are wholly manufactured, like the 2014 social media campaign to create panic about an explosion and chemical plume in St. Mary’s Parish, Louisiana, that never happened. Russian propaganda has relied on manufactured evidence—often photographic. Some of these images are easily exposed as fake due to poor photo editing, such as discrepancies of scale, or the availability of the original (pre-altered) image. Russian propagandists have been caught hiring actors to portray victims of manufactured atrocities or crimes for news reports (as was the case when Viktoria Schmidt pretended to have been attacked by Syrian refugees in Germany for Russia’s Zvezda TV network), or faking on-scene news reporting (as shown in a leaked video in which “reporter” Maria Katasonova is revealed to be in a darkened room with explosion sounds playing in the background rather than on a battlefield in Donetsk when a light is switched on during the recording).

In addition to manufacturing information, Russian propagandists often manufacture sources. Russian news channels, such as RT and Sputnik News, are more like a blend of infotainment and disinformation than fact-checked journalism, though their formats intentionally take the appearance of proper news programs. Russian news channels and other forms of media also misquote credible sources or cite a more credible source as the origin of a selected falsehood. For example, RT stated that blogger Brown Moses (a staunch critic of Syria’s Assad regime whose real name is Eliot Higgins) had provided analysis of footage suggesting that chemical weapon attacks on August 21, 2013, had been perpetrated by Syrian rebels. In fact, Higgins’s analysis concluded that the Syrian government was responsible for the attacks and that the footage had been faked to shift the blame. Similarly, several scholars and journalists, including Edward Lucas, Luke Harding, and Don Jensen, have reported that books that they did not write—and containing views clearly contrary to their own—had been published in Russian under their names. “The Kremlin’s spin machine wants to portray Russia as a besieged fortress surrounded by malevolent outsiders,” said Lucas of his misattributed volume, How the West Lost to Putin.

Why might this disinformation be effective? First, people are often cognitively lazy. Due to information overload (especially on the Internet), they use a number of different heuristics and shortcuts to determine whether new information is trustworthy. Second, people are often poor at discriminating true information from false information—or remembering that they have done so previously. The following are a few examples from the literature:

  • In a phenomenon known as the “sleeper effect,” low-credibility sources manifest greater persuasive impact with the passage of time. While people make initial assessments of the credibility of a source, in remembering, information is often dissociated from its source. Thus, information from a questionable source may be remembered as true, with the source forgotten.
  • Information that is initially assumed valid but is later retracted or proven false can continue to shape people’s memory and influence their reasoning.
  • Even when people are aware that some sources (such as political campaign rhetoric) have the potential to contain misinformation, they still show a poor ability to discriminate between information that is false and information that is correct.

Familiar themes or messages can be appealing even if these themes and messages are false. Information that connects with group identities or familiar narratives—or that arouses emotion—can be particularly persuasive. The literature describes the effects of this approach:

  • Someone is more likely to accept information when it is consistent with other messages that the person believes to be true.
  • People suffer from “confirmation bias”: They view news and opinions that confirm existing beliefs as more credible than other news and opinions, regardless of the quality of the arguments.
  • Someone who is already misinformed (that is, believes something that is not true) is less likely to accept evidence that goes against those misinformed beliefs.
  • People whose peer group is affected by an event are much more likely to accept conspiracy theories about that event.
  • Stories or accounts that create emotional arousal in the recipient (e.g., disgust, fear, happiness) are much more likely to be passed on, whether they are true or not.
  • Angry messages are more persuasive to angry audiences.

False statements are more likely to be accepted if backed by evidence, even if that evidence is false:

  • The presence of evidence can override the effects of source credibility on perceived veracity of statements.
  • In courtroom simulations, witnesses who provide more details—even trivial details—are judged to be more credible.

Finally, source credibility is often assessed based on “peripheral cues,” which may or may not conform to the reality of the situation. A broadcast that looks like a news broadcast, even if it is actually a propaganda broadcast, may be accorded the same degree of credibility as an actual news broadcast. Findings from the field of psychology show how peripheral cues can increase the credibility of propaganda:

  • Peripheral cues, such as the appearance of expertise or the format of information, lead people to accept—with little reflection—that the information comes from a credible source.
  • Expertise and trustworthiness are the two primary dimensions of credibility, and these qualities may be evaluated based on visual cues, such as format, appearance, or simple claims of expertise.
  • Online news sites are perceived as more credible than other online formats, regardless of the veracity of the content.

The Russian firehose of falsehood takes advantage of all five of these factors. A certain proportion of falsehood in Russian propaganda may just be accepted by audiences because they do not recognize it as false or because various cues lead them to assign it greater credibility than they should. This proportion actually increases over time, with people forgetting that they have rejected certain offered “facts.” The proportion of falsehoods accepted increases even more when the disinformation is consistent with narratives or preconceptions held by various audiences. Where evidence is presented or seemingly credible sources disseminate the falsehoods, the messages are even more likely to be accepted. This is why Russian faux-news propaganda channels, such as RT and Sputnik, are so insidious. Visually, they look like news programs, and the persons appearing on them are represented as journalists and experts, making audience members much more likely to ascribe credibility to the misinformation these sources are disseminating.

Russian Propaganda Is Not Committed to Consistency

The final distinctive characteristic of Russian propaganda is that it is not committed to consistency. First, different propaganda media do not necessarily broadcast the exact same themes or messages. Second, different channels do not necessarily broadcast the same account of contested events. Third, different channels or representatives show no fear of “changing their tune.” If one falsehood or misrepresentation is exposed or is not well received, the propagandists will discard it and move on to a new (though not necessarily more plausible) explanation. One example of such behavior is the string of accounts offered for the downing of Malaysia Airlines Flight 17. Russian sources have offered numerous theories about how the aircraft came to be shot down and by whom, very few of which are plausible. Lack of commitment to consistency is also apparent in statements from Russian President Vladimir Putin. For example, he first denied that the “little green men” in Crimea were Russian soldiers but later admitted that they were. Similarly, he at first denied any desire to see Crimea join Russia, but then he admitted that that had been his plan all along.

Again, this flies in the face of the conventional wisdom on influence and persuasion. If sources are not consistent, how can they be credible? If they are not credible, how can they be influential? Research suggests that inconsistency can have deleterious effects on persuasion—for example, when recipients make an effort to scrutinize inconsistent messages from the same source. However, the literature in experimental psychology also shows that audiences can overlook contradictions under certain circumstances:

  • Contradictions can prompt a desire to understand why a shift in opinion or messages occurred. When a seemingly strong argument for a shift is provided or assumed (e.g., more thought is given or more information is obtained), the new message can have a greater persuasive impact.
  • When a source appears to have considered different perspectives, consumer attitudinal confidence is greater. A source who changes his or her opinion or message may be perceived as having given greater consideration to the topic, thereby influencing recipient confidence in the newest message.

Potential losses in credibility due to inconsistency are potentially offset by synergies with other characteristics of contemporary propaganda. As noted earlier in the discussion of multiple channels, the presentation of multiple arguments by multiple sources is more persuasive than either the presentation of multiple arguments by one source or the presentation of one argument by multiple sources. These losses can also be offset by peripheral cues that reinforce perceptions of credibility, trustworthiness, or legitimacy. Even if a channel or individual propagandist changes accounts of events from one day to the next, viewers are likely to evaluate the credibility of the new account without giving too much weight to the prior, “mistaken” account, provided that there are peripheral cues suggesting the source is credible.

While the psychology literature suggests that the Russian propaganda enterprise suffers little when channels are inconsistent with each other, or when a single channel is internally inconsistent, it is unclear how inconsistency accumulates for a single prominent figure. Although inconsistent accounts by different propagandists on RT, for example, might be excused as the views of different journalists or as changes due to updated information, the fabrications of Vladimir Putin have been unambiguously attributed to him, which cannot be good for his personal credibility. Of course, perhaps many people have a low baseline expectation of the veracity of statements by politicians and world leaders. To the extent that this is the case, Putin’s fabrications, though more brazen than the routine, might be perceived as just more of what is expected from politicians in general and might not constrain his future influence potential.

What Can Be Done to Counter the Firehose of Falsehood?

Experimental research in psychology suggests that the features of the contemporary Russian propaganda model have the potential to be highly effective. Even those features that run counter to conventional wisdom on effective influence (e.g., the importance of veracity and consistency) receive some support in the literature.

If the Russian approach to propaganda is effective, then what can be done about it? We conclude with a few thoughts about how NATO, the United States, or other opponents of the firehose of falsehood might better compete. The first step is to recognize that this is a nontrivial challenge. Indeed, the very factors that make the firehose of falsehood effective also make it quite difficult to counter: For example, the high volume and multitude of channels for Russian propaganda offer proportionately limited yield if one channel is taken off the air (or offline) or if a single misleading voice is discredited. The persuasive benefits that Russian propagandists gain from presenting the first version of events (which then must be dislodged by true accounts at much greater effort) could be removed if the true accounts were instead presented first. But while credible and professional journalists are still checking their facts, the Russian firehose of falsehood is already flowing: It takes less time to make up facts than it does to verify them.

We are not optimistic about the effectiveness of traditional counterpropaganda efforts. Certainly, some effort must be made to point out falsehoods and inconsistencies, but the same psychological evidence that shows how falsehood and inconsistency gain traction also tells us that retractions and refutations are seldom effective. Especially after a significant amount of time has passed, people will have trouble recalling which information they have received is the disinformation and which is the truth. Put simply, our first suggestion is don’t expect to counter the firehose of falsehood with the squirt gun of truth.

To the extent that efforts to directly counter or refute Russian propaganda are necessary, there are some best practices available, also drawn from the field of psychology, that can and should be employed. Three factors have been shown to increase the (limited) effectiveness of retractions and refutations: (1) warnings at the time of initial exposure to misinformation, (2) repetition of the retraction or refutation, and (3) corrections that provide an alternative story to help fill the resulting gap in understanding when false “facts” are removed.

Forewarning is perhaps more effective than the retraction or refutation of propaganda that has already been received. The research suggests two possible avenues:

  • Propagandists gain advantage by offering the first impression, which is hard to overcome. If, however, potential audiences have already been primed with correct information, the disinformation finds itself in the same role as a retraction or refutation: disadvantaged relative to what is already known.
  • When people resist persuasion or influence, that act reinforces their preexisting beliefs. It may be more productive to highlight the ways in which Russian propagandists attempt to manipulate audiences, rather than fighting the specific manipulations.

In practice, getting in front of misinformation and raising awareness of misinformation might involve more robust and more widely publicized efforts to “out” Russian propaganda sources and the nature of their efforts. Alternatively, it could take the form of sanctions, fines, or other barriers against the practice of propaganda under the guise of journalism. The UK communications regulator, Ofcom, has sanctioned RT for biased or misleading programs, but more is needed. Our second suggestion is to find ways to help put raincoats on those at whom the firehose of falsehood is being directed.

Another possibility is to focus on countering the effects of Russian propaganda, rather than the propaganda itself. The propagandists are working to accomplish something. The goal may be a change in attitudes, behaviors, or both. Identify those desired effects and then work to counter the effects that run contrary to your goals. For example, suppose the goal of a set of Russian propaganda products is to undermine the willingness of citizens in NATO countries to respond to Russian aggression. Rather than trying to block, refute, or undermine the propaganda, focus instead on countering its objective. This could be accomplished through efforts to, for example, boost support for a response to Russian aggression, promote solidarity and identity with threatened NATO partners, or reaffirm international commitments.

Thinking about the problem in this way leads to several positive developments. It encourages prioritization: Do not worry so much about countering propaganda that contributes to effects that are not of concern. This view also opens up the aperture. Rather than just trying to counter disinformation with other information, it might be possible to thwart desired effects with other capabilities—or to simply apply information efforts to redirecting behaviors or attitudes without ever directly engaging with the propaganda. That leads to our third suggestion: Don’t direct your flow of information directly back at the firehose of falsehood; instead, point your stream at whatever the firehose is aimed at, and try to push that audience in more productive directions.

That metaphor and mindset leads us to our fourth suggestion for responding to Russian propaganda: Compete! If Russian propaganda aims to achieve certain effects, it can be countered by preventing or diminishing those effects. Yet, the tools of the Russian propagandists may not be available due to resource constraints or policy, legal, or ethical barriers. Although it may be difficult or impossible to directly refute Russian propaganda, both NATO and the United States have a range of capabilities to inform, influence, and persuade selected target audiences. Increase the flow of persuasive information and start to compete, seeking to generate effects that support U.S. and NATO objectives.

Our fifth and final suggestion for addressing the challenge of Russian propaganda is to use various technical means to turn off (or turn down) the flow. If the firehose of falsehood is being employed as part of active hostilities, or if counterpropaganda efforts escalate to include the use of a wider range of information warfare capabilities, then jamming, corrupting, degrading, destroying, usurping, or otherwise interfering with the ability of the propagandists to broadcast and disseminate their messages could diminish the impact of their efforts. Anything from aggressive enforcement of terms of service agreements with Internet providers and social media services to electronic warfare or cyberspace operations could lower the volume—and the impact—of Russian propaganda.

About the authors

Christopher Paul is a senior social scientist at RAND and a professor at the Pardee RAND Graduate School. He is also an adjunct faculty member in the Center for Economic Development in the Heinz College at Carnegie Mellon University. He focuses on developing methodological competencies for comparative historical and case-study approaches, evaluation research, various forms of quantitative analysis, and survey research. He has published on such topics as insurgency and counterinsurgency, building international partner capacity, and information operations and strategic communication.

Miriam Matthews is a behavioral and social scientist at RAND and a professor at the Pardee RAND Graduate School. She conducts research in the areas of political psychology, international conflict, and diversity and multiculturalism. She has published on the factors that contribute to negative intergroup attitudes, the influence of acculturation ideologies, the effects of threats on political attitudes, and the origins of support for anti-Western jihad.

Copyright © 2019 The Middle East and North Africa Media Monitor.