HF: People are familiar with “fear”-based censorship, where people are punished harshly for saying things that a regime doesn’t like. Yet “friction” and “flooding” are becoming more important. How do they work?
MR: “Friction” and “flooding” are both methods of censorship that reroute users on the Internet without making it obvious that they are being censored. “Friction,” or censorship through inconvenience, uses technology to make information more difficult to find by throttling or blocking websites, removing social media posts, or rearranging search results. Similarly, by “flooding,” or censoring through distraction, governments use armies of people or bots to overwhelm social media platforms with posts that distract from current events. The idea behind both friction and flooding is that because Internet users are for the most part impatient, making some information slightly easier to access and other information slightly more difficult can have a big effect on what users read.
HF: Your experiments suggest that obvious censorship of content available to ordinary members of the public can backfire. Why is this so?
MR: Generally, people don’t like being censored. Visible censorship signals that the government is trying to hide something and can therefore draw people toward the banned information rather than away from it, something commonly known as the “Streisand effect.” Obvious censorship can also incentivize people to find ways around it, a pattern that Will Hobbs and I describe in our recent paper on the Chinese block of Instagram, where many people who had not previously jumped the firewall chose to do so after the government blocked the platform. Of course, visible censorship can and often does produce chilling effects, but the government has to weigh these effects against the possibility of backfire.
HF: Does this help explain why authoritarian governments are turning more toward friction and flooding?
MR: Yes. Before the Internet, censorship through fear could be targeted at journalists and the few others with enough power to speak out in the public sphere. Now anyone with Internet access has a potential megaphone, and this large population is much more difficult to credibly target with fear. So while the Chinese government still uses fear to control influential people like journalists and activists, for most of the public it relies on less visible forms of censorship, like friction and flooding, to control what information people consume.
HF: What kinds of information are authoritarian governments like China most likely to censor?
MR: Authoritarian regimes need support from at least a part of the public to maintain power. When these regimes lose legitimacy in the eyes of the public, the likelihood of protests and resistance increases. People can use information to damage the regime both by maligning it and by coordinating protests against it. In previous work with Jennifer Pan and Gary King, we found that the focus of censorship in China was often to suppress information about protests and collective action.
HF: Russia’s Internet Research Agency appears to have used flooding type attacks to try to shape politics in the U.S. and elsewhere. Are such attacks likely to be effective in democracies?
MR: Flooding is more likely to be used in democracies because the use of fear and friction is constrained in many democracies by laws that protect freedom of speech. Flooding, on the other hand, takes advantage of the openness of speech in democracies to distract from, confuse, and overpower alternative viewpoints. Research on flooding in the U.S. indicates that it can be effective in influencing what people read and share; however, whether flooding has influenced elections or individuals’ decisions on whether or how to vote is still a matter of debate among researchers.
This article is one in a series supported by the MacArthur Foundation Research Network on Opening Governance that seeks to work collaboratively to increase our understanding of how to design more effective and legitimate democratic institutions using new technologies and new methods. Neither the MacArthur Foundation nor the network is responsible for the article’s specific content.