In Thursday’s New York Times, Ed Finn says that we worry too much about rogue machines. He worries that science fiction narratives about “killer robots” are bad for us. As he puts it, “this is a dangerous way to think about A.I., because the stories we tell influence the decisions we make about how such systems should operate.”
But is this true? Do these kinds of science-fiction analogies and stories have consequences for how we think about serious policy issues? As it happens, we have new research on just this question, and it turns out the answers are more complicated.
Our policy debates already draw on science fiction tropes
We’re not the first to ask whether science fiction changes how citizens understand politics and the world. Daniel and Musgrave’s new research, reported in the Monkey Cage, explains how this could work. When we consume popular fiction, whether through movies, TV shows or novels, our brains absorb “synthetic experiences” that subsequently train us to think in slightly different ways. And some academic writing suggests that such understandings, when they enter public discourse and get deployed by political actors, can enable political action more broadly. But this “enabling effect” has not really been tested, so we conducted a survey experiment to see whether it in fact happens. In our new paper, published in International Studies Quarterly, we ask: “Does Science Fiction Influence Political Fact?” And if so, how? Here’s what we found.
To find out how science fiction might affect public argument, we looked at the same political dispute mentioned in Finn’s article: the policy debate over whether to ban autonomous weapons. It is true that political activists opposed to these weapons use images from popular culture to drive home their arguments: campaigners in favor of a treaty ban refer to the weapons as “killer robots,” aiming to attract public attention to a complex issue. The media has run with this trope as well, using images from the Terminator films in recent news articles about prospects for a U.N. treaty. But their opponents, who favor the development of such weapons, accuse them of using pop-cultural notions to “scaremonger” the public into an unreasonable fear of emerging technologies. From this perspective, which Finn seems to share, the public’s concerns about autonomous weapons are a product of dystopian pop culture rather than political reality.
Both sides in this debate assume they know how Americans react to the insertion of such pop-cultural themes into political discourse. But surprisingly, until now there has been little foreign policy research on whether political narratives in fiction have any effect on public opinion or actual foreign policy.
Pop culture has consequences, but it’s complicated
We found that the effects of killer robot imagery are more complicated than people might realize. Pop culture has consequences, but only when people are already steeped in science fiction. To figure this out, we asked a representative sample of 1,000 Americans which of a list of hit Hollywood movies and franchises they had seen, including several films and TV shows featuring “killer robots.” We asked half the group these questions before we asked them about their attitudes toward autonomous weapons, and the other half after.
This allowed us to test whether attitudes changed when autonomous weapons were referred to as “killer robots.” An earlier exploratory analysis found that people were no more likely to oppose autonomous weapons when the weapons were framed this way. We found the same thing, even when we applied statistical techniques to check whether some other factor was influencing public attitudes. This means it’s unlikely that campaigners’ framing of autonomous weapons as killer robots is responsible for the high levels of public opposition to these weapons. Second, we tested whether being “primed” with sci-fi (asked about the science fiction franchises before being asked about autonomous weapons) made a difference. Again, we found no significant effect.
Where we did find an important effect was in sci-fi consumption. People who reported having seen a lot of sci-fi films were more likely to oppose trends in autonomous weaponry. Their reaction was even stronger if they were “primed” by getting the sci-fi questions first in the survey. But this held only for the most avid science fiction fans in our sample, not for the population overall.
What this suggests is that pop culture may sometimes matter, but not in the simple way that foreign policy analysts might expect. People vary in how they interpret the messages in films and TV shows. If our results are right, killer robot fiction doesn’t have any significant effect on the general population. It does, however, have a strong effect among the subculture of science fiction fans. The fact that these fans are only a small subset of the American population may explain why so few respondents in our open-ended survey cite science fiction as a reason to fear the outsourcing of kill decisions to machines. Instead, respondents emphasize a variety of philosophical, normative and tactical fears, which are captured in the word cloud below:
This leads us to think that the widespread anti-killer-robot sentiment among Americans is driven more by legitimate fears about the weapons themselves than by activists’ scaremongering language. While references to the “Terminator” may be ubiquitous in popular culture, for these narratives to inform foreign policy attitudes one needs a deep immersion in science fiction, not just superficial exposure to killer robot memes.
More broadly, it may be that pop culture doesn’t have much of an impact on foreign policy attitudes at all. Movies about evil artificial intelligence are certainly fun to watch, and even to write about. But it isn’t clear that they have large-scale consequences for the way people think about technology. They may influence the ideas of fans like Finn (and us!), but they don’t have measurable effects on the rest of the population. And if the U.S. population is nervous about outsourcing kill decisions to machines, it might have to do with something other than killer robot movies.
Charli Carpenter is a professor of political science at the University of Massachusetts at Amherst.
Kevin L. Young is an associate professor of political science at the University of Massachusetts at Amherst.