Since their inception, video games have served as a source of moral panic and a convenient scapegoat for acts of spectacular violence. But placing the blame on video games only allows us to avoid reckoning with the deeper roots of violence and grappling with a broken culture.
Video games had their earliest successful commercial releases in the early 1970s, a cultural moment abounding with angst about changing economic realities, shifts in gender roles and a sense of moral and societal breakdown. Broader economic and moral anxieties became linked with the new medium of the video game as adults watched children dumping money into unfamiliar machines and saw a potential explanation for supposed social ills. Violent games seemed especially pernicious.
Beginning with the 1976 release of the chase-and-crash game “Death Race,” moral guardians ranging from PTA presidents to politicians have sounded the alarm about video games. In “Death Race,” players drive pixelated on-screen cars over stick figures representing either monsters or pedestrians, depending on the account you believe. The then-new interactive nature of the video game medium made this gruesome gameplay seem particularly insidious. Journalists, safety experts and parents worried that the game would inspire real-world vehicular violence.
No such wave of car crimes materialized — yet its perceived threat did drive sales for the upstart arcade company. Exidy, the producer of the game, ultimately did so well with “Death Race” that it later courted controversy as a purposeful sales strategy, including with “Chiller,” a 1986 light-gun game that invited the player to perform graphic torture.
Thanks to the controversy and moral panic provoked by such games, local governments attempted to regulate video games and access to them throughout the medium’s golden age in the 1970s and 1980s. Communities imposed licensing schemes, age limitations and outright bans, some of which lasted into the 21st century. In 1982, the Supreme Court even ruled on efforts by Mesquite, Tex., to curtail the spread of arcades. Through a convoluted series of decisions, the city ultimately lost to the plaintiff, arcade operator Aladdin’s Castle, which allowed children to enter its arcades without adults. In that case, as in the much later Brown v. Entertainment Merchants Association in 2011, courts ruled against age-based restrictions on access as an arbitrary violation of due process and free speech principles.
Further efforts to constrain video games followed. As graphics improved, games like “Night Trap” and “Mortal Kombat” prompted 1993 congressional hearings on video games. The hearings precipitated the establishment of what we now know as the Entertainment Software Rating Board, which rates video games. With an industry invested in technological innovation and ever more photorealistic graphics, video games provided continuous cause for fresh concern: as new media technologies were implemented, everyone from parents to legislators worried about the realistic nature of the violence depicted in them.
These concerns have intensified when real-life violence has resembled virtual violence, and politicians have often used video games as a convenient scapegoat in these moments. The Columbine High School massacre in 1999 ushered in an era of stunning, deadly mass shootings. In its wake, pundits pointed to the perpetrators’ affinity for games like “Doom” and “Quake,” overlooking other factors like the shooters’ admiration for Nazis.
Sometimes shooters themselves blame games, as in a 2003 incident in which teenage stepbrothers shot rifles at passing cars; the boys said they got the idea from “Grand Theft Auto III.” But in cases like this, where games appear to have inspired some individuals to behave badly, the games tended to supply ideas to people already looking to lash out rather than to compel otherwise unlikely action.
And yet, this reality is often lost in the wake of shootings. With alarming frequency, media accounts detail how a shooter liked this or that violent video game. The shooter played “Doom.” The shooter played “Quake.” The shooter played “Call of Duty” or “World of Warcraft” or “Candy Crush.” But these accounts mistake correlation for causation. Instead of fretting about video games provoking mass violence, we should recognize that such overlap simply reflects reality in 2019, when the majority of people in the U.S. play video games, which are now as accessible as an app on a phone.
Yet, while video games are equally popular in other places, the U.S. stands alone in the frequency and severity of mass shootings. In other countries with both more media regulation and more gun regulation, politicians rarely blame games for acts of violence. But in the United States, this is a common practice.
And it’s not merely misguided; it also obscures a deeper understanding of what causes these all too frequent tragedies. Video games are but one part of a larger culture in which men — and it is, disproportionately, young men — become mass shooters.
The shooters themselves often leave behind evidence of what they did and why — and we ignore it at our peril. In El Paso, the alleged shooter has been tied to a white supremacist missive that rails against a Hispanic “invasion”; in Charleston, the shooter wanted to start an anti-black race war; in Isla Vista, the shooter targeted women in a misogynistic rampage; in Pittsburgh, the alleged shooter chose a synagogue because of anti-Semitism and, in particular, his white supremacist anger that synagogue members were supporting immigrants and refugees, authorities say. It takes significant contortions to ignore the stated motivations scrawled in a flood of manifestos and declare instead that video games are to blame.
People make this claim because they want a manageable solution to a complex problem and because they are unwilling to consider broader culpability that would implicate not one set of products, but our culture as a whole. Such assertions enable politicians to ignore the culpability of lax gun regulations, white supremacy, misogyny, anti-Semitism and other bigotries rooted deep in contemporary society.
Of course, video games are part of our national culture. Like other cultural products, they can shape and reflect the worst of us: misogyny and racism, bigotry and violence. Games can reinforce and amplify dehumanizing rhetoric that tells us that women are things and that immigrants are monsters or leeches, along with other horrifying tropes that we should rightly recognize as forms of violence in and of themselves.
Games conveying these twisted messages shouldn’t surprise us, either. After all, GamerGate, a 2014 backlash against criticism of sexism and racism in gaming culture, highlighted serious problems that have yet to be adequately addressed in video game culture. In many depressing ways, GamerGate wasn’t really about video games — it anticipated many of the cultural tensions of the Trump era, online and off — but it did show the degree to which cultural products, including games, can serve as incubators for radicalization.
But blaming video games for mass shootings divorces them from a larger, flawed cultural landscape and suggests that the problem is a single rotting fruit rather than the unchecked growth of a poisoned tree.
To address the poisoned tree, we must go beyond simply refuting Trump and others who seek to cast blame for mass violence on video games. It’s easy to point out that plenty of people play games and don’t commit violence, or that many countries around the world have intensive gaming cultures and no mass shootings. But we must also examine and fix the underlying culture that shapes both games and the narrower video gaming culture. Until we grapple with the deep causes of our social ills, such as racism, misogyny and other forms of bigotry, these tragedies are doomed to repeat.