A meme posted by an alt-right account on Twitter. The circular symbol to the left is an “occult” icon, often used by neo-Nazis. (Twitter)

Regardless of who triumphs at the ballot box, the biggest winner of this presidential election may be the alt-right: a sprawling coalition of reactionary conservatives who have lobbied to make the United States more “traditional,” more “populist” and more white.

Once relegated to the political fringes, the alt-right has become a sudden, shocking force in mainstream politics, closely identified with the Donald Trump campaign. Trump’s campaign chief executive, Stephen Bannon, is a former executive chairman of Breitbart News, which he once described as “the platform of the alt-right.” Trump regularly retweets the memes and messages of the alt-right, which has propelled the movement into the limelight.

But lurking behind the offensive tweets and racially charged campaign rhetoric, there’s a more subtle — and far more dangerous — potential threat posed by the alt-right. As my colleagues and I found during a large-scale analysis of alt-right Twitter activity over the past nine months, the movement is growing measurably more radical, and possibly more inclined to violence.

The alternative right has come under fire from Hillary Clinton and establishment Republicans, but it has been seeping into American politics for years as a far-right option for conservatives. Here's what you need to know about the alt-right movement. (Jenny Starrs/The Washington Post)

The radicalization of the alt-right

There are, of course, many factions in the alt-right, some of them more radical than others. We observe two primary groups within the alt-right’s extended Twitter network: garden-variety racists, who complain about mixed-race couples, are proud of their Scots-Irish heritage, and use hashtags such as “#WhiteWomenAreMagic,” and violent extremists, who call for genocide against Jews, the killing of Muslims and African Americans, and even threaten to lynch President Obama.

Disturbingly, the social media activity of these users suggests that more and more are transitioning into that second, violent group.

Using machine-learning algorithms to interpret the language in Twitter profile descriptions, and computer vision algorithms to identify pro-Nazi symbols in profile avatars, my colleagues at New Knowledge and I identified more than 3,500 radical extremists among the larger network of 27,000 accounts that are associated with the alt-right.
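The article does not publish the models used. As a rough illustration only, a classifier over profile-description text might look like the sketch below; the training examples, labels and model choice are entirely hypothetical stand-ins:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy, hypothetical training data: profile descriptions labeled
# 1 (extremist) or 0 (not). The study's real data and labels are not public.
profiles = [
    "white pride worldwide 1488",
    "proud white nationalist",
    "dog lover, coffee addict, weekend hiker",
    "mom of three, baking and book club",
]
labels = [1, 1, 0, 0]

# TF-IDF features plus a linear classifier: a common baseline for
# classifying short texts such as Twitter bios.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(profiles, labels)

# Score an unseen profile description.
flagged = model.predict(["white nationalist and proud"])[0]
```

In practice a real pipeline would train on thousands of labeled profiles and combine the text signal with the image-based symbol detection the authors describe.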

Swastikas detected by our computer vision algorithm, discovered in the alt-right network on Twitter. (Morgan via Twitter)

Many hundreds of users display the swastika, while others choose alternative symbols associated with hate groups, such as the Celtic cross, the Iron cross and the insignia of the Nazi paramilitary group Schutzstaffel, also known as the SS. Many others explicitly declare their allegiance to neo-Nazi and white separatist movements in the text of their profiles by proclaiming “white pride,” or explicitly identifying themselves as “white nationalists.”

Almost everyone in the alt-right network is an enthusiastic and vocal supporter of Trump, though the core group of extremists is more likely to mention their race, white nationalism and national socialism than any presidential candidate.

Using recent advances in machine-assisted text analysis, we quantified this racist, xenophobic, anti-Semitic and violent perspective based on the context in which authors use relevant keywords. For example, in typical English, like a mainstream newspaper article, the word “Jewish” is statistically similar to words such as “Muslim” and “Christian,” meaning that mainstream authors usually rely on the word “Jewish” to describe someone or something religious.


On the other hand, in tweets by white extremists, the word “Jewish” is used in a totally different context, where it is statistically similar to words such as “communist,” “homosexual,” “anti-white” and “satanic.” White extremists are therefore more likely to use the word “Jewish” to signify something they hate, rather than as a religious description.


This is no surprise, but it provides an objective metric for understanding how the white extremist perspective diverges from the mainstream: Essentially, by analyzing the statistical use of the word “Jewish,” we can assign a given Twitter user a score that quantifies his ideological similarity to Twitter’s most violent, extreme alt-right users.
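This kind of “statistical similarity” is typically measured as cosine similarity between word vectors learned from each corpus. A minimal sketch, using tiny made-up vectors in place of real embeddings (the study’s actual model is not published):

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity: near 1.0 for similar directions, near 0 for unrelated."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical 3-d word vectors; real embeddings (e.g. word2vec, GloVe)
# have hundreds of dimensions and are trained on each corpus separately.
mainstream = {
    "jewish":    np.array([0.9, 0.1, 0.0]),
    "christian": np.array([0.8, 0.2, 0.1]),
    "communist": np.array([0.1, 0.9, 0.2]),
}
extremist = {
    "jewish":    np.array([0.2, 0.9, 0.1]),  # drifts toward the hate vocabulary
    "christian": np.array([0.8, 0.2, 0.1]),
    "communist": np.array([0.1, 0.9, 0.2]),
}

# In mainstream text, "jewish" sits near other religious terms; in the
# extremist corpus it sits near the hate vocabulary instead.
m_religious = cosine(mainstream["jewish"], mainstream["christian"])
m_hate      = cosine(mainstream["jewish"], mainstream["communist"])
e_religious = cosine(extremist["jewish"], extremist["christian"])
e_hate      = cosine(extremist["jewish"], extremist["communist"])
```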

When the radicalization score is applied to tweets from the broader alt-right network, it’s clear that the entire white nationalist community is embracing an increasingly extreme ideology. The social media content of the alt-right in July was 25 percent more radicalized than it was in January, and the rate of radicalization is increasing exponentially.
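The article does not spell out the score’s formula. One simplified stand-in, assuming word vectors are available, is the cosine similarity between a user’s average word vector and the centroid of known-extremist vocabulary:

```python
import numpy as np

def radicalization_score(user_vecs, extremist_vecs):
    """Cosine similarity between the user's average word vector and the
    centroid of extremist vocabulary. An illustrative stand-in, not the
    study's actual (unpublished) scoring method."""
    u = np.mean(user_vecs, axis=0)
    e = np.mean(extremist_vecs, axis=0)
    return float(np.dot(u, e) / (np.linalg.norm(u) * np.linalg.norm(e)))

# Hypothetical 2-d embeddings, for illustration only.
extremist_vocab = [np.array([1.0, 0.1]), np.array([0.9, 0.2])]
user_january = [np.array([0.1, 1.0])]   # far from the extremist vocabulary
user_july    = [np.array([0.7, 0.5])]   # drifting toward it

january_score = radicalization_score(user_january, extremist_vocab)
july_score = radicalization_score(user_july, extremist_vocab)
```

Tracking such a score month over month for every account in the network is what would make a trend like the reported 25 percent increase measurable.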


Of course, the alt-right is not a single group, but is composed of many sub-communities that have become radicalized at different rates and over different issues. Some communities, such as the “#BlueHand” movement, relentlessly and aggressively promote Islamophobia, whereas other communities rail against diversity, which they describe as “#WhiteGenocide.” Still others align themselves with neo-Nazis and engage in Holocaust denial — largely focused on a recent pro-Adolf Hitler documentary called “The Greatest Story Never Told” — while some instead choose white supremacist groups with roots in the United States, such as the Ku Klux Klan.

There is plenty of overlap between these communities, and almost everyone in the alt-right revels in bizarre conspiracy theories, such as the idea that President Obama founded the Islamic State — a theory recently made popular by Trump — or that Black Lives Matter activists are terrorists.

Looking more closely at one of these communities in particular, it’s possible to see the journey from casual racism to the more extreme ideology typically associated with violence.

This community of 5,225 users is tightly clustered inside the larger network, indicating a high degree of communication between its members. The tweets published by members of this community have become 63 percent more radicalized over the past nine months.

A map of alt-right accounts on Twitter. (Morgan)

In January, the word “Jewish” hardly appears. When it does, the context reveals an undercurrent of casual, but not aggressive, racism.

By July, the tone has changed: The word “Jewish” appears in tweets from hundreds of accounts, and its usage implies a belief in large-scale conspiracy, racial antagonism and even explicit support for Hitler.

It’s never ‘just Twitter’

There’s a tendency, on both the right and the left, to dismiss these sorts of tweets as idle chatter or “trolling.” Writing in Breitbart in March, for instance, the alt-right icons Allum Bokhari and Milo Yiannopoulos described the movement’s most toxic messaging as “satire” and “mischief” aimed at generating outrage.

Recent experience shows us, however, that this interpretation could not be further from the truth. Individuals ideologically aligned with extremist white nationalists are responsible for repeated incidents of violence online and offline — including the high-profile hacking of comedian Leslie Jones’s website, the killing of a Lebanese man in Tulsa, the stabbing of a mixed-race couple in Olympia, Wash., and most horrifically, the mass shooting at a black church in Charleston, S.C. In fact, until the nightclub shootings in Orlando, white extremists had committed more attacks and killed more Americans than jihadist extremists since 9/11.

Incidentally, jihadist extremists provide a telling model for exactly how online “chatter” can turn into physical violence. That process, in which seemingly normal people become intoxicated with extremist ideology, is often referred to as the “path to radicalization,” and it is characterized by common vulnerabilities: Potential extremists feel ostracized from society, believe themselves to be victimized and are attracted to violence. Islamic State recruiters and propagandists exploit these vulnerabilities with narratives of strength and warmth, simultaneously empathizing with the alienated and disaffected while also promising power and belonging through righteous violence against oppressors.

Although the similarities are not immediately obvious, white, working-class communities also have become ostracized, disempowered and angry in the United States — making them vulnerable to radicalization. Described eloquently by author J.D. Vance in his lauded new book “Hillbilly Elegy,” these communities are at the center of a growing social and cultural crisis. They’ve been rocked by a dramatic uptick in divorce, rampant drug overdoses, rising rural death rates and a suicide epidemic. All this against a backdrop of increasing political irrelevance resulting from rural population decline and outright contempt from the wealthy.

As Steve Howard, the Imperial Wizard for the Mississippi KKK, told VICE in 2014: “In some ways we can relate to Islamic extremists, just like we are Christian extremists, because they’re fighting a holy war and so are we.”

How to de-radicalize a Twitter radical

By understanding these alt-right communities on Twitter, it may be possible to slow their march toward radicalization. For example, that community of 5,225 users — the one that has become 63 percent more radicalized since January — has adopted anti-Semitic rhetoric, but hasn’t adopted the language of extremist communities that openly advocate for violence. Targeted interventions with influential members of this group could be a promising model for reducing the overall amount of radicalization online.

Increasingly, experts agree, the more effective way to do this is to cultivate “counter narratives” that try to undermine the promises of radical ideologies – for instance, dispelling the myth of a utopian caliphate in Syria, or a white nationalist state in the United States. These, in turn, are most effective when they’re delivered by an “authentic voice” — someone who is already respected by the target extremist community. One promising recent effort, a partnership between Facebook, Twitter, Google, the Institute for Strategic Dialogue and the nonprofit ExitUSA, persuaded at least eight people to leave the white supremacy movement after viewing videos from former supremacists who now reject extremism.

The 10 most influential accounts in one radicalized alt-right community. (Morgan)

Within Twitter’s radicalizing alt-right community, we identify those “authentic voices” by scouring the network for nodes, or users, who are unusually influential. (The 10 most influential have been colored red in the diagram.) From there, we can identify important users who are less radicalized than is typical across their network.
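Finding “unusually influential” nodes is a standard centrality problem. The article does not name the measure the study used; PageRank over a retweet graph, computed here with networkx on an entirely hypothetical graph, is one plausible sketch:

```python
import networkx as nx

# Hypothetical retweet graph: an edge (a, b) means user a retweeted
# user b, so influence flows toward b.
edges = [
    ("u1", "hub"), ("u2", "hub"), ("u3", "hub"), ("u4", "hub"),
    ("u2", "u3"), ("u5", "u1"),
]
G = nx.DiGraph(edges)

# PageRank rewards accounts retweeted by other well-retweeted accounts,
# not just accounts with high raw retweet counts.
scores = nx.pagerank(G)
most_influential = max(scores, key=scores.get)
```

Cross-referencing each node’s centrality with its radicalization score is what would surface influential-but-less-radicalized users as candidates for outreach.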

User Starry Knight, for example, is socially conservative, religious, a staunch Trump supporter and a veteran. He enthusiastically highlights news articles covering crimes committed by illegal immigrants, and retweets content containing the #AltRight hashtag along with content accusing Democratic presidential nominee Hillary Clinton of “race baiting.” That said, while the “Knight” in his username may be a nod to the “knights” of the Ku Klux Klan, this user does not openly advocate for white supremacy, nor does he call for violence against minority groups.

(Morgan via Twitter)

Similarly, user “Locked&Loaded” is a staunch gun rights advocate, posting frequently about the Second Amendment, U.S. armed forces and law enforcement. She appears to be in a relationship with another vocal gun rights advocate who goes by the name “I M Lethal.” Nevertheless, like Starry Knight, she does not advocate violence and does not engage in hate speech. Her relentless support of an issue important to social conservatives may give her authenticity in the eyes of her community.

(Morgan via Twitter)

Of course, building these relationships is not easy, and to many it may be unpalatable. But the alternative to engagement is less palatable still: We may find ourselves watching idly as the alt-right slides further into violent extremism.

A version of this essay originally appeared on Medium.
