Assistant editor and Opinions contributor

(Sergio Peçanha/The Washington Post)

There’s a bright side to the Russian government’s attempt to interfere in our elections: Because many Americans are now highly concerned about how misinformation on social media can impact our democracy, scientists are calling for a new research agenda to fully understand the threat.

But it isn’t only deliberate human wrongdoing that needs our attention; the structure of social media itself may be part of the problem. As researchers dive deeper into the caverns of our digital world, they should also evaluate whether these sites are exacerbating bad human behavior that is causing the breakdown of our political system.

As an illustration of the potential threat, consider a new paper published in the journal Nature last week. Authors of the paper recruited more than 2,500 people to take part in online games to see how social networks can affect their decision-making.

Participants were broken into groups of 24 people, which were then split into yellow and purple teams. In each game, the players were given four minutes to vote on which team would “win.”

[Graphic: Participants were split into purple and yellow teams. Each square represents one study participant.]

If your team wins a supermajority of the vote (60 percent), you and your teammates each get a $2 reward. If the opposing team gets 60 percent of the vote, you and your teammates get 50 cents. But if neither team gets 60 percent, nobody gets anything.
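The incentive structure just described can be restated as a short Python sketch (a toy model of the rules as the article states them, not the study's actual code; the function name and inputs are illustrative):

```python
def payoff(own_share: float) -> float:
    """Per-player reward under the game's rules described above:
    $2.00 if your team reaches the 60 percent supermajority,
    $0.50 if the opposing team does, and nothing on deadlock."""
    if own_share >= 0.60:
        return 2.00
    if 1.0 - own_share >= 0.60:
        return 0.50
    return 0.00

# With 24 voters, 15 votes is the smallest winning share (15/24 = 62.5%).
print(payoff(15 / 24))  # 2.0 for each player on the winning team
print(payoff(9 / 24))   # 0.5 for each player on the losing team
print(payoff(12 / 24))  # 0.0: a 12-12 split is deadlock
```

Note that a narrow loss still pays better than a stalemate, which is exactly what gives players a reason to compromise.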

In other words, the players have an incentive to “compromise” so that one team wins. But there’s a catch: Participants can only communicate with five other players at a time. You might, for example, be able to see how two of your teammates and three of your opponents intend to vote. Or you might see how four of your teammates and none of your opponents will vote.

The result is what the authors of the paper call “information gerrymandering,” because depending on who players communicate with, they may have a different impression of what’s likely to happen. And they will make very different decisions as a result.

When the players are sorted so that each person on both teams communicates with an equal number of their teammates and opponents, they will likely understand that the teams are evenly matched. They’ll vote strategically, making deadlock less likely.
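The “local impression” driving this behavior can be sketched in a few lines (an illustrative model, not the paper's code; the function and example neighbors are hypothetical):

```python
def perceived_team_share(my_team: str, visible: list[str]) -> float:
    """Fraction of a player's five visible neighbors who share their
    team -- the only information about the race the player has."""
    return sum(1 for v in visible if v == my_team) / len(visible)

# A balanced assignment: a purple player sees two purple and three
# yellow neighbors. Counting themselves, that is three votes apiece,
# so the race looks even and strategic voting makes sense.
print(perceived_team_share("purple",
                           ["purple", "purple", "yellow", "yellow", "yellow"]))  # 0.4
```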

[Graphic: A test without gerrymandering. People in each subgroup could communicate across team lines. Outcome: both teams negotiated and won equally often, and games were less likely to end in deadlock.]

You could design a game, however, that skews the network in favor of one party. This takes advantage of the same techniques as gerrymandering in our political maps: clustering half of one team’s members together so that they only communicate with themselves, and then breaking up the other half so that they mostly communicate with people from the other team. Because the disadvantaged players will feel outnumbered, they’ll likely switch their vote to avoid deadlock. In some cases, such “gerrymandering” resulted in the favored team winning almost 70 percent of the vote.
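The clustering described above can be illustrated with a toy layout (a hypothetical 12-vs.-12 arrangement in the spirit of the paper's figures, not its exact network topology):

```python
def local_yellow_share(groups: list[list[str]]) -> list[float]:
    """For each subgroup, the fraction of members who are yellow --
    roughly what a member of that subgroup 'sees'."""
    return [g.count("yellow") / len(g) for g in groups]

# Twelve yellow and twelve purple players, but arranged so that half
# the purple team is packed into one isolated subgroup while the rest
# are scattered among yellow-dominated subgroups.
gerrymandered = [
    ["yellow"] * 4 + ["purple"] * 2,  # purple feels outnumbered here
    ["yellow"] * 4 + ["purple"] * 2,
    ["yellow"] * 4 + ["purple"] * 2,
    ["purple"] * 6,                   # packed: these votes are "wasted"
]
print(local_yellow_share(gerrymandered))  # roughly [0.67, 0.67, 0.67, 0.0]
```

In the three mixed subgroups, purple players see themselves outnumbered two to four and are tempted to switch to avoid deadlock; the six packed purple players see no opponents at all, so their firm votes win the team nothing extra.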

[Graphic: A highly gerrymandered test. In several subgroups, yellow players dominated, while the remaining purple players could communicate only among themselves. Outcome: the yellow team had the advantage, and a deadlock was less likely.]

A third design skews the network, but equally to both parties. In this case, almost all the players are communicating with people on the same team, giving them little reason to vote for their opponents. As a result, they almost always end up deadlocking.
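Under a simple vote-switching rule (an assumption for illustration: a player defects only when opponents form a majority of their visible neighbors), the symmetric design predicts exactly this stalemate:

```python
def feels_outnumbered(my_team: str, visible: list[str]) -> bool:
    """Toy switching rule: defect only if opponents make up a
    majority of the neighbors a player can see."""
    opponents = sum(1 for v in visible if v != my_team)
    return opponents > len(visible) / 2

# In the symmetric design, nearly every player sees only teammates,
# so no one feels pressure to compromise and both sides hold firm.
print(feels_outnumbered("purple", ["purple"] * 5))  # False: no switch, so deadlock
```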

[Graphic: A symmetrically gerrymandered test. Outcome: neither team has an advantage, and a deadlock is highly likely.]

As the theory goes, something similar might be happening on social media, but on a much larger, and more consequential, scale: Those who are predominantly exposed to people who agree with them dig in to their positions. And when both sides are siloed, the likelihood of compromise in our political system plummets.

Of course, outcomes in the real world aren’t as straightforward as in the simulations; countless factors go into the decisions we make in our elections. But there is evidence that our information networks online are skewed in a similar way to the simulations.

The authors of the Nature study pulled data on online information networks and found that conservative blogs were more likely to link to other conservative blogs than their liberal peers were. Conservative Twitter users were also more likely to retweet or mention other conservatives. And, unsurprisingly, left-leaning news articles were more likely to link to other left-leaning sites than right-leaning articles were.

We still don’t have enough data to definitively understand the effect that such information “gerrymandering” has on our political discourse. But it’s not difficult to imagine that it could be accelerating political polarization.

“It could have a huge impact on our behavior,” said Alexander Stewart, assistant professor at the University of Houston and co-author of the study. “And we could measure it — if Facebook gave us the data to do it.”

None of this is to say that we should spend any less time studying the effects of misinformation. Indeed, misinformation campaigns could heighten the effect of information gerrymandering, widening what the authors of the paper called an “influence gap.” They attempted to replicate this in some of the games by introducing “propaganda bots,” which would only vote for their own team. When both teams had such bots on their side, the result was total deadlock.

And what happens in the real world? We don’t know, but we should demand answers; at the very least, social media companies should let scientists pursue those answers independently. The health of our democracy itself could be at stake.

Additional work by Sergio Peçanha.