Facebook chief executive Mark Zuckerberg testifies on Capitol Hill in April. (Jacquelyn Martin/AP)

Nina Jankowicz researches Russian disinformation, technology and democracy as a Global Fellow at the Wilson Center’s Kennan Institute. She tweets @wiczipedia.

I am officially out of patience with Facebook and its “hard problems.”

Since the 2016 election, I’ve experienced several iterations of the company’s simpering smoke-and-mirrors show in the United States and Europe. Blue-and-white slide decks tout new transparency features. Company representatives earnestly address audience questions, folding their hands and nodding as if in a therapy session. “I hear your frustration,” they say. “These are hard problems, and we’re committed to solving them.”

It turns out — although I’m not sure anyone is surprised — that the events I’ve attended amount to one small glimpse of a much larger, darker lobbying operation meant to distract from Facebook’s failings and discredit its critics. Among the more shocking details in Wednesday’s New York Times report on the company’s efforts is that its executives not only cajoled lawmakers in traditional ways to stave off regulation but also hired a Republican opposition research firm to disseminate claims that anti-Facebook protesters were paid by George Soros. The company was contributing to the degradation of the information ecosystem while publicly vowing to fight it.

For years we’ve been watching Facebook fail on foreign and domestic disinformation and miss the mark on hate speech and privacy protection. Americans have installed new leadership in the House of Representatives, opening up possibilities for new legislation. All this, along with the latest revelations from the Times, means that it’s time for a serious discussion of social media regulation in the United States.

Despite Facebook’s attempts to change the narrative, the company inadvertently demonstrated its carelessness on several occasions in the weeks running up to the midterm elections. On Oct. 30, Vice News revealed that Facebook had approved ads purportedly paid for by 100 U.S. senators, Vice President Pence and even the Islamic State. It was a simple test of one of Facebook’s most lauded ad transparency features, and the ads should never have passed computerized, let alone human, quality control. Yet the episode was brushed off as a prank, an inconsequential loophole rather than a new means to disinform.

As September’s access token breach showed, the company is still not a responsible steward of personal data. At least 30 million users’ accounts were compromised, and Facebook has remained silent about the identities of the perpetrators and the motivations behind the attack. And critically, Russian influence operations ravaged the platform until Election Day, despite Facebook’s crazed game of Whack-a-Troll and a late-breaking tip from the FBI. Today, the most sophisticated foreign disinformation operations likely continue unhindered, surreptitiously building trust and audience share before the presidential election in 2020.

While Facebook attempts to convince us that it is atoning for its sins (and sometimes using dubious methods to do so), the company is consistently violating the principle of “do no harm.” Facebook’s actions have shown that it believes that harm is okay — inevitable, even — as long as profits are up and the company can afford shady smear campaigns to distract from its mistakes.

The new Congress can provide the pressure Facebook needs to act. Even in the minority, Democrats expanded the public’s knowledge of Russian operations by releasing the more than 3,000 online ads bought by Russia’s Internet Research Agency. In the majority, they should press Facebook for information on organic engagements with foreign content as well as for more transparency surrounding the foreign-linked fake profiles that have interacted with activists across the political spectrum. These actions can build public awareness of online influence campaigns and help to depoliticize an issue that has been treated by both parties as a political bludgeon rather than a threat to democracy.

Though it’s unlikely social media regulations originating in the House will pass the Senate, Democrats should not underestimate the power of their oversight privileges or the effects of well-intentioned bills, even those that languish in committee. The Honest Ads Act, stuck in just such a limbo, has created public demand for transparency and changed how social media companies disclose online political advertising. The new Congress can help to sharpen our definition of concepts such as hate speech and discuss how platforms should treat false information. Lawmakers can also encourage platforms to help users understand the lengthy terms of service that govern their use. More importantly, they can pressure the companies to actually enforce those terms, which they famously failed to do against Infowars.

Self-regulation has failed, and Facebook can no longer be trusted with it. We would not fly on an airline that lied about an appalling safety record, nor would the government allow it to operate. The new Congress presents an opportunity to have meaningful conversations about social media’s impact on our democracy and perhaps even change its course, “hard problems” be damned.
