The company will be closely monitoring how group administrators and moderators handle posts during those two months, and could decide to shut a group down completely if it repeatedly allows posts that violate Facebook's rules. The change makes the volunteers who run groups more responsible for what happens inside them.
“We are temporarily requiring admins and moderators of some political and social groups in the U.S. to approve all posts, if their group has a number of Community Standards violations from members,” said Facebook spokesperson Leonard Lam. He said the company was taking the measure “in order to protect people during this unprecedented time.”
The new limitation follows other measures the company enacted this week to curb the viral spread of conspiracy theories and calls to violence over pending election results. Ahead of the election, Facebook announced it would temporarily stop political ads after polls closed, and it devised a label to use in case a presidential candidate prematurely claimed victory. It added that label to Trump posts earlier this week.
Since then, the company has been trying out new — temporary — tactics to keep up with a surge in disinformation and conspiracy theories. For example, Facebook said it would make it harder to find terms related to undermining the legitimacy of ballot counts, and reduce the distribution of election-related live videos.
Allies of President Trump have used Facebook pages and groups this week to spread a baseless conspiracy theory that Democrats are attempting to “steal” the election for Democratic nominee Joe Biden. Facebook took action Thursday, removing “STOP THE STEAL,” one of the largest groups pushing for in-person protests, which had 360,000 members. Facebook said it removed the group because of “worrying calls for violence” and attempts to delegitimize the election process.
Facebook had not publicly announced the stricter policies for groups, and it was unclear whether the company was deploying or testing similar measures that had not yet been made public.
Some of the first groups to be put on Facebook’s watch list were caught off guard. Admins of a popular public group for the city of Aberdeen, Wash., found out they would have to approve each new post, effective immediately, via a Facebook notification. It said all posts “now require approval until Jan. 4.” The group, which mostly discusses local events, businesses and issues, has more than 7,000 members and a policy against arguing about politics or inciting other members.
“At this point we are wondering if we might close the group,” said Deb Blecha, a local graphic designer who has been an admin on the group for 10 years. “The extra work and the frustration is weighing heavy on us.”
Facebook and other social media companies have long relied on unpaid group administrators to handle the bulk of moderation for posts in their groups. In Facebook’s case, the company uses a combination of artificial intelligence and professional content moderators to find extremely problematic content, but more nuanced decisions are left to volunteers. They police arguments, serve as a first line of defense against misinformation, and with this new measure, are held more directly responsible for the kinds of conversations they allow.