The head of Facebook's News Feed defended the company's two-question survey aimed at cutting down on the spread of fake news in a rolling series of tweets this week after critics argued that the survey could be manipulated and would fail to accurately gauge the quality of news outlets.
According to Facebook's Adam Mosseri, Facebook will sample a new set of users each day and use only those responses in the company's evaluations of trustworthiness. It's unclear how many users will be surveyed and what other information will be pulled into the assessment.
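The daily-sampling safeguard Mosseri described can be illustrated with a small sketch. Everything below is a hypothetical illustration, not Facebook's actual system: the user-base size, sample size, and scoring are invented assumptions, and the point is only that a fresh random subset each day barely overlaps with the previous day's, which is what makes coordinated manipulation expensive.

```python
import random

# Illustrative sketch only: all parameters are assumptions,
# not Facebook's real sampling design.

def daily_sample(user_ids, sample_size, seed):
    """Draw a fresh random subset of users for one day's survey."""
    rng = random.Random(seed)
    return rng.sample(user_ids, sample_size)

def trust_score(ratings):
    """Average the 0-to-1 trust ratings from sampled respondents only."""
    return sum(ratings) / len(ratings)

users = list(range(1_000_000))            # hypothetical user base
day_1 = set(daily_sample(users, 1000, seed=1))
day_2 = set(daily_sample(users, 1000, seed=2))

# With a fresh random sample each day, the expected overlap between
# two days' respondents is tiny (about 1 user here), so a group of
# bad actors would rarely be surveyed at all.
print(len(day_1 & day_2))
```

Under these assumed numbers, only about one user in a thousand-person sample would be drawn on two consecutive days, which is the intuition behind Eckles's point later in the piece that survey samples are hard to game.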
“I'm sure some bad actors will try to game the system, but it's not as easy as you suggest,” Mosseri responded to one Twitter user who criticized the survey.
The Facebook survey, which is part of the crowdsourcing initiative that Facebook chief executive Mark Zuckerberg unveiled last week, consists of two short questions with a choice of responses, according to a report by BuzzFeed News:
“Do you recognize the following websites?”
“How much do you trust each of these domains?” (The listed response choices include “Not at all.”)
Facebook has not disclosed how or where those questions would appear to users. The company declined to comment beyond Mosseri's tweets.
Some people were startled by the simplicity of the survey. They also noted the potential for it to reward partisan news outlets with loyal audiences or punish niche outlets and start-ups. But Mosseri tweeted that the survey was designed to recognize news outlets with broad recognition and trust among their users.
More than two-thirds of Americans get some of their news from social media, according to the Pew Research Center. That shift has empowered companies such as Facebook and Google, but it has also placed them in the challenging position of trying to decide what news they should distribute to global audiences.
Rasmus Nielsen, the director of research at the Reuters Institute for the Study of Journalism at the University of Oxford, said that the criticism directed at the survey may reflect a degree of elitism, as research shows surveys can produce meaningful signals of trust. But that doesn't mean Facebook's approach will be successful, he added. “Everything hinges on how this is incorporated with other signals and other sources of data,” he said.
Nielsen also expressed reservations about Facebook's wording in the survey questions. He said the company should consider additional responses that allow users to express no opinion or simply, “I don't know.”
Dean Eckles, a professor at MIT's Sloan School of Management and former data scientist at Facebook, said that the company would likely base its evaluations on a combination of the survey responses, behavioral data and other aggregate information. He also noted that Facebook's move to survey only a subset of users, rather than everyone, is a safeguard against attempts to undermine the integrity of the results.
“Survey samples are hard to game in a lot of ways because you'd need a lot of dedicated people in order to game a survey,” Eckles said.
Even as Mosseri offered some insight into Facebook's thinking, some experts questioned the piecemeal release of information and the opacity of the company's potential changes to how users will consume news media online.
“There is a lack of transparency around how they are rolling out this process, and what forms of data they are drawing on,” said Daniel Kreiss, a professor at the School of Media and Journalism at the University of North Carolina at Chapel Hill. “It seems like a weird way to do it.”