This blogger paid Facebook to promote his page. He got 80,000 bogus Likes instead.

February 10

If you manage a Facebook page for a company, interest or organization, then you know there are two main reasons Facebook Likes are important. One is pretty obvious: The number of Likes you receive is a numerical way of representing your relative popularity. The other is that Likes help your messages travel farther.

But what if that system has stopped working like it's supposed to? That's what Derek Muller, who hosts the weekly science video channel Veritasium on YouTube, suspects. Muller believes that click farms staffed by workers in Egypt, India, the Philippines, Pakistan and a host of other countries are spamming pages with fake Likes — and that Facebook is indirectly benefiting from it.

Muller believes these workers roam the social network clicking every "Like" button they see, which dilutes the value of a Like and makes it harder for page administrators to engage with their real users. In effect, admins who pay Facebook to promote their pages may be handing the company money only to unwittingly damage their own brand.

Why would click farmers click so many "Like" buttons they weren't paid to click? Paying click farms directly to boost your follower count is banned, and if Facebook catches a click farm in operation, the farm's accounts are shut down. But Muller believes that some page administrators are nevertheless circumventing those rules, creating a market for legions of fake Facebook users that do little but click "Like" all day.

While generating Likes for their clients, the click farms cover their tracks by clicking the Like buttons on many unrelated pages, too. This, according to Muller, makes it harder for Facebook to detect attempts to game the system. And because legitimately behaving page administrators have paid for a prominent position in people's newsfeeds, they often get swept up in the wave of fake Likes along with whatever real Likes the promotion drew.

Here's a chart of Muller's findings, with each bubble standing for a country and the size of the bubble correlated to the number of Likes that country contributed to Muller's page. As he explains, a substantial number of his followers are from the same places where you'd find large click farms.


[Chart: Likes by country, with bubbles sized by Like count (Veritasium)]

Here's how Muller says he knows the Likes are fake: Rather than being distributed across the engagement chart, all the countries with click farms cluster at the bottom of the scale, indicating followers who aren't receptive to the page's messaging. Those countries include (from the bottom): Egypt, India, the Philippines, Pakistan, Bangladesh, Indonesia, Nepal and Sri Lanka, all places where click farms are common.

If Muller is right, that's a real problem for Facebook's customers. Muller spent $50 in Facebook advertising credit to promote one of his own pages. Although his follower count grew rapidly, he says he saw no corresponding uptick in engagement with his posts.

But it's actually worse than that: fake followers can make it harder to reach your real ones. When you post new content, Facebook first places it in the newsfeeds of a fraction of your followers to test how interesting it is. If the response rate is high, Facebook shows the post to more followers; if not, it doesn't. An army of fake followers who never respond drags your response rate down. And that, ironically, could force you to spend even more money on ads to reach users you could have reached organically if not for all those fake followers. This is good for Facebook's bottom line, but it's terrible for advertisers.
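To see why that matters, here is a minimal toy model of the mechanism described above, written in Python. The sampling fraction, engagement threshold and per-follower response rates are made-up illustrative numbers, not Facebook's actual parameters, and the function is purely hypothetical; it only sketches how unresponsive followers can drag a page below the bar for wider distribution.

# Toy model of the reach mechanism described in the article: a post is shown to a
# test sample of followers first, and promoted to the rest only if the sample's
# response rate clears a threshold. All numbers are illustrative assumptions.

def estimated_reach(real_followers, fake_followers,
                    sample_fraction=0.1, engagement_threshold=0.05,
                    real_engage_rate=0.10, fake_engage_rate=0.0):
    """Return how many followers a post reaches under this toy model."""
    total = real_followers + fake_followers
    sample = int(total * sample_fraction)  # initial test audience

    # The test sample mirrors the follower mix, so fake accounts dilute the rate.
    real_share = real_followers / total
    response_rate = (real_share * real_engage_rate
                     + (1 - real_share) * fake_engage_rate)

    if response_rate >= engagement_threshold:
        return total   # post is promoted to the whole audience
    return sample      # post stalls after the test audience

# 10,000 genuine fans: a 10% response rate clears the assumed 5% bar -> full reach.
print(estimated_reach(10_000, 0))        # 10000
# Add 80,000 unresponsive Likes: the rate falls to ~1.1% and the post stalls at the
# 9,000-person test sample, only about 1,000 of whom are real followers.
print(estimated_reach(10_000, 80_000))   # 9000

In this sketch, adding fake followers leaves the page reaching fewer real fans than it did before the promotion, which is the dynamic Muller says he observed.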

"I never bought fake Likes," Muller says in his video. "But the results are as if I had paid for fake Likes from a click farm."

Facebook didn't reply to a request for comment Monday.

Update: In a statement, a Facebook spokesperson said the company has worked hard to eliminate Like spam from the system:

Fake likes don’t help us. For the last two years, we have focused on proving that our ads drive business results and we have even updated our ads to focus more on driving business objectives. Those kinds of real-world results would not be possible with fake likes. In addition, we are continually improving the systems we have to monitor and remove fake likes from the system.

Just to be clear, he created a low quality Page about something a lot of people like – cats. He spent $10 and got 150 people who liked cats to like the Page. They may also like a lot of other Pages which does not mean that they are not real people – lots of real people like lots of things.

Brian Fung covers technology for The Washington Post, focusing on telecom, broadband and digital politics. Before joining the Post, he was the technology correspondent for National Journal and an associate editor at the Atlantic.