On Thursday, the Wall Street Journal reported that Michael Cohen, President Trump’s former personal attorney, tried to rig online “polls” in favor of Trump. The story is just the latest nefarious revelation to emerge from Cohen’s downfall. But the scandal here is less that Cohen tried to manipulate online popularity contests, and more that such surveys get away with masquerading as polls in the first place.

According to the Journal, Cohen paid RedFinch Solutions owner John Gauger to write “a computer script to repeatedly vote for him” in a CNBC “poll” aimed at naming top U.S. business leaders. Cohen later did the same thing for a Drudge Report “poll” gauging the strength of 2016 Republican presidential candidates.

I put the word “polls” in scare quotes here because what the article describes isn’t really a poll. Real pollsters try to come up with results that are representative of some interesting group of people, whether it’s registered voters, all U.S. adults or likely GOP primary voters. There are a lot of different ways to get that sort of representative sample: Some pollsters randomly call people on the phone, while others recruit respondents from a large Internet panel and often weight their data so that people who are harder to poll still get sufficient representation. The process of getting that mix right typically doesn’t include allowing Cohen and friends to join the sample, vote repeatedly and throw off the results.
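
To make the weighting idea concrete, here is a minimal sketch in Python. The age groups, population shares and approval numbers are all made up for illustration, and real pollsters weight on many variables at once, but the mechanics are the same: respondents from underrepresented groups count for more.

```python
# A minimal sketch of survey weighting, using made-up numbers.
# Suppose 18-to-29-year-olds are 20 percent of the adult population but only
# 10 percent of our respondents; weighting corrects for that gap.

population_share = {"18-29": 0.20, "30-49": 0.35, "50-64": 0.25, "65+": 0.20}
sample_share     = {"18-29": 0.10, "30-49": 0.30, "50-64": 0.30, "65+": 0.30}

# Each respondent's weight is the ratio of their group's share of the
# population to its share of the sample, so underrepresented groups count more.
weights = {group: population_share[group] / sample_share[group]
           for group in population_share}

# Hypothetical approval of some question by age group among respondents.
approval_by_group = {"18-29": 0.60, "30-49": 0.50, "50-64": 0.45, "65+": 0.40}

unweighted = sum(sample_share[g] * approval_by_group[g] for g in sample_share)
weighted   = sum(sample_share[g] * weights[g] * approval_by_group[g]
                 for g in sample_share)

print(f"Unweighted estimate: {unweighted:.1%}")  # skewed toward older respondents
print(f"Weighted estimate:   {weighted:.1%}")    # matches the population mix
```

Run as written, the unweighted figure is dragged toward the older respondents who answered in disproportionate numbers, while the weighted figure is what you would get if the sample mirrored the population.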

And as Ariel Edwards-Levy points out, the people who read websites such as CNBC, the Drudge Report or frankly most websites (and the subset of readers who respond to these surveys) are probably different from the general public in important ways. If foxnews.com and msnbc.com let any reader vote on whether Trump should be reelected and just tallied up the results, they would likely come up with wildly different numbers. Moreover, I would guess that neither number would match the number produced by a real pollster who looked at, say, 1,000 registered voters. The same thing goes for CNBC and Drudge.

For the same reason, the sheer number of responses (Trump got almost 24,000 votes in the Drudge poll) doesn’t make these “polls” any more accurate. If you asked a randomly chosen, representative group of 1,000 U.S. adults to name their favorite football team, then asked 30,000 people from Green Bay, Wis., the same question, well, you can guess which survey would come closer to the national picture.
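
Here is a toy simulation of that intuition, again in Python with invented percentages: assume 10 percent of U.S. adults would pick the Packers but 90 percent of Green Bay respondents would.

```python
# A toy simulation of the Green Bay example, with invented numbers: a huge
# self-selected sample can still miss the national figure badly.

import random

random.seed(0)
TRUE_NATIONAL_RATE = 0.10   # hypothetical share of U.S. adults who pick the Packers
GREEN_BAY_RATE = 0.90       # hypothetical share of Green Bay respondents who do

national_sample = [random.random() < TRUE_NATIONAL_RATE for _ in range(1_000)]
green_bay_sample = [random.random() < GREEN_BAY_RATE for _ in range(30_000)]

print(f"1,000 random U.S. adults:   {sum(national_sample) / len(national_sample):.1%}")
print(f"30,000 Green Bay residents: {sum(green_bay_sample) / len(green_bay_sample):.1%}")
# The bigger sample is more precise, but it is precise about the wrong population.
```

The 30,000-person sample produces a very tight estimate, but it is a tight estimate of Green Bay, not of the country; no amount of extra volume fixes a sample drawn from the wrong population.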

Put simply, these “polls” aren’t the same thing as the horse-race or presidential approval polls that are tracked on RealClearPolitics or FiveThirtyEight. I haven’t seen any evidence that CNBC or Drudge took any steps to get representative samples. It looks more like the “polls” were reader surveys where anyone could vote. Knowing that CNBC readers who opt into a survey about presidential candidates are inclined to vote a certain way might tell you something about a segment of CNBC’s audience, but it doesn’t reveal much about the general electorate, and it certainly isn’t a useful way to predict how an election will turn out.

There’s nothing inherently wrong with doing a reader survey — I’m sure it’s a fun thing to do and that people click on it. But you shouldn’t think that these surveys reflect what the country actually thinks.

Fortunately, it’s easy not to get duped by fake poll results. If you see a poll result that looks weird, first check whether any methodological details are disclosed and whether they make sense. If the site is aboveboard about the fact that it’s doing a just-for-fun poll of its readers, you can relax and not take the results seriously. If you’re looking at a horse-race poll, a presidential approval survey or another commonly asked question, check whether the poll appears on FiveThirtyEight, RealClearPolitics or HuffPost Pollster. Each site has different standards and methodologies, but all three are dedicated to filtering out truly fake polls. If the survey was conducted by a well-known pollster with a strong track record, that’s a great sign. And other outlets have published good guidelines on this, too.

If you go through some combination of these steps, you have a pretty good chance of spotting this sort of “poll” and not letting bad numbers get the best of you. The real scandal here is that Cohen could have saved some money if he had done the same due diligence.
