Misc, Politics, Social Media

The Good, the Bad, and… The Bad Wins! Bad polling travels fast and wide


By our count, 34 polls were published in the past week that were conducted by reputable pollsters like Pew and Gallup. We’ll have more to say soon about how we classify a pollster as “reputable,” but for now, trust us!

We’re starting a project to track how reputable (and disreputable) survey results are shared and discussed on social media. Are good polls ignored?  Are “bad” polls widely circulated in echo-chambers of the like-minded?  Here’s our first cut at what can happen.

We start by identifying a “best practices” survey conducted by Pew, released on March 15th, showing that the public remained confident in Robert Mueller’s investigation: 61% of Americans said they were “somewhat” or “very” confident that the investigation into Russian meddling was being conducted fairly. This survey, which is methodologically sound, transparent, and straightforward, was barely discussed online. It was referenced by just three news outlets and scarcely mentioned on Twitter; at the time we published our findings, we found just fifty-three mentions of that polling result.

Conversely, at around 10:00 PM on March 17th, the Drudge Report tweeted a link to a non-probability poll asking whether Mueller should be fired. At the time of this post, 76% of respondents said President Trump should fire Mueller; a mere 24% said he should not. Amazingly, over 702,300 responses had been recorded. Further, since the poll went live, it had been tweeted about a whopping 17,218 times. In the same timeframe, 22,805 tweets in total contained the terms “Mueller” and “poll” together.

To the untrained eye, Drudge’s poll may seem more representative of the public because of the high volume of responses. It is not. First, the Drudge poll is an “opt-in” survey: only those who visit the website can answer the question. Because respondents were not chosen by criteria that allow us to know how well they represent the larger population, we simply can’t know if they do. Worse, we can expect response bias to skew the results, because the average visitor to Drudge’s website is politically conservative, according to research conducted by Pew.
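The effect of this kind of self-selection is easy to demonstrate with a toy simulation. The numbers below are invented for illustration (they are not Pew’s or Drudge’s data): we assume a population where 40% hold some opinion, but one subgroup holds it at a much higher rate and is far more likely to answer an opt-in poll. No matter how many responses the opt-in poll collects, its estimate stays badly biased, while a modest probability sample lands near the truth.

```python
import random

random.seed(42)

# Hypothetical population (all rates are assumptions for illustration):
# 30% belong to subgroup A, who hold opinion X at 80%; everyone else
# holds it at ~22.8%, so the overall true rate is about 40%.
N = 100_000
population = []
for _ in range(N):
    in_group_a = random.random() < 0.30
    holds_x = random.random() < (0.80 if in_group_a else 0.228)
    population.append((in_group_a, holds_x))

true_rate = sum(h for _, h in population) / N

# Probability sample: every member equally likely to respond.
prob_sample = random.sample(population, 1000)
prob_est = sum(h for _, h in prob_sample) / len(prob_sample)

# Opt-in "poll": subgroup A is (say) 10x more likely to participate,
# mimicking a website whose audience skews heavily one way.
weights = [10 if a else 1 for a, _ in population]
optin_sample = random.choices(population, weights=weights, k=1000)
optin_est = sum(h for _, h in optin_sample) / len(optin_sample)

print(f"true rate:        {true_rate:.2f}")
print(f"probability poll: {prob_est:.2f}")
print(f"opt-in poll:      {optin_est:.2f}")
```

Even with ten times as many opt-in responses, the bias would not shrink: more volume just means a more precise estimate of the wrong population.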

Despite these two glaring red flags, The Hill published an article about the results that ignored both flaws. Nor did The Hill mention that the poll, embedded on the Drudge Report’s home page, allowed users, in theory, to record their responses multiple times (by clearing cookies, opening a different browser, or simply waiting a few hours).

Given the flaws of an opt-in “poll,” it is unsurprising that many tweets about it came from “#MAGA” and “#Resist” accounts encouraging their followers to “flip” the poll in their favor. The Hill’s reporting on it was therefore irresponsible.

As we build on our research into how surveys are discussed on Twitter, we hope to develop a “naming and shaming” tool to discourage outlets like The Hill from confusing the public about which surveys are reputable and which are not.
