When is a Poll Really a Poll?


Reader PW emailed yesterday with a question about a poll that appeared on Tuesday in the blogosphere.  "Is there anything you can quickly glean from this that would suggest that it is obviously phony . . . Or perhaps real?"  The general question — how can one tell when a leaked poll is real? — is a good one.  Expect it to come up again and again during the 2006 campaign.  Let’s take a closer look.

Unfortunately, the nature of the dissemination of polling data allows for the possibility of shenanigans by the ethically challenged.  It is not hard to find examples of "overzealous" partisans who have leaked highly misleading or even fictional results.  How can an ordinary consumer of polling data tell the difference?  MP suggests three rules of thumb:

1) Has the pollster gone "on the record" about the results?   We have all seen stories like the following:  It’s the final weekend before Election Day and a campaign spokesperson is spinning their chances for success.  Invariably, they cite "internal polling" that shows their campaign surging ahead, holding their lead, etc.  How many times have we heard such a statement only to see a totally different result a few days later when all the votes are counted? 

Spin happens.  A good rule of thumb in these situations is to ask whether the pollster who conducted the survey is willing to put their reputation on the line and release and discuss the actual results, or whether a campaign spokesperson or unnamed source is simply characterizing unspecified "internal polling."  MP puts little faith in the latter.

Of course, private campaign pollsters (like MP) often release results when they show good news for our clients.  Such releases typically follow an unwritten Washington convention:  We prepare a one- or two-page memorandum on our company letterhead that summarizes the most favorable highlights of the survey and let our clients distribute the memo as they see fit.  Most statewide and congressional campaigns now send such memos to National Journal’s Hotline ($$), which routinely includes the results in its daily news summary.

Bottom line:  look for an official release, an attribution to the polling firm from a mainstream news source or an on-the-record quotation from the pollster. 

2) Does the pollster disclose their basic methodology? According to the principles of disclosure put out by the National Council on Public Polls (NCPP),** public survey reports should include the following:

  • Sponsorship of the survey;
  • Dates of interviewing;
  • Method of obtaining the interviews (in-person, telephone or mail);
  • Population that was sampled;
  • Size of the sample;
  • Size and description of the sub-sample, if the survey report relies primarily on less than the total sample;
  • Complete wording of questions upon which the release is based; and
  • The percentages upon which conclusions are based.

Most legitimate reports — including the memoranda released by internal campaign pollsters — will meet the NCPP standards for disclosure.  If the survey report does not include this information, take it with a huge grain of salt. 
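
For readers who want to apply that checklist mechanically, here is a minimal Python sketch of the idea.  The field names and the example report are invented for illustration (the conditional subsample item is omitted for simplicity); the actual NCPP standards are prose guidelines, not a data format.

```python
# Minimal sketch: checking a poll write-up against the NCPP items above.
# Field names and the example report are invented for illustration.

NCPP_ITEMS = [
    "sponsor",           # sponsorship of the survey
    "field_dates",       # dates of interviewing
    "mode",              # in-person, telephone or mail
    "population",        # population that was sampled
    "sample_size",       # size of the sample
    "question_wording",  # complete wording of released questions
    "percentages",       # percentages on which conclusions are based
]

def missing_disclosures(report: dict) -> list[str]:
    """Return the NCPP items a report fails to disclose."""
    return [item for item in NCPP_ITEMS if not report.get(item)]

leaked_poll = {"sample_size": 1209, "percentages": {"DeWine": 44, "Brown": 41}}
gaps = missing_disclosures(leaked_poll)
if gaps:
    print("Grain of salt; undisclosed items:", ", ".join(gaps))
```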

3) Does the pollster go beyond the NCPP disclosure standards?  This rule may be of little practical help for ordinary readers, since very few pollsters report more than the basics.  Nonetheless, MP hopes that pollsters will begin to go beyond the sensible NCPP standards and that reporters will begin to ask tougher questions about how polls are done.  Specifically:

  • The sampling frame — did the pollster sample all telephone households (random digit dial, RDD) or use some sort of a list (such as a list of registered voters)?
  • What weighting procedures, if any, were used? 
  • What was the full text and order of all questions released, including all questions that preceded those on which results were based (questions that may have created a response bias)? 
  • What filter questions, if any, were used to screen to the population of interest?
  • If the pollster reported results of "likely voters," how were such voters defined and selected? 
  • What is the demographic composition for the weighted and unweighted samples?  If results are based upon a subsample, what is the demographic composition of that subsample?
  • What was the response rate for the survey? 

Now, MP assumes that other pollsters will question the wisdom of routinely releasing such information, but the point here is simple:  If reporters or readers are in doubt about whether a poll is genuine, they can tell a lot from the pollster’s willingness and ability to disclose this level of detail.  Conversely, if a pollster is not willing to disclose information on such items as the sample frame, the composition of the sample, the way they defined likely voters, the text of screening questions or those preceding the questions of interest, reporters and readers should be highly skeptical. 
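
To make the weighting question above concrete, here is a small, purely illustrative sketch of one common procedure, cell weighting to known population targets.  Every figure is invented.

```python
# Illustrative cell weighting: adjust a sample so its demographic mix
# matches known population targets.  All figures are invented.

sample_shares = {"18-34": 0.15, "35-54": 0.40, "55+": 0.45}      # unweighted mix
population_shares = {"18-34": 0.30, "35-54": 0.38, "55+": 0.32}  # e.g., Census targets

# Each respondent in a group gets weight = target share / sample share
weights = {g: population_shares[g] / sample_shares[g] for g in sample_shares}

support = {"18-34": 0.58, "35-54": 0.50, "55+": 0.44}  # invented candidate support

unweighted = sum(sample_shares[g] * support[g] for g in support)
weighted = sum(sample_shares[g] * weights[g] * support[g] for g in support)

print(f"unweighted: {unweighted:.1%}, weighted: {weighted:.1%}")
# -> unweighted: 48.5%, weighted: 50.5%
```

Real surveys use more elaborate procedures (raking across several variables at once, for example), which is exactly why the disclosure matters: different choices can move the topline.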

An important caveat:  These rules of thumb are only useful in distinguishing real surveys from spin.  Judging the quality of a survey is more difficult.  For example, a pollster may disclose every last detail of their methodology, but if they do not begin with a probability sample (in which every member of the population has an equal or known chance of being included), the results are questionable.  (Judging survey quality is a very big subject, but readers may want to consult the suggestions of various pollsters as gathered by The Hotline and posted by MP back in April.)
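
As a worked illustration of that "equal or known chance" requirement, the sketch below draws a simple random sample from an invented frame and computes the familiar 95 percent margin of error for a proportion; the frame and sample size are made up for the example.

```python
import math
import random

# Invented example: a frame of 100,000 "telephone households," each with
# an equal chance of selection -- the defining property of a simple
# random (probability) sample.
frame = range(100_000)
n = 600
respondents = random.sample(frame, n)  # equal-probability draw

# Conventional 95% margin of error for a proportion near 50%
p = 0.5
moe = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"n={n}, margin of error = +/-{moe:.1%}")  # about +/-4.0%
```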

Let’s consider the two examples that readers brought to MP’s attention in the last few days.

The first, an Ohio poll showing results for various match-ups in the 2006 governor’s race, came to MP’s attention via reader PW.  Posted on the blog Polipundit, the poll purportedly showed Democrat Ted Strickland running ahead of all four Republicans tested (Betty Montgomery, Ken Blackwell, Jim Petro and John Kasich), while Democrat Michael Coleman ran ahead of Montgomery and Blackwell but essentially even with Petro and Kasich.  On Thursday, DailyKos posted the same results in virtually identical format, attributed only to "a trusted source," though cautioning readers to "take with appropriate grain of salt."

While the numbers are interesting, this particular "release" fails every one of MP’s rules of thumb.  The two blog items tell us nothing about who sponsored or conducted the poll and virtually nothing about how the survey was conducted.  Survey dates?  Question text?  Sample frame (adults, registered voters, likely voters)?  Who knows?  The sample size specified is also a bit odd: 501 Republicans, 501 independents and some unknown number of Democrats.  We know nothing about how these separate "samples" were combined.  Moreover, I can find no reference to this survey in any mainstream media source, including The Hotline.

Thus, readers should be very, very skeptical about this "poll." I cannot say that the results look "obviously phony," and it seems odd that a conservative blogger like Polipundit would blindly pass along such negative results about the Ohio GOP.  However, we have virtually nothing to reassure us that the poll is real. Without some attribution, MP would not place much faith in it. 
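
To see why the missing detail about combining those subsamples matters, consider the hypothetical sketch below: simply pooling equal-sized party samples implicitly assumes an electorate that is one-third each party, while weighting to a different assumed mix yields a different topline.  Every number here is invented, including the Democratic subsample size, which the "release" never specified.

```python
# Hypothetical illustration of why the combination method matters.
# Every figure is invented, including the Democratic subsample size,
# which the blog "release" never disclosed.

subsamples = {
    "Republicans":  {"n": 501, "support": 0.20},
    "Independents": {"n": 501, "support": 0.48},
    "Democrats":    {"n": 501, "support": 0.84},  # assumed for illustration
}

# Naive pooling: every interview counts equally, so equal-sized
# subsamples imply an electorate that is one-third each party
total_n = sum(g["n"] for g in subsamples.values())
pooled = sum(g["n"] * g["support"] for g in subsamples.values()) / total_n

# Weighting to an assumed electorate mix instead (also invented)
electorate = {"Republicans": 0.40, "Independents": 0.25, "Democrats": 0.35}
weighted = sum(electorate[k] * subsamples[k]["support"] for k in subsamples)

print(f"pooled: {pooled:.1%} vs. weighted: {weighted:.1%}")  # 50.7% vs. 49.4%
```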

DailyKos posted results from another Ohio poll that at first glance appears more legitimate.  This one, "leaked" by a "DC source," showed a surprisingly close race in a theoretical U.S. Senate match-up between Republican incumbent Mike DeWine and Democratic Congressman Sherrod Brown.  The Kos item tells us that the poll was conducted for the Democratic Senatorial Campaign Committee (DSCC) by well-known Democratic pollster Diane Feldman.  It specifies the total number of interviews (1,209) and verbatim text and results from three questions.  Oddly, it specifies a single date (6/27), which may be a release date or the final night of interviewing rather than the complete field period (MP knows Feldman well enough to doubt she would attempt to complete 1,200 interviews in a single evening).  On the whole, this report meets many (though not all) of the NCPP disclosure standards.  So far, so good.

But remember rule of thumb #1.  Is Feldman quoted on the record anywhere?  Do we have a release on Feldman or DSCC letterhead?  No.  Also, consider that if the DSCC had put out an official release, it would have appeared in The Hotline this week; The Hotline has published no such poll.  For whatever reason, neither the sponsor nor the pollster has chosen to confirm these numbers for the record.  Until that happens, readers should treat these results with caution.  We may not know the full story.

**UPDATE (10/17/2007): The code of professional ethics of the American Association for Public Opinion Research (AAPOR) offers similar disclosure standards that now appear on their web site, along with a helpful set of frequently asked questions about those standards.

Mark Blumenthal

Mark Blumenthal is the principal at MysteryPollster, LLC. With decades of experience in polling using traditional and innovative online methods, he is uniquely positioned to advise survey researchers, progressive organizations and candidates, and the public at large on how to adapt to polling’s ongoing reinvention. He was previously head of election polling at SurveyMonkey, senior polling editor for The Huffington Post, co-founder of Pollster.com and a long-time campaign consultant who conducted and analyzed political polls and focus groups for Democratic Party candidates.