Polling the Props: The Columbus Dispatch Poll


While we wait for election returns – yes, actual returns this time, not leaked exit polls – in the key off-year contests, MP finds himself coming back to the difficulty of conducting pre-election polling for ballot propositions and referenda.  And for all the different methodologies being tried in California, MP wonders why no one there tried the back-to-the-future methodology of the Columbus Dispatch mail-in poll.

I wrote about the Dispatch poll just before the election last year, noting that an academic study published in Public Opinion Quarterly (POQ), the journal of academic survey methodology, found that the Dispatch poll had an average error of only 1.6 percentage points between 1980 and 1994, compared to average errors of 5 percentage points or higher for comparable telephone polls.

This afternoon, while waiting for results to come in, MP re-read a companion study by the authors of the POQ piece (Visser, Krosnick, Marquette and Curtin, 2000).  Two passages stand out.  The first reminds us that, in Ohio from 1980 to 1996, polling on statewide issues was much less accurate than polling on other races:

The statewide candidate races (average error = 1.5 percentage points) were predicted more accurately than were the other contests.  The average error for the local candidate races was 2.7 percentage points, as compared to 5.8 percentage points for the statewide referenda and 3.9 percentage points for the local referenda.  Some of the errors for the referenda were quite substantial, peaking at 12.8 percentage points [p. 229, emphasis added].

The second tells us that the mail-in poll did slightly better at predicting the outcome of statewide issue referenda:

The average error for the Dispatch forecasts of these referenda was 5.4 percentage points, compared to 7.2 percentage points for the telephone surveys.  Thus, it appears that although referenda were more difficult to forecast than candidate races using either method, the mail surveys still outperformed the telephone surveys when doing so.  However, in one of the six referenda examined (Victims' Rights in 1994), the telephone survey was much more accurate than the mail poll, so the superiority of the mail polls is not universal [emphasis added, pp. 231-232].
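The excerpts do not spell out exactly how the authors compute "average error," but a common convention in this literature is the mean absolute difference, in percentage points, between a poll's final estimate and the certified result, averaged across contests. A minimal sketch of that metric, using hypothetical numbers rather than the studies' actual data:

```python
def average_forecast_error(predicted, actual):
    """Mean absolute difference, in percentage points, between each
    poll's final estimate and the certified election result."""
    errors = [abs(p - a) for p, a in zip(predicted, actual)]
    return sum(errors) / len(errors)

# Hypothetical illustration only: three referenda where the final
# poll's "yes" share missed the certified tally by 2, 6, and 12 points.
poll_yes   = [48.0, 55.0, 40.0]
actual_yes = [50.0, 61.0, 52.0]
print(round(average_forecast_error(poll_yes, actual_yes), 1))  # 6.7
```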

Why would a seemingly old-fashioned mail-in poll do so well?  My post last year reviewed some of the reasons provided in the POQ piece:

  • It involves a very large sample size. The survey released over the weekend had 2,880 respondents and sampling error of 2% (a quick check of that figure appears after this list). No other Ohio poll released this week has even half as many respondents.
  • It is a better solution to the problem of "modeling" likely voters: "The mail survey procedure recruited samples of respondents that more closely resembled the actual voting population, perhaps because the act of completing a self-administered questionnaire is in many ways comparable to the act of voting" (emphasis added).
  • The lack of an undecided option: "Having to allocate those undecided voters introduced error into the telephone polls." 
  • The mail survey was a closer facsimile of the ballot than the questions pollsters typically use to ask about vote preference.
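On the first bullet: the reported 2% sampling error is consistent with the standard 95%-confidence formula for a proportion under simple random sampling, evaluated at the worst case p = 0.5. This is the generic textbook calculation, not anything specific to the Dispatch's weighting or design:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion,
    assuming simple random sampling (widest at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# 2,880 respondents, as in the survey released over the weekend:
print(round(margin_of_error(2880) * 100, 2))  # 1.83 points, reported as 2%
```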

I also discussed why one of the presumed shortcomings of the mail-in survey, its purportedly low response rate, now compares favorably to other polls given the way telephone response rates have plummeted.  See the original post for all the details.

Now some came away from last year's election thinking the Dispatch poll was not so accurate in 2004.  That impression may have been a consequence of MP's boosterism, but it does not hold up.  The Dispatch poll was as accurate as could be expected.  The final poll showed a 50-50% dead heat.  The final result had Bush winning 51% to 49%.  That result fell well within the survey's 2% margin of error (although another poll conducted by telephone, the University of Cincinnati Ohio Poll, was even more accurate, nailing the 51-49% result).
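In code, the "within the margin of error" claim reduces to a single comparison (figures taken from the paragraph above):

```python
# Final poll: 50% for Bush; certified result: 51%; margin: ~2 points.
estimate, actual, moe = 50.0, 51.0, 2.0
print(abs(actual - estimate) <= moe)  # True: the miss is inside sampling error
```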

Today Ohio, like California, is holding a special election to consider five ballot initiatives, and the Columbus Dispatch released a survey on Sunday with results for all five issues.  As always, the questions on the Dispatch poll used the exact language that appears on the Ohio ballot, although in this case the pollsters presumably provided respondents with a "don't know" option (they certainly reported answers for "don't know" that varied from 9% to 25%).  Those who, like MP, are interested in which methodologies will perform best should once again keep a close eye on Ohio.
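For readers wondering what "allocating" those don't-know respondents involves, one common approach (only one of several, and nothing here indicates the Dispatch uses it) is to split the undecided share between the yes and no camps in proportion to their decided support. A purely illustrative sketch:

```python
def allocate_undecided(yes, no, undecided):
    """Split the undecided share proportionally between yes and no.
    A common allocation rule, illustrative only; not a documented
    Columbus Dispatch procedure."""
    decided = yes + no
    return (yes + undecided * yes / decided,
            no + undecided * no / decided)

# Hypothetical issue polling 45% yes, 30% no, 25% don't know:
print(allocate_undecided(45, 30, 25))  # (60.0, 40.0)
```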

Mark Blumenthal

Mark Blumenthal is the principal at MysteryPollster, LLC. With decades of experience in polling using traditional and innovative online methods, he is uniquely positioned to advise survey researchers, progressive organizations and candidates, and the public at large on how to adapt to polling's ongoing reinvention. He was previously head of election polling at SurveyMonkey, senior polling editor for The Huffington Post, co-founder of Pollster.com, and a long-time campaign consultant who conducted and analyzed political polls and focus groups for Democratic Party candidates.