What About Those German Exit Polls?


A commenter asked last week, "Why are exit polls so much more accurate in Europe?" This is a question worth considering, because all surveys are not created equal. Differences in sample sizes, response and coverage rates, and the experience and training of interviewers can tell us a lot about the potential for survey error.

I cannot claim personal expertise in European exit polls, but a Google search quickly turned up a rather contrary opinion published earlier this year by the ACE project (an acronym for Administration and Cost of Elections, a joint project funded by the UN and the US Agency for International Development):

[Exit poll] reliability can be questionable. One might think that there is no reason why voters in stable democracies should conceal or lie about how they have voted, especially because nobody is under any obligation to answer in an exit poll. But in practice they often do. The majority of exit polls carried out in European countries over the past years have been failures [emphasis added].

Presumably, the newfound belief in the accuracy of European exit polls comes from Steven Freeman’s evolving paper, "The Unexplained Exit Poll Discrepancy." Freeman placed great emphasis on the "highly reliable" results from Germany. He showed results from exit polls conducted by the Forschungsgruppe (FG) Wahlen on behalf of the ZDF television network that were off by only 0.26% over the last three national elections.

Freeman also cited another "consistently accurate" survey, this one conducted by student volunteer interviewers in Utah. In 2004, according to Freeman, the Utah Colleges Exit Poll came within 0.2% of the actual result in the presidential race. "Consistently accurate exit poll predictions from student volunteers," Freeman concluded, "including in this presidential election, suggest we should expect accuracy, within statistical limits, from the world’s most professional exit polling enterprise."

Four weeks ago, after I posted my original critique of his paper, Freeman called seeking further input. My advice included the strong suggestion that he check on the methodologies used for the German and Utah polls before implying that the NEP surveys were comparable. As of this writing, Freeman’s paper still lacks any reference to basic methodological information regarding the German and Utah exit polls.

I made some inquiries about both, which I summarize below. For the sake of comparison, let’s begin with the methodological details of the National Election Pool (NEP) exit polls:

NEP Exit polls

  • State exit polls sampled 15 to 55 precincts per state, which translated into 600 to 2,800 respondents per state. The national survey sampled 11,719 respondents at 250 precincts (see the NEP methodology statements here and here).
  • NEP typically sends one interviewer to each polling place. They hire interviewers for one day and train them on a telephone conference call.
  • The interviewers must leave the polling place uncovered three times on Election Day to tabulate and call in their results. They also suspend interviewing altogether after their last report, with one hour of voting remaining.
  • The response rate for the 2000 exit polls was 51%, after falling gradually from 60% in 1992. NEP has not yet reported a response rate for 2004.
  • Interviewers often face difficulty standing at or near the exit to polling places. Officials at many polling places require that the interviewers stand at least 100 feet from the polling place along with "electioneering" partisans.

German Exit Polls (by FG Wahlen)

Dr. Freeman’s paper includes exit poll data conducted by the FG Wahlen organization for the ZDF television network. I was able to contact Dr. Dieter Roth of FG Wahlen by email, and he provided the following information:

  • They use bigger sample sizes: for state polls, they sample 80 to 120 polling places and interview 5,000 to 8,000 respondents. Their national survey uses up to 22,000 interviews.
  • They use two "well trained" interviewers per polling place, and cover voting all day (8:00 a.m. to 6:00 p.m.) with no interruptions.
  • Interviewers always stand at the exit door of the polling place. FG Wahlen contacts polling place officials before the election so that the officials know interviewers will be coming. If polling place officials will not allow the interviewers to stand at the exit door, FG Wahlen drops that polling place from the sample and replaces it with another sample point.
  • Their response rates are typically 80%; the rate was 83.5% for the 2002 federal election.
  • The German equivalent of the US Census Bureau conducts its own survey of voters within randomly selected polling places. This survey, known as the "Repräsentativ-Statistik," provides high-quality demographic data on the voting population that FG Wahlen uses to weight its exit polls.
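The weighting step in that last bullet amounts to simple post-stratification: each demographic group's weight is its share in the official statistics divided by its share in the exit poll sample. A minimal sketch (the age groups and numbers here are hypothetical, chosen only to illustrate the mechanism):

```python
def poststratify(sample_counts, population_shares):
    """Post-stratification weights: population share / sample share
    for each group, so the weighted sample matches the population."""
    n = sum(sample_counts.values())
    return {g: population_shares[g] / (sample_counts[g] / n)
            for g in sample_counts}

# Hypothetical exit poll that over-sampled older voters
sample = {"18-34": 200, "35-59": 500, "60+": 300}   # interviews
census = {"18-34": 0.30, "35-59": 0.45, "60+": 0.25}  # official shares
weights = poststratify(sample, census)
# Older voters get weights below 1, under-sampled young voters above 1,
# and the weighted total still equals the original sample size.
```

Applying these weights leaves the sample size unchanged but forces the demographic mix of the weighted interviews to match the official population figures.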

Dr. Roth also added the following comment: "I know that Warren Mitofsky’s job is much harder than ours, because of the electoral system and the more complicated structure in the states."

Utah Colleges Exit Poll

  • The Utah poll typically consists of 90 precincts and between 6,000 and 9,000 respondents. Compare that to the NEP exit poll for Utah in 2004, which sampled 15 precincts and 828 respondents.
  • Interviewers are student volunteers from local universities who attend an in-person training seminar.
  • They assign eight student interviewers to each precinct, working in two shifts of four. The larger number of interviewers allows coverage of all exits at each polling place all day long.
  • The response rate has typically been 60%, although the 2004 survey attained a response rate of roughly 65%.

In short, there are sound methodological reasons why the German and Utah exit polls typically obtain more accurate results: they do more interviews, achieve better coverage and higher response rates, and use arguably better-trained interviewers.
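The sample-size advantage is easy to quantify. A back-of-the-envelope calculation of the 95 percent sampling margin of error, treating each poll (unrealistically) as a simple random sample and ignoring the cluster design effect that inflates error in precinct-based exit polls, shows what the larger samples buy:

```python
import math

def moe(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n,
    at the worst case p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# Sample sizes taken from the figures cited above
samples = {
    "NEP state poll (smallest)": 600,
    "NEP state poll (largest)": 2800,
    "NEP national": 11719,
    "FG Wahlen national (max)": 22000,
    "Utah Colleges (max)": 9000,
}
for name, n in samples.items():
    print(f"{name:26s} n={n:6d}  MOE = +/-{moe(n):.1%}")
```

Even this oversimplified arithmetic shows the smallest NEP state samples carrying roughly four times the sampling error of the national surveys; real exit polls, with their clustered precinct samples, do somewhat worse than these figures suggest.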

Of course, the lower response rates and coverage problems, in and of themselves, do not explain why the NEP exit polls had a slight Kerry bias this year. If the error was in the surveys, then something made Kerry supporters either slightly more cooperative or more likely to be interviewed. The fact that response and coverage rates were lower did not cause the error; it just made the potential for error that much greater.
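That mechanism is easy to illustrate with arithmetic: if one candidate's supporters cooperate at even a slightly higher rate, the completed interviews overstate that candidate's share. The response rates below are hypothetical, chosen only to show the size of the effect:

```python
def observed_share(true_share, rate_a, rate_b):
    """Share of group A among completed interviews when group A
    (true_share of all voters) responds at rate_a and everyone
    else responds at rate_b -- the differential nonresponse effect."""
    completes_a = true_share * rate_a
    completes_b = (1 - true_share) * rate_b
    return completes_a / (completes_a + completes_b)

true_kerry = 0.483  # Kerry's approximate share of the national vote
# Hypothetical rates: Kerry voters cooperate at 56%, others at 50%
obs = observed_share(true_kerry, 0.56, 0.50)
print(f"observed Kerry share: {obs:.1%}, bias: {obs - true_kerry:+.1%}")
```

A six-point gap in cooperation rates is enough, in this sketch, to shift the observed Kerry share by nearly three points; equal rates would produce no bias at all, no matter how low they fall.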

12/20: Corrected misspelling of Wahlen in original.

[FAQ on Exit Polls]

Mark Blumenthal

Mark Blumenthal is the principal at MysteryPollster, LLC. With decades of experience in polling using traditional and innovative online methods, he is uniquely positioned to advise survey researchers, progressive organizations and candidates, and the public at large on how to adapt to polling’s ongoing reinvention. He was previously head of election polling at SurveyMonkey, senior polling editor for The Huffington Post, co-founder of Pollster.com, and a long-time campaign consultant who conducted and analyzed political polls and focus groups for Democratic party candidates.