Debates II: Instant Polls


As I mentioned in the last post, history shows that the media’s coverage of presidential debates typically has more impact on voter preference than the debate itself. So, as is often the case, we will not know for sure what impact tonight’s debate has had until results come back from new surveys over the next week. However, if you are reading this and you’re a political junkie like me, you just can’t wait until next week. Are the instant polls the networks will do tonight worthy of our attention? My take follows on the jump page.

Instant polls done to assess the impact of the debate face two big challenges.

The first was anticipated by this set of questions posed by astute commenter Simka, who asked:
· What about people who work at night when pollsters call?
· What about people who go to school at night?
· What about people who work two jobs?
· What about people who work late?
· What about people who work swing shifts?
· What about people who are traveling? How many Americans are on the road at any given time for work or family?

This question identifies one of the biggest sources of non-response (the other being people who simply hang up): at any given time, not everyone is home. Pollsters have to be persistent, calling back over multiple nights at different times to reach a reasonable share of those who are not always home. Although the specific procedures vary, most pollsters make at least four call attempts spread over at least two to three evenings. This doesn’t reach everyone, but it comes close. Without rigorous “call-backs,” the sample will be biased against people who are often away from home. That is why you rarely see one-night polls done nationally.

That problem is greater for a post-debate poll since, duh, people at home tonight will be more likely to watch the debate. As a result, most of the surveys done tonight will concentrate on those who actually watched the debate.

The second problem is one of interpretation. More often than not, the debate itself serves to reinforce voters’ preexisting preferences. In other words, all things being equal, Bush supporters will come away more impressed with Bush, Kerry supporters more impressed with Kerry. Thus, if Bush goes into the debate with a five-point margin among those in the sample, and the debate does nothing to change initial preferences, expect the question asking who “won” the debate to show a similar five-point margin.
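For readers who like to see the arithmetic, here is a minimal Python sketch of that reinforcement scenario. The vote shares and “loyalty” rates are made-up numbers chosen purely for illustration, not figures from any actual poll.

```python
# Hypothetical reinforcement scenario: if the debate only reinforces
# existing preferences, the "who won" margin tracks the pre-debate margin.

bush_share, kerry_share = 0.525, 0.475   # pre-debate split: Bush +5 points

# Suppose most supporters judge their own candidate the winner, with the
# rest calling it even, and no one crosses over to the other candidate.
bush_loyalty, kerry_loyalty = 0.75, 0.75

bush_won = bush_share * bush_loyalty     # share saying Bush won
kerry_won = kerry_share * kerry_loyalty  # share saying Kerry won

print(f"pre-debate vote margin: {100 * (bush_share - kerry_share):+.1f} points")
print(f"'who won' margin:       {100 * (bush_won - kerry_won):+.1f} points")
```

With equal loyalty on both sides, the “who won” margin is simply the pre-debate margin scaled by the loyalty rate, which is why a lopsided “winner” number can say more about who was ahead going in than about the debate itself.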

The most sophisticated way to try to measure the impact of the debate itself would be to contact a random sample just before the debate, ask about their vote preference and attitudes toward the candidates, then call back just after the debate, repeat the same questions and ask the respondents to judge each candidate’s performance. Such a design allows the pollster to compare the reactions of pre-debate Kerry supporters to pre-debate Bush supporters and measure whether either candidate gained or lost support among individual respondents.
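As a sketch of how that panel comparison works, here is a toy Python example. The paired pre/post responses are fabricated solely to show the tabulation; real panel analyses would of course use actual interview data and far larger samples.

```python
# Toy pre/post panel: each tuple is one respondent's preference before
# and after the debate (fabricated data for illustration only).
respondents = [
    ("Bush", "Bush"), ("Bush", "Bush"), ("Bush", "Kerry"),
    ("Kerry", "Kerry"), ("Kerry", "Kerry"), ("Kerry", "Bush"),
    ("Kerry", "Kerry"), ("Bush", "Bush"),
]

def retention(camp):
    """Share of a candidate's pre-debate supporters who still back him after."""
    pre_camp = [post for pre, post in respondents if pre == camp]
    return pre_camp.count(camp) / len(pre_camp)

for camp in ("Bush", "Kerry"):
    print(f"{camp} supporters retained: {retention(camp):.0%}")
```

Because each respondent is interviewed twice, movement can be measured at the individual level rather than inferred from two separate cross-sections, which is what makes this design the gold standard for isolating the debate’s own effect.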

That is exactly the design used by the ABC News Polling Unit for all four of the presidential and vice-presidential debates in 2000. Here are a few examples of their results: The first debate looked like a dead heat (Gore fell in the polls in the week that followed due to coverage of the debate, but reaction to the debate itself was more favorable). Although slightly more Gore supporters (79%) than Bush supporters (70%) thought their man won, Bush’s support increased by 1 percentage point during the debate. After the second debate, 76% of Bush supporters judged their man the winner vs. 63% of Gore supporters who thought their man won. Bush’s margin grew from 10 to 13 points during the debate. (Press releases for all four of these surveys are still available online here, here, here and here).

Of course, one drawback of ABC’s approach is that, even with respondents waiting by the phone for the second interview, it takes at least an hour or so to complete the calls and tabulate the results. Some respondents may also watch the commentary that follows the debate before completing their interview.

The CBS polling unit tried a different approach four years ago, and my sources tell me they will use the same approach tonight. In 2000, CBS conducted a survey online with a company called Knowledge Networks, which maintains a nationally representative “panel” of households that agree to do surveys on the Internet. What makes Knowledge Networks unique is that it recruits panel members with traditional random digit dial (RDD) sampling methods, and when a household without Internet access agrees to participate, it provides that household with free access to the Internet via Web TV. So, in theory at least, this approach allows a random sample of all US households.

The advantage of the KN online poll is that every selected respondent receives a pre-debate invitation, so they can log on and fill out the survey immediately after the debate concludes. In 2000, for example, 617 surveys had been completed within 15 minutes of the conclusion of the debate (see the releases from 2000 by CBS, here and here, and by Knowledge Networks. Also, the raw data from these surveys are available to scholars here).

One disadvantage, at least in the way the CBS survey was done four years ago, was that they either did not do a pre-debate interview or did not report the post-debate results that way. Perhaps the design this year will be different.

The bottom line: If you are willing to be patient, the best methodology is the one that ABC News used four years ago, which interviews voters before and after the debate and compares results among pre-debate Kerry supporters and pre-debate Bush supporters.

Mark Blumenthal

Mark Blumenthal is the principal at MysteryPollster, LLC. With decades of experience in polling using traditional and innovative online methods, he is uniquely positioned to advise survey researchers, progressive organizations, candidates, and the public at large on how to adapt to polling’s ongoing reinvention. He was previously head of election polling at SurveyMonkey, senior polling editor for The Huffington Post, co-founder of Pollster.com, and a long-time campaign consultant who conducted and analyzed political polls and focus groups for Democratic Party candidates.