Sneaky Plame Poll? Part II


So, as promised in Part I, let’s continue to consider the post on Redstate.org last week by Jay Cost (aka the Horserace Blogger)  that sharply criticized a recent ABC news poll on the Plame/Wilson/Rove imbroglio.  Cost had some additional criticisms I did not address, but will below.  Two are specific to this poll, but one is of much broader general interest.  As it happens, reader FR emailed to raise a similar query:  "Why should we trust public opinion polls on issues where respondents probably know very little about the topic?"   That is a very good question.

First, a review.  Cost’s post zeroed in on three questions from the ABC poll.  I’ve copied those questions below, in the order they were asked, along with a few others that ABC asked about Rove/Plame et al. (for full results, see the ABC pdf summary):

1. As you may know, a federal prosecutor is investigating whether someone in the White House may have broken the law by identifying an undercover CIA agent to some news reporters. One reporter has gone to jail rather than reveal her source. How closely are you following this issue – very closely, somewhat closely, not too closely or not closely at all?

2. Do you think this is a very serious matter, somewhat serious, not too serious or not serious at all?

3. Do you think the White House is or is not fully cooperating with this investigation?

4. It’s been reported that one of George W. Bush’s closest advisers, Karl Rove, spoke briefly with a reporter about this CIA agent. If investigators find that Rove leaked classified information, do you think he should or should not lose his job in the Bush administration?

5. Do you think the reporter who has gone to jail is doing the right thing or the wrong thing in not identifying her confidential source in this case?

Cost had criticisms about the first question that we discussed in the last post.  He also raised other objections.  I’d like to comment on three:

A) The ABC release did not include "any kind of cross-tabulation to see if those who are not paying attention are the ones who think it’s serious."   He goes on to  speculate that the 47% that are not paying attention (on Q1) might be the bulk of the 47% that thinks the White House is not cooperating (on Q3).

It is certainly true that ABC did not provide complete cross-tabulations of these questions by the attentiveness question, but they did characterize what such cross-tabs would show.  As a commenter on RedState.org points out, the ABC release did include the following text:

Those paying close attention (who include about as many Republicans as Democrats) are more likely than others to call it very serious, to say the White House is not cooperating, to say Rove should be fired if he leaked, and to say Miller is doing the right thing.

So, Cost guessed wrong.  On this count, ABC is not guilty of trying to "make it seem like" the public is less happy with the White House than it is.

Now, MP certainly agrees (and apparently, so do the analysts at ABC) that such a cross-tab analysis is appropriate.  MP would always rather see more data than less, although in this case ABC certainly provided enough information to allay Cost’s suspicions.
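To make the idea concrete, here is a minimal sketch of what such a cross-tab involves.  The respondent data below is entirely made up (ABC did not release its respondent-level microdata); the point is only to show how attentiveness and perceived seriousness get tabulated against each other.

```python
from collections import Counter

# Hypothetical respondent-level data -- NOT the actual ABC microdata.
# Each tuple pairs a respondent's attentiveness with their view of
# how serious the matter is.
responses = [
    ("closely", "very serious"), ("closely", "very serious"),
    ("closely", "not serious"),
    ("not closely", "very serious"), ("not closely", "not serious"),
    ("not closely", "not serious"),
]

# Cross-tabulate: count each (attentiveness, seriousness) combination.
crosstab = Counter(responses)

# Within each attentiveness group, what share calls it "very serious"?
for group in ("closely", "not closely"):
    total = sum(n for (g, _), n in crosstab.items() if g == group)
    serious = crosstab[(group, "very serious")]
    print(f"{group}: {serious / total:.0%} say very serious")
```

With real data, a table like this is exactly what would confirm (or refute) Cost’s guess about who is driving the "serious" and "not cooperating" numbers.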

On the other hand, MP does not agree with Cost when he asks rhetorically, "Why should we care what the uninformed think on the matter?"   Two reasons:  First, in this instance at least, ABC asked about attentiveness, not information level (although one is a reasonable surrogate for the other).  Second, people will sometimes possess strong opinions about issues they are not following closely. 

Consider, for example…Jay Cost.  He tells us in his opening line that, "I really have no interest in this Plame/Wilson/Rove ‘scandal.’"  He may not have any interest, but he certainly seems to have an opinion (the quotation marks around the word scandal seem like a pretty good hint).  How would he feel if a pollster simply ignored his opinion (and those of others with similar views) just because his interest level is low?   I’m pretty sure I know the answer. 

B) Cost argues that the information about the Rove/Plame affair provided in the first question of the series creates a "frame" that influences respondent answers to the questions that follow.

Here MP must concede that Cost may have a point.  Survey methodologists have shown that "order effects" can occur.  Put another way, questions asked early in a survey can affect the answers provided to questions that follow.  We noted a few weeks back that,

Small, seemingly trivial differences in wording can affect the results.  The only way to know for certain is to conduct split sample experiments that test different versions of a question on identical random samples that hold all other conditions constant.

Unfortunately, the academic research on this issue is relatively limited. We know that order effects can occur, but often they do not.  The only way to know for sure is with extensive pre-testing and split sample experiments, which public and campaign pollsters rarely have the time or budget to conduct.  So we try to follow some general rules:  We try to write questionnaires to go from the general to the specific, from open-ended questions to those that provide answer categories, from those that provide little or no information to those that provide a great deal. 
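For readers curious what "testing" a wording effect in a split sample actually means: the standard check is a two-proportion z-test on the results from the two half-samples.  The sketch below uses invented numbers (a hypothetical version A vs. version B of a question), not any poll discussed here.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Z statistic for the difference between two sample proportions,
    using the pooled standard error -- the usual check on whether a
    split-sample wording experiment shows a real difference."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical split-sample result: wording A draws 52% agreement from
# 500 respondents, wording B draws 45% from the other 500.
z = two_prop_z(0.52, 500, 0.45, 500)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests a real wording effect at p < .05
```

A 7-point gap on half-samples of 500 clears the conventional significance bar; smaller gaps on smaller samples typically would not, which is one reason order and wording effects are so hard to pin down without a deliberate experiment.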

MP will grant that he is a bit uncomfortable with the amount of information provided in the first question.  We also tend to agree with Cost that it is odd to ask respondents if they consider this a "serious matter," after informing them that it involves breaking the law "by identifying an undercover CIA agent," and that a reporter has already gone to jail.  How is that not "serious?"  Nevertheless, MP doubts that the first two questions provide anywhere near the sort of bias or "framing" effect that Cost hypothesizes. 

As for the other questions, we can speculate endlessly.  Different pollsters will take different approaches.  Consider the recent survey by Gallup on this issue, released earlier this week (results also posted here).  They found results consistent with the ABC poll on how closely Americans are following the issue, but on a follow-up, found that 40% think Bush should fire Rove.  On the ABC poll, 75% said Bush should fire Rove "if investigators find that Rove leaked classified information."  Very different results, but also very different questions. 

C) The third and most important question that Cost raises is a more general one:  Can we trust any polls that ask about subjects about which respondents are poorly informed?

Cost argues:

Political scientists have found that when people are not paying much attention to an issue, they are quite susceptible to "framing effects" that can be created through question wording and question ordering (for more detail, see John Zaller’s The Nature and Origins of Mass Opinion, 1992).

This is a good point, although MP does not agree that the ABC pollsters "designed" their poll "to give the impression that the public thinks something that it does not."  However, Cost’s more general point is worth consideration:  Just what should we make of polls about issues about which the public is poorly informed? 

Arguing that we should never poll on such issues is a non-starter.  The market will not tolerate it.  We follow (and argue about) poll questions on issues like these because we care about them.  Telling political junkies to ignore polls on such topics is like asking us to stop blinking.   Consider the blogger who warned on Election Day, "don’t pay attention to those leaked exit polls."  That sure worked.   

More importantly, political partisans are usually interested in how to persuade, how to move public opinion, not just where it stands now.  So we have good reason to want to gauge how the public will react to new information.  We just need to be careful in reporting the results to distinguish between questions that measure how the public feels right now, and those that provide a "projective" sense of how they might feel in the future. 

More specifically, MP has two pieces of advice for what to make of polls about issues on which the public appears to be poorly informed:

  • Be careful!  Pollsters can push respondents easily on such issues, and results can be very sensitive to question wording.  Any one poll may give a misleading impression.
  • As we have suggested before, look at many different polls.  No one pollster will have a monopoly on wisdom.  But use a resource like the Polling Report or the Hotline Poll Track Archives ($$), and you will begin to "asymptotically approach the truth" (as our friend Kaus often puts it). 
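One simple way to "look at many different polls" is to pool them.  The sketch below, with made-up results from three hypothetical surveys asking a comparable question, shows a sample-size-weighted average and the margin of error on the pooled estimate.  (Treating several polls as one big simple random sample is a simplification, since question wording, field dates, and design effects all differ.)

```python
import math

# Hypothetical results: (share answering "yes", sample size) for three
# polls asking a comparable question.  Not actual published numbers.
polls = [(0.47, 1000), (0.44, 900), (0.49, 1100)]

# Sample-size-weighted average across the polls.
total_n = sum(n for _, n in polls)
avg = sum(p * n for p, n in polls) / total_n

# Standard error of the pooled estimate, treating the combined polls
# as one simple random sample (a simplifying assumption).
se = math.sqrt(avg * (1 - avg) / total_n)

print(f"pooled estimate: {avg:.1%} +/- {1.96 * se:.1%}")
```

The payoff is in the margin of error: three polls of roughly 1,000 each, pooled, carry about half the sampling error of any one of them, which is the arithmetic behind "asymptotically approaching the truth."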

 

Mark Blumenthal

Mark Blumenthal is the principal at MysteryPollster, LLC. With decades of experience in polling using traditional and innovative online methods, he is uniquely positioned to advise survey researchers, progressive organizations and candidates and the public at-large on how to adapt to polling’s ongoing reinvention. He was previously head of election polling at SurveyMonkey, senior polling editor for The Huffington Post, co-founder of Pollster.com and a long-time campaign consultant who conducted and analyzed political polls and focus groups for Democratic party candidates.