AAPOR: Day Two


A few quick notes from some of the sessions I attended yesterday at the AAPOR conference:

Keep in mind that the "working papers" presented at AAPOR (actually most are currently PowerPoint presentations) are just that – works in progress.  Also, I can only sit in on one of the eight presentations that typically occur at any given time, so what follows is just a tiny fraction of the amazing variety of research findings presented today.  Finally, I am sharing my own highly subjective view of what’s "interesting."  I’m sure others here might have a different impression. 

Party ID – I have written previously about the idea that party identification is an attitude that can theoretically show minor change in the short term.  This morning, Trevor Tompson and Mike Mokrzycki of the Associated Press presented results from an experiment showing that the way survey respondents answer the party identification question can change during the course of a single interview.

Throughout 2004, the IPSOS survey organization randomly divided each of their poll samples, asking the party ID question for half of the respondents at the end of the survey (where almost all public pollsters ask it) and for the other half at the very beginning.  When they asked the party question at the end of the questionnaire, they found that consistently more respondents identified themselves as "independents" or (when using a follow-up question to identify "leaners") as "moderate Republicans."  They also found that the effect was far stronger in surveys that asked many questions about the campaign or about President Bush than in surveys on mostly non-political subjects.  Asking party identification first also had an effect on other questions later in the interview.  For example, when they asked party identification first, Bush’s job rating was slightly lower (48% vs. 50%) and Kerry’s vote slightly higher (47% vs. 43%).

A few cautions:  First, these results may be unique to the politics of 2004, the content of the IPSOS studies, or both.  The effect might have been different for, say, a Democratic president at a time of peace and prosperity.  Second, I am told that a similar paper coming tomorrow will present findings with a different conclusion.  Finally — and perhaps most importantly — while the small differences were statistically significant, it is not at all clear which placement gets the most accurate read on party identification.
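For readers curious how one judges whether a split-sample difference like the ones above is "statistically significant," a standard tool is the two-proportion z-test.  The sketch below is purely illustrative — the AP/IPSOS presentation did not report sample sizes here, so the half-sample sizes are hypothetical:

```python
from math import sqrt

def two_prop_z(p1, n1, p2, n2):
    """Two-proportion z-test statistic for a split-sample experiment.

    p1, p2 are the proportions in each half-sample; n1, n2 the sizes.
    |z| > 1.96 corresponds to significance at the conventional 5% level.
    """
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Kerry's vote was 47% with party ID asked first vs. 43% asked last.
# With hypothetical half-samples of 500 each, the gap is not significant:
print(round(two_prop_z(0.47, 500, 0.43, 500), 2))   # z ≈ 1.27

# Pooled across many surveys conducted throughout 2004, the effective n
# is far larger, and the same 4-point gap easily clears the threshold:
print(round(two_prop_z(0.47, 5000, 0.43, 5000), 2))  # z ≈ 4.02
```

This is one reason a difference that looks "small" in any single poll can still be statistically significant when the experiment runs across a full year of samples.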

Response Bias and Urban Counties – Michael Dimock of the Pew Research Center presented some intriguing findings from an examination of whether non-response might result in an under-representation of urban areas.  The basic issue is that response rates tend to be lower in densely populated urban areas and higher in sparsely populated rural ones.  Although the Pew Center uses a methodology that is far more rigorous than most public polls (they will, for example, "call back" numbers at least 10 times to catch those typically away from home), and even though they weight their samples to match demographic characteristics estimated by the US Census, they still found that they under-represented voters from urban areas.  Thus, in 2004, they also adjusted their samples to eliminate this geographic bias.

The result, according to Dimock, was typically a one-point increase in Kerry’s vote and a one-point drop in Bush’s (a two-point reduction in what was usually a Bush lead).  Thus, Pew’s final survey had Bush ahead by a three-point margin (48% to 45%) that more or less nailed the President’s ultimate 2.4% margin in the popular vote.  Without the geographic correction, their final survey would have shown a 5% Bush lead.
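The mechanism behind that correction is simple post-stratification: up-weight the stratum the sample under-represents until it matches a population benchmark.  Here is a minimal sketch with made-up numbers — the strata shares and candidate-support figures below are invented purely to show how reweighting can shrink a lead by a couple of points, not Pew's actual data:

```python
def topline(shares, support):
    """Weighted topline: sum over strata of (stratum share) x (support)."""
    return sum(shares[s] * support[s] for s in shares)

# All numbers below are hypothetical, chosen only to illustrate the mechanism.
sample_shares = {"urban": 0.20, "non_urban": 0.80}  # raw (unweighted) sample
census_shares = {"urban": 0.30, "non_urban": 0.70}  # population benchmark

bush  = {"urban": 0.38, "non_urban": 0.53}
kerry = {"urban": 0.58, "non_urban": 0.41}

lead_raw      = topline(sample_shares, bush) - topline(sample_shares, kerry)
lead_weighted = topline(census_shares, bush) - topline(census_shares, kerry)

print(round(lead_raw * 100, 1))       # Bush lead before the correction
print(round(lead_weighted * 100, 1))  # Bush lead after up-weighting urban strata
```

Because the under-represented urban stratum leans toward Kerry in this toy example, restoring its proper weight narrows the Bush lead — the same direction of effect Dimock described.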

After the presentation, a representative of CBS News pointed out that their survey also made a very similar weighting correction to adjust for geographic bias.  While all of this may sound a bit arcane, it reflects an important change for these public pollsters who rarely weight geographically.

The User’s Perspective – For MP, one very heartening development at this conference was discussion of the idea that, in considering the overall quality of a survey, pollsters need to consider the perspective of the consumers of their data.  Two very prominent figures within the survey research community, Robert Groves of the University of Michigan and Frank Newport of Gallup, both endorsed the view that one important measure of the quality of a survey is its "credibility and relevance" to its audience.  Put another way, these two leaders of the field argued this week that pollsters "need to get more involved" in how users of survey data perceive their work.

For my money, there are no "users" of political survey data more devoted than political bloggers.  As Chris Bowers wrote on MyDD yesterday,

Without any doubt in my mind, I can say that political bloggers are by far the biggest fans of political polling in America today. We are absolutely obsessed with you and what you do. Many of us subscribe to all of your websites. We read your press releases with relish, and write for audiences that are filled with hard-core consumers and devotees of your work. In Malcolm Gladwell’s terminology, political bloggers and the many people who visit and participate in political blogs are public opinion mavens who can almost never consume too much information about the daily fluctuations of the national political landscape.

Chris is absolutely right.  If political pollsters want to understand more about how their most devoted consumers feel, there is no better place to go than the blogosphere.

PS:  Actually, the Chris Bowers quotation above is from the speech he will present by videotape tomorrow at a session at the AAPOR conference on blogging’s impact on polling in which I am also a participant.  Chris put the text of his presentation online, and those interested can view his readers’ reactions to it here.  Also, Ruy Teixeira has posted a summary of the various polling issues he discussed on his blog during the campaign.

5/15 – Typo corrected

Mark Blumenthal

Mark Blumenthal is the principal at MysteryPollster, LLC. With decades of experience in polling using traditional and innovative online methods, he is uniquely positioned to advise survey researchers, progressive organizations and candidates and the public at-large on how to adapt to polling’s ongoing reinvention. He was previously head of election polling at SurveyMonkey, senior polling editor for The Huffington Post, co-founder of Pollster.com and a long-time campaign consultant who conducted and analyzed political polls and focus groups for Democratic party candidates.