Update: The MyDD Poll


The MyDD survey that we discussed last week is now complete, and the site is rolling out the results over the weekend (here, here and here so far, with a detailed description of the methodology here).  To be sure, this is a survey with a partisan sponsor, and some of the questions it asks reflect the activist-liberal views of the MyDD readership.  Those with a different point of view may well see no added value in the MyDD survey, or may see biased intent in some of the questions.  Such is the challenge of surveys with partisan sponsorship.  However, blogger Chris Bowers has endeavored to conduct a credible survey using a professional pollster, a traditional methodology and a sample that represents all Americans.  He has also tried hard not only to meet the usual standards for disclosure but to exceed them significantly.  Whatever you think of the substance of the MyDD questions, Bowers and MyDD deserve credit for their commitment to professionalism and transparency. 

Bowers and Wright briefed me on their efforts on Friday and, not surprisingly, we discussed very little that they have not subsequently disclosed online.  The only exceptions are the remaining results that they will post online over the next few days. 

Here are two quick methodological notes of interest to MP readers.  First, MyDD pollster Joel Wright had considered an unusual method of weighting the data by self-reported party registration (rather than by party identification).  After much consideration, Wright decided against weighting by party, because his method would have raised the percentage of Democrats from 33% to 40% of his sample.  Wright explains in a comment posted on MyDD:

There’s a decent case to make that this [40% Democrat weighting] is correct, from the data I compiled and also a few other sources. However, most other polls don’t show that figure for Dems. That would have caused a controversy. It would have misdirected discussion about the poll into a discussion about proper proportions, etc. We would have been talking about weights instead of data and findings. Accused of partisanship when it’s not so in the least. All in the first time out for the poll. So I made a command decision, a very hard decision to make, late last night to go with conventional method in the best interest of the poll. This time.

Wright has a point about the potential for controversy.  MP certainly would have had some concerns, but as Wright is headed back to the drawing board on his weighting methodology I will save my questions for another day.
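For readers curious how such a weighting adjustment works mechanically, here is a minimal post-stratification sketch in Python.  It is not Wright's actual method: the 33% and 40% Democratic figures come from his comment, but the other party shares, the sample size, and the survey question are invented purely for illustration.

```python
import numpy as np
import pandas as pd

# Hypothetical respondent-level data. The 33% unweighted Democratic share
# matches the figure Wright cites; the Republican/other split, the sample
# size, and the "approve" question are invented for illustration only.
rng = np.random.default_rng(0)
n = 1000
party = rng.choice(["Dem", "Rep", "Other"], size=n, p=[0.33, 0.33, 0.34])
approve = rng.random(n) < np.where(party == "Dem", 0.7,
                          np.where(party == "Rep", 0.3, 0.5))
df = pd.DataFrame({"party": party, "approve": approve})

# Target distribution under the registration-based weighting Wright
# considered (40% Democratic); the other two targets are assumptions.
targets = {"Dem": 0.40, "Rep": 0.31, "Other": 0.29}

# Post-stratification: each respondent's weight is the target share of
# their party group divided by its observed share in the sample.
observed = df["party"].value_counts(normalize=True)
df["weight"] = df["party"].map(lambda p: targets[p] / observed[p])

print("Unweighted approval:", round(df["approve"].mean(), 3))
print("Weighted approval:  ",
      round(np.average(df["approve"], weights=df["weight"]), 3))
```

Because Democrats in this made-up sample answer differently from other respondents, moving their share from 33% to 40% shifts the weighted topline by a couple of points, which is exactly the sort of shift Wright worried would redirect discussion from the findings to the weights.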

Separately, Wright says that MyDD plans to make the raw data available for downloading within about a month, so that those interested in doing their own tabulations or analysis can have at it. 

Finally, let me close with some general comments about partisan polls.  In my last post on the MyDD poll, I wrote that "polls sponsored by partisan groups are nothing new."  Regular MP reader Robert Chung (creator of this worthy page on survey house effects) asked in a comment whether I meant that "sponsorship means the results are not reliable?"  That was not my meaning, but he raises a good question nonetheless: how reliable are partisan-sponsored polls?

Although most of the polls reported on in mainstream media are sponsored by the media outlets themselves, political parties, candidates and interest groups also routinely conduct their own surveys.  Readers should remember that MP earns his living conducting polls for Democratic political candidates and, as such, may not be the most impartial source on this question.  My perspective, however, is that my "partisan" clients expect, first and foremost, accurate and reliable results, so our professional duty is to get the numbers right, without regard to their potential propaganda value.  In any event, the data from the internal polls that drive campaign strategy rarely see the light of day.    

However, as many of you know, partisan pollsters do sometimes release survey results selectively, when those results cast their clients in a favorable light.  Academics who analyze public polls systematically (such as this recently published study, and also this paper) find that releases from partisan pollsters show a consistent bias.  That is, publicly released pre-election polls from Democratic pollsters tend to be a few points more favorable to Democrats, and polls from Republican pollsters tend to be a few points better for Republicans.  Pollsters debate the reasons for this pattern, but most believe "selection bias" explains a lot of it.  Consider it this way:  random sampling error means that all polls have a range of variation.  If partisans release only those results that fall on the favorable half of the bell curve (from their perspective), the released results will show a consistent bias even though every individual poll was conducted honestly.  Either way, data consumers should take public releases of partisan results with the appropriate grain of salt. 
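To make that selection-bias mechanism concrete, here is a small simulation sketch.  The zero true margin, the three-point sampling error, and the release rule are assumptions chosen for illustration, not figures taken from the studies linked above.

```python
import numpy as np

# Simulate many honestly conducted polls of the same race, where the true
# Democratic margin is zero and each reported margin differs only by
# random sampling error (the 3-point SD is an assumption for illustration).
rng = np.random.default_rng(1)
true_margin = 0.0
sampling_error_sd = 3.0
polls = rng.normal(true_margin, sampling_error_sd, size=10_000)

# A hypothetical partisan sponsor releases a poll only when it looks
# favorable, i.e. when the margin lands on the friendly side of the truth.
released = polls[polls > true_margin]

print(f"Average margin, all polls:      {polls.mean():+.1f}")
print(f"Average margin, released polls: {released.mean():+.1f}")
# The released-only average runs a couple of points toward the sponsor,
# even though no individual poll was manipulated.
```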

The usual pattern of cherry-picking results is one reason to be encouraged by the precedent the MyDD poll has set in putting out so much detail about the survey in advance.  They announced ahead of time that they were fielding the poll, released several drafts of the questionnaire for reader review and comment, and announced when the survey went into the field.  MP can think of no other news media or partisan poll that discloses so much in advance.  MyDD’s only hesitation was in releasing the final version of the questionnaire in advance, out of a fear that the wording would be criticized before they had a chance to release data.  MP hopes that in future surveys they will take that extra step, because it would help bolster the credibility of their claim to release all data, warts and all.

Yes, of course, MyDD is a liberal activist site, and the content of their survey reflects that world view.  Conservatives may well take issue with some of the question wording or see bias in their analysis.  That is the nature of partisan polling.  However, I hope all will appreciate the professionalism and commitment to transparency in the MyDD approach.  They are setting a good precedent for others to follow.  MP sincerely hopes a site on the conservative side of the blogosphere follows their example.   

Mark Blumenthal

Mark Blumenthal is the principal at MysteryPollster, LLC. With decades of experience in polling using traditional and innovative online methods, he is uniquely positioned to advise survey researchers, progressive organizations and candidates, and the public at large on how to adapt to polling’s ongoing reinvention. He was previously head of election polling at SurveyMonkey, senior polling editor for The Huffington Post, co-founder of Pollster.com and a long-time campaign consultant who conducted and analyzed political polls and focus groups for Democratic Party candidates.