
January 04, 2006

The Military Times Survey

The Military Times publications released a unique new survey yesterday of active duty members of the US armed services who also subscribe to their newspapers.  It shows that while morale remains high, "support for President Bush and for the war in Iraq has slipped significantly in the last year among members of the military's professional core."   Those last few words are critical.  As we will see, the Military Times readership comes disproportionately from those who have made the military their career and - as the Military Times article correctly stresses - "should not be read as representative of the military as a whole."  However, as a consistent, standardized sampling of opinion among the "professional core" of the military, it is worthy of our attention. 

The unique methodology of this survey reflects the extreme challenge of polling the active duty military population.  Needless to say, standard telephone survey methods - based on contacting respondents through wired telephone service in a household - are not much help.  The problem is a lack of what pollsters call a "sample frame," the list or map that identifies every member of the population.   There is no published listing of all active duty members of the military. So surveys of the military population are rare. 

However, the Military Times does have its own list of the names and addresses of the roughly 200,000 subscribers who receive one of its newspapers (the Army Times, Navy Times, Air Force Times, and Marine Corps Times) by mail.  Using this list, the Military Times was able to draw random samples of its subscribers and send out survey questionnaires in the mail. 

Before considering the mechanics of their mail survey, we need to consider one big issue:  Can we use the population of roughly 200,000 Military Times readers as a surrogate for the 1.4 million members of the active duty US military? The Military Times article provides some guidance: 

The results should not be read as representative of the military as a whole; the survey's respondents are on average older, more experienced, more likely to be officers and more career-oriented than the military population.

I asked some follow-up questions of Gordon Trowbridge, a senior staff writer at Military Times.  One of the biggest of these differences is the proportion of officers in the sample.  Slightly less than half the respondents (45%) reported holding the rank of officer compared to the roughly 15% to 20% of officers among all active duty military.   

How different are the attitudes of officers and enlisted personnel, and do Military Times subscribers differ from other members of the armed services in terms of their partisanship or political ideology?  We can only speculate, though it is worth noting that 56% of the sample identified as Republican and only 13% as Democrat. Similarly, 50% identified as conservative, only 7% as liberal. 

[Note:  Actually, anyone with a statistical package and a little time on their hands can determine how officers differ from enlisted personnel within this sample, because Military Times has posted the raw, respondent-level data online.  If any readers have the time and inclination to cross-tab the results by rank (officer vs. enlisted), please post a comment or email me and I'll update this post accordingly.]  [Update:  Reader MC takes up the challenge!]
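For anyone who wants to try, a crosstab of that kind takes only a few lines of pandas. To be clear, the column names and the tiny inline dataset below are invented purely for illustration; the actual layout of the posted respondent-level file will differ:

```python
# Hypothetical sketch of a rank crosstab on respondent-level survey data.
# The column names and toy values are invented; the real Military Times
# file will have its own variables and coding.
import pandas as pd

df = pd.DataFrame({
    "rank_group": ["officer", "officer", "officer",
                   "enlisted", "enlisted", "enlisted"],
    "approve_iraq": ["yes", "no", "yes", "yes", "no", "no"],
})

# Share giving each answer within each rank group (rows sum to 1)
tab = pd.crosstab(df["rank_group"], df["approve_iraq"], normalize="index")
print(tab)
```

The `normalize="index"` argument converts raw counts into within-row proportions, which is what you want when the two rank groups are different sizes.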

Now, if we can get past the conceptual hurdle of sampling from the Military Times subscriber lists, there are two more questions:  First, how did they conduct the survey?  And second, could the mail-in mode have had any effect on the results?

The survey procedures, as described by Trowbridge, are thorough and designed to maximize the response rate:

We mail out company-logo stationery containing a cover letter, the questionnaire, a return-mail envelope and a separate business-reply card. The questionnaire goes to the tabulation firm; the card comes to our office, which we use to gather names of people willing to do follow-up, on-the-record interviews. We send out the initial mailing, followed up two weeks later with a reminder postcard to those who haven't responded, and then shortly after that we re-mail the entire package to those we haven't heard from.

This year they sent out 6,000 surveys and received 1,215 back from those on active duty.  However, according to Trowbridge, Military Times believes ("based on our circulation department's research") that roughly a third of the subscriber base at any given time is not on active duty.  Since the survey aims to reach only those on active duty, we would calculate the response rate by dividing the 1,215 respondents by 4,000 (the best estimate of those on active duty).  The result is 30%, a response rate that compares favorably to the mail surveys conducted by the Columbus Dispatch (which are typically in the mid-20s) and most national public telephone surveys (a 2003 study found an average response rate of 23% on 20 national news media telephone surveys, ranging from a low of 5% to a high of 39%, using the AAPOR RR3 formula).
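A quick sketch of that arithmetic, taking the two-thirds eligibility figure (the Military Times' own circulation estimate) on faith:

```python
# Response-rate arithmetic described above: returned questionnaires
# divided by the estimated number of eligible (active duty) recipients.
surveys_mailed = 6000
returned_active = 1215
eligible_share = 2 / 3                 # circulation department's estimate

estimated_eligible = surveys_mailed * eligible_share   # about 4,000
response_rate = returned_active / estimated_eligible

print(f"{response_rate:.0%}")          # roughly 30%
```

Note that the whole calculation hinges on the eligibility estimate: if the true active duty share of subscribers were, say, three-quarters rather than two-thirds, the computed response rate would drop to 27%.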

However, the response rate alone is less important than the possibility of response bias.  In other words, were the 70% who did not return their survey different from those who did, and if so, how different were they?  That question is, of course, next to impossible to answer or quantify, since, as usual, we know nothing about the non-respondents.  However, the Military Times article includes this bit of information, which should at least serve as a caution:

As in the previous two years, Military Times Poll respondents were reluctant to express opinions, even anonymously, about the commander in chief or his policies. About one in five refused to say whether they approved of the president's performance on Iraq or overall.

"That's my boss," Army Lt. Col. Earnestine Beatty said in a follow-up interview. "I can't comment." Kohn said he worried that asking such questions of military members and publishing the results could tarnish the military's image as a nonpartisan institution.

Whether any such reluctance affected the response rate or created any sort of response bias is, again, something we can only speculate about.  However, Trowbridge points out that among respondents, the reluctance to answer questions was evident mostly on questions about George Bush, Congress or the military brass.  On other questions, such as those about morale, the "don't know" percentages were in the single digits.
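One way to see why that reluctance matters is to put worst-case bounds on an answered question. Suppose, purely hypothetically, that 60% of those who answered a presidential-approval item said they approve, while one in five refused to answer (the 60% figure below is a placeholder, not the poll's actual result):

```python
# Worst-case bounds on true approval under item nonresponse: the low
# bound assumes every refuser disapproves, the high bound assumes every
# refuser approves. The 60% input is a hypothetical placeholder.
def nonresponse_bounds(approve_among_answered, refusal_rate):
    answered_share = 1.0 - refusal_rate
    low = approve_among_answered * answered_share
    high = low + refusal_rate
    return low, high

low, high = nonresponse_bounds(0.60, 0.20)
print(f"true approval could lie anywhere from {low:.0%} to {high:.0%}")
```

With one in five refusing, even this crude exercise spreads the plausible range across twenty percentage points, which is why the refusal rate on these items deserves nearly as much attention as the topline numbers.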

One last issue to consider:  Even if the Military Times subscriber base is not an ideal proxy for all active duty military personnel, the poll did sample exactly the same universe in exactly the same way, using a largely consistent set of questions, three years in a row.  Moreover, if we assume that the proportion of subscribers on active duty has remained a constant 66% over the last three years, the response rate has also been reasonably consistent:  31% in 2003, 35% in 2004 and 30% in 2005.  The trends they uncovered (discussed at length in the article) should be "real" and worthy of our attention. 

So, to sum up:  The use of the Military Times subscriber list as a sample frame gets us as close to a random sampling of active duty military personnel as we are likely to get.  However, it is best to think of the poll as a consistent three-year sampling of "the military's professional core" (as the Times' lead puts it) rather than of all the men and women serving on active duty.

[typos corrected]

UPDATE (12-05):  See the helpful comments below and the update based on a reader's crosstabs of the Military Times data.

Related Entries - Polls in the News

Posted by Mark Blumenthal on January 4, 2006 at 05:16 PM in Polls in the News | Permalink


Needs a lot more cross tabs. Army and Marines are 58% of respondents but those services are just about the only ones impacted by Iraq. 39% of respondents have not been deployed to Iraq/Afghanistan.

Posted by: jimbo | Jan 5, 2006 5:40:23 AM

I served 8 years as an enlisted man in the Navy (back in the 1980s). I think this survey is interesting, but not a valid cross section of the military. First, it is heavily weighted by officers (which doesn't surprise me). Officers are, by definition, college educated, while very few enlisted personnel have a degree (on the occasion that an enlisted person gets a college degree, they are often promoted into the officer ranks). So the survey will have a significantly higher percentage of college educated respondents than is average in the military. In addition, by their own assessment, their list consists of "the military's professional core". I read this to mean "mostly officers and senior enlisted people, and mostly people who serve more than one enlistment". So they are therefore missing a huge chunk of junior enlisted personnel serving in their first enlistment--the very people who make up the bulk of the actual front line combat troops. The survey, then, misses most of the troops actually getting killed in any significant numbers on the front lines.

Also worth considering is the issue of reserve and national guard troops. Are they even included in this survey at all? The Military Times does not appear to say what percentage, if any, of its readership is from the reserve or guard. This is the first war in which large numbers of reserve and guard soldiers are being activated for long periods of time. Anecdotally, many of them I know are pretty steamed over this. They strongly feel that this is not what they signed up for; the generally accepted presumption is that they will be activated in times of national emergency for short periods of time, not be used to fight a long and protracted war because there isn't a large enough standing army for the job. Many are pretty bitter about it (admittedly this is just people I know, and may not represent a cross section of reserve and guard).

So while this may be the best survey that is possible for active duty military, it should by no means be considered a good cross section. If their polling methods and questions are consistent, the change in trend over 3 years is probably the more interesting point, not the actual numbers.

Posted by: ex-Navy in SEA | Jan 5, 2006 4:35:51 PM

The comments to this entry are closed.