
Assessing and adjusting for non-response in the Millennium Cohort Family Study



In conducting population-based surveys, it is important to thoroughly examine and adjust for potential non-response bias to improve the representativeness of the sample prior to conducting analyses of the data and reporting findings. This paper examines factors contributing to second stage survey non-response during the baseline data collection for the Millennium Cohort Family Study, a large longitudinal study of US service members and their spouses from all branches of the military.


Multivariate logistic regression analysis was used to develop a comprehensive response propensity model.


Results showed the majority of service member sociodemographic, military, and administrative variables were significantly associated with non-response, along with various health behaviours, mental health indices, and financial and social issues. However, effects were quite small for many factors, with a few demographic and survey administrative variables accounting for the most substantial variance.


The Millennium Cohort Family Study was impacted by a number of non-response factors that commonly affect survey research. In particular, recruitment of young, male, and minority populations, as well as junior ranking personnel, was challenging. Despite this, our results suggest the success of representative population sampling can be effectively augmented through targeted oversampling and recruitment, as well as a comprehensive survey weighting strategy.



Military families and communities serve as the cornerstone of support for US service members and may themselves be heavily impacted by military experiences including deployment [1–6]. The US Administration, the Department of Defense, and the Institute of Medicine have identified military family research as a national priority, encouraging studies that examine the unique risk and resilience factors among military families and assess the resources they need in order to promote familial adjustment and optimal health outcomes [7–9]. While the literature in this area has progressed substantially in the last decade [10–12], there is much to learn about the functioning of military families, the influence of family interactions on service member recovery and resiliency, and the relationship of military experiences, such as deployment, with family well-being. Population-based survey research is uniquely positioned to address these critical research questions and inform policy decisions to support service members and their families; however, very few longitudinal studies have been conducted with representative samples of military families. In recent years, several large-scale studies have been implemented among active duty military families and spouses from all US service branches including the Deployment Life Study [13], the Millennium Cohort Family Study (Family Study) [14], the Intergenerational Impact of War Study [15], the Military Family Life Project [16, 17], and the Survey of Active-Duty Spouses [18]. However, several of these studies excluded important subgroups (e.g., dual military, reserve couples), which limits the generalizability of their findings, and only three (Deployment Life Study, Military Family Life Project, and Family Study) are positioned to address questions regarding the trajectory of experiences for military families due to their longitudinal design.

The Family Study, a dyadic longitudinal survey of married spouses and military members with two to five years of service, addresses several of these gaps. The study aims to examine the well-being of military spouses and children in a probability-based longitudinal sample of service members from all branches and components of the military. As a result, it provides a unique empirical resource for addressing critical scientific, operational, and policy questions, and for informing the development of interventions promoting resilience among service members and their families. In addition to its robust design as a prospective, longitudinal survey of a probability-based sample across military branches, the Family Study addresses important gaps in the literature by including under-studied subsamples, such as Reserve and National Guard families, dual military couples, and male spouses. The study further plans to follow families over 21 years, a far longer follow-up period than previously attempted. Only married couples of opposite sex were included in the study; thus results may not generalize to lesbian, gay, bisexual, and transgender spouses or to single-parent households. The Family Study is an adjunct to the Millennium Cohort Study, which is the first US population-based prospective study to investigate long-term health effects of military service among active duty military. The Millennium Cohort Study was launched in 2001 and has enrolled four panels of service members; the first wave of enrollment for military spouses in the Family Study occurred concurrently with recruitment for the fourth panel (Panel 4; 2011–2013). The Family Study baseline employed a nested design in which spouses of service members who completed the Millennium Cohort survey were subsequently invited to participate in the study.

Unit non-response is an inevitable feature of population-based surveys, and it is important to thoroughly examine and adjust for potential non-response bias to improve the representativeness of the sample prior to conducting analyses of the data and reporting findings [19, 20]. The literature on non-response in population-based surveys indicates the following sociodemographic characteristics are often associated with greater survey response in the civilian population: employment [21, 22], middle versus younger age [21, 23], female gender [23], higher socioeconomic status/income [24, 25], and higher levels of formal education [26, 27]. Surveys of the military population generally reveal a similar pattern of greater response associated with these sociodemographic variables [28–30], as well as military-specific variables, such as active versus reserve duty status, officer versus enlisted status, and senior versus junior pay grade/rank [13, 27]. However, very little is known about the response patterns and behaviors of military spouses and family members. Although several spousal survey studies adjust for non-response [13, 16, 31], very few have provided any detailed discussion on the factors associated with spousal non-response. Findings suggest that a higher percentage of response is likely from officer-headed households across services [13]. In longitudinal surveys, participation in early data collection waves and fewer missing items are also associated with future participation [22, 32]; therefore, study participation is a predictor of future response. Less is known regarding the influence of psychosocial factors on non-response in the military population, as researchers typically do not have such measures on non-responders or on a proxy for the non-responder (e.g., family member). Some authors, however, suggest that non-respondents may generally be less healthy [25, 33, 34], have more substance use disorders [25, 34], or have a history of psychiatric conditions [27].

This current study examined factors contributing to second stage survey non-response during the baseline data collection for the Millennium Cohort Family Study conducted from 2011 to 2013. Due to its nested design within the larger Millennium Cohort Study, the Family Study offers a unique opportunity to thoroughly examine and adjust for non-response bias among military spouses by analysing extensive data collected from their service member partners on the Millennium Cohort survey, including sociodemographic and psychosocial characteristics. This study contributes to the interpretation and use of the Family Study data by describing the sample and examining and addressing systematic non-response in the baseline sample. The study provides insights to inform future study designs and recruitment practices involving military spouses.


Study design

This study was an examination of second stage non-response bias to the baseline wave of a population-based, longitudinal survey of spouses of US active duty military conducted in 2012. We also present weighted population estimates of the sociodemographic characteristics of survey respondents, accounting for both sampling design and two stages of dyadic non-response.

Study population, data, and procedures

Married spouses of participants in the Millennium Cohort Panel 4 were invited to participate in the study. This panel included only military members with two to five years of service randomly sampled across service branches and components. Female and married service members were oversampled to ensure adequate representation in the Family Study, particularly male spouses of female service members. Response rates for the Millennium Cohort have been described elsewhere (Williams C, Battaglia MB, Corry NH, McMaster H, Stander V. Millennium Cohort Family Study Weighting Analyses Overview document, 2016). Following completion of the Panel 4 survey, married service members were given the option to refer their spouse to the Family Study and provide their contact information, although this was not a requirement for spousal participation. Spouses who were referred by the service member were recruited via email and postal mail, while non-referred spouses were contacted by postal mail only. In order to successfully engage spouses, additional enrollment strategies included systematic variation in the style of recruitment solicitations, minimal ($5–$10) pre- and post-incentives, as well as both online and paper mail survey response options. The Family Study methods are described in more detail elsewhere [14, 35].

For the 28,603 service members who completed the Millennium Cohort survey and were eligible for the Family Study survey, we had extensive self-report data on which to model spousal non-response. Therefore, we utilized both administrative and service member self-report data to assess non-response bias among spouses. The study was overseen and approved by the Naval Health Research Center’s Institutional Review Board (Protocol 2000.0007) and the Office of Management and Budget (approval number 0720–0029). Informed consent was obtained for all participants.


Survey administration

Service members' study participation status was determined by the percentage of “base” items completed on the Millennium Cohort survey (i.e., items that were asked of all participants); a participant was designated as a completer if >80% of the base items had responses and as a partial completer if ≤80% of the base items had responses. All Millennium Cohort respondents were categorized into one of two recruitment groups: those who referred their spouse for the Family Study and those who either refused to refer their spouse or submitted the survey without responding to the referral item.
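The completion-status rule above can be sketched in a few lines. This is an illustrative implementation only; the function and argument names are our own, not the study's code, and we assume missing responses are represented by `None`.

```python
# Hypothetical sketch of the completer / partial-completer rule described
# above: a participant is a completer if more than 80% of the "base" items
# (those asked of all participants) have responses.
def completion_status(responses, n_base_items, threshold=0.80):
    """Classify a participant from their base-item responses."""
    answered = sum(r is not None for r in responses)
    if answered / n_base_items > threshold:
        return "completer"
    return "partial completer"
```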

Sociodemographic data

Sociodemographic and military data were obtained from service member military records and included gender, birth date, race/ethnicity, education, branch of service, service component, military pay grade, military occupation, deployment, and number of dependents.

Mental, physical, and social well-being

The Millennium Cohort survey solicited service member self-report data on a variety of topics, including medical conditions, psychosocial well-being, substance use, and military-specific and occupational exposures. Major depression was assessed by the Patient Health Questionnaire (PHQ) [36] and mental and physical health component scores were derived from the Veterans RAND 36 Item Health Survey (VR-36) [37, 38]. Stressful life events were assessed by the modified Holmes and Rahe scale [39, 40]. Posttraumatic stress disorder (PTSD) was assessed by the PTSD Checklist-Civilian Version [41]. Risk and resilience indicators were also assessed, including posttraumatic growth, from a modified Post Traumatic Growth Inventory [42], and self-mastery, by items from the Pearlin–Schooler Mastery Scale [43]. Health conditions and behaviours were assessed by the Insomnia Severity Index [44], CAGE questionnaire [45], and selected items from the National Health and Nutrition Examination Survey [46] and the National Health Interview Survey [47].

The Family Study survey consisted of approximately 100 items spanning spousal physical and mental health, reports on children's adjustment, and family functioning. Many of the measures in the spousal survey were identical to those administered in the Millennium Cohort service member survey, so that outcomes could be compared and examined for the spousal dyad. More information about the Family Study survey instruments is available elsewhere [14]. For most of the analyses presented here, the key measure was simply whether or not the spouse completed the Family Study survey (i.e., responded to at least one survey item).

Statistical analyses

Modeling family study non-response

We conducted bivariate analyses including Chi-Square tests and bivariate logistic regressions to identify key service member characteristics associated with spousal response. Subsequently, we used multivariate logistic regression analysis to develop a comprehensive response propensity model. Because methodological and background variables were most strongly associated with non-response and are more generally available in studies of non-response, the first multivariate logistic regression model included all sociodemographic, military, and administrative variables. In the second model, other service member characteristics that were bivariately associated with non-response were offered for stepwise addition (P < 0.05) to the first model. In order to include all Family Study eligible Millennium Cohort respondents (N = 28,603) in modeling non-response, missing cases were assigned the modal response for items with a very small amount of missing data (e.g., education). For other items, a “missing” category was created as one of the response levels for the analyses.
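The missing-data handling described above (modal imputation for items with very little missing data, an explicit "missing" category otherwise) can be sketched as follows. The function name, the `None` convention for missing values, and the 1% cutoff are illustrative assumptions; the paper does not specify its exact threshold.

```python
# A minimal sketch of the missing-data strategy described above: items
# with a very small amount of missing data get the modal observed
# response; other items get an explicit "missing" response level.
from collections import Counter

def prepare_item(values, max_modal_missing=0.01):
    """Recode None entries in a list of item responses."""
    n_missing = sum(v is None for v in values)
    if n_missing == 0:
        return list(values)
    if n_missing / len(values) <= max_modal_missing:
        # Very little missing data: assign the modal observed response.
        mode = Counter(v for v in values if v is not None).most_common(1)[0][0]
        return [mode if v is None else v for v in values]
    # Otherwise treat "missing" as its own category in the analysis.
    return ["missing" if v is None else v for v in values]
```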

Testing the non-response models

We used several techniques to evaluate the model, since one of the purposes for developing a comprehensive model of spousal non-response was to adjust the survey weights for non-response bias. First, we computed the area under the receiver operating characteristic curve (C-statistic), which provided an estimate of the overall model fit. We also compared the distribution of the predicted probabilities of response derived from the models to ensure there was a broad distribution of response propensity. Second, in developing the propensity model, we held out several key service member variables that were strongly correlated with important Family Study outcomes; these variables could then be used as reasonable proxies in validating the effectiveness of the model to adjust for response bias in important outcomes [48]. The ideal response propensity model is predictive of non-response and key survey outcome variables (i.e., the model can account for non-response bias in important outcomes). Finally, to ensure our non-response model was comprehensive, we assessed whether adding the previously held out service member variables to the second model would have contributed substantively to its predictive power.
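The C-statistic used here is equivalent to the probability that a randomly chosen responder receives a higher predicted response propensity than a randomly chosen non-responder. A small self-contained sketch (our own function name, not the study's code) makes the definition concrete; the O(n²) pairwise loop is for clarity, not efficiency.

```python
# Illustrative computation of the C-statistic (area under the ROC curve):
# the fraction of responder/non-responder pairs in which the responder
# has the higher predicted propensity, counting ties as one half.
def c_statistic(y_true, y_score):
    pos = [s for y, s in zip(y_true, y_score) if y == 1]  # responders
    neg = [s for y, s in zip(y_true, y_score) if y == 0]  # non-responders
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A model that perfectly separates responders from non-responders scores 1.0; a model no better than chance scores 0.5, which is why values near 0.76 are read as acceptable discrimination.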

Developing family study weights

In developing survey weights, we first accounted for the Millennium Cohort design features and non-response bias because the Family Study was nested within the Millennium Cohort Study. We created sample design weights for the stratified (i.e., by gender and marital status) sampling frame (n = 250,000) to generalize to the population of military personnel with two to five years of service (n = 573,437). We then adjusted the sample design weights for Millennium Cohort non-response using raking ratio estimation [49] so the marginal totals of the adjusted weights matched those for the population [50]. The available service member data for this raking process was limited to demographic and service characteristics documented in military records and included gender, age, race/ethnicity, pay grade, service branch, and military component (active duty vs. Reserve/National Guard). The Millennium Cohort weight (MilCo weight) could be applied to analyses involving Panel 4 participants eligible for the Family Study (N = 28,603) and was used for all non-response analyses presented in this paper.
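Raking ratio estimation, as used above, iteratively ratio-adjusts the weights one margin at a time until every margin's weighted totals match the known population totals. The sketch below is a generic illustration under our own data layout (dicts mapping unit index to category level), not the study's implementation.

```python
# Minimal raking (iterative proportional fitting) sketch. Each pass
# ratio-adjusts weights so one margin (e.g., gender, then pay grade)
# matches known population totals; passes repeat until convergence.
def rake(weights, margins, pop_totals, n_iter=50):
    """margins: list of dicts, unit index -> category level.
    pop_totals: parallel list of dicts, level -> population total."""
    w = list(weights)
    for _ in range(n_iter):
        for margin, totals in zip(margins, pop_totals):
            # Current weighted total within each level of this margin.
            cur = {}
            for i, wi in enumerate(w):
                cur[margin[i]] = cur.get(margin[i], 0.0) + wi
            # Ratio-adjust every unit toward the population total.
            w = [wi * totals[margin[i]] / cur[margin[i]]
                 for i, wi in enumerate(w)]
    return w
```

Because only marginal totals are matched, raking needs just the population margins for gender, age, race/ethnicity, pay grade, service branch, and component, not their full cross-classification.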

In the next stage of Family Study weight development, we used the estimated spousal response propensities derived from the logistic regression model described above to directly adjust the weights, generating weights adjusted for both Millennium Cohort and Family Study non-response [51]. As a final step, we raked the weights again to known population totals for gender, race/ethnicity, age, pay grade, and service branch, as well as the Family Study response propensity quintile, and trimmed them to reduce weight variability without sacrificing the approximation to those population totals [52]. The resulting weight can be applied to statistical analysis of the 9872 dyads that participated in the Family Study. We evaluated the effectiveness of the weights in reducing non-response bias by comparing the prevalence of sociodemographic characteristics estimated in three ways: unweighted, applying the MilCo weight, and applying the final Family Study weight.
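A direct propensity adjustment of the kind described above is commonly implemented by dividing each responding unit's base weight by its estimated response propensity, so that responders stand in for similar non-responders. A one-line sketch under that assumption (the names are illustrative):

```python
# Hypothetical direct propensity adjustment: inflate each responding
# dyad's weight by the inverse of its predicted probability of response.
def propensity_adjust(base_weights, response_propensities):
    return [w / p for w, p in zip(base_weights, response_propensities)]
```

For example, a responder with a predicted response propensity of 0.25 ends up representing four times its design-weight share of the population.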


Baseline characteristics and study participation

A total of 9872 spouses responded to the Family Study survey, out of 28,603 eligible service member respondents, for an overall response rate of 34.5% (34.3% using the MilCo weight). For all eligible Millennium Cohort service members, Table 1 presents the prevalence of each of the demographic, administrative, military, and individual adjustment characteristics included in our non-response model, as well as percentages of spouses responding to the Family Study survey by subgroup. Table 2 further lists Millennium Cohort participant characteristics that were found to have a statistically significant relationship with spousal response in bivariate analyses, but were not included in the final model.

Table 1 Characteristics of Millennium Cohort participants and Family Survey response – included in final response propensity model
Table 2 Measures examined for association with Family Study response – not included in final response propensity model

The majority of service member sociodemographic, military, and administrative variables were significantly associated with greater spousal response, including male gender, older age, white/non-Hispanic ethnicity, higher education, an income of $25,000–$74,999, having dependent children, serving in the Reserve/National Guard versus active duty, having been deployed with known combat status, completing the full Millennium Cohort survey, and referring the spouse to the Family Study. Service member health behaviours such as smoking, low or high sleep duration (vs. intermediate), greater caffeine intake, and sedentary activities, as well as poorer overall physical health indices, were associated with lower survey response. Positive mental health indices were associated with greater likelihood of spousal response, including the absence of major depression, PTSD symptoms, and medication use, as well as greater posttraumatic growth. Fewer financial and social problems, including fewer difficulties with partner and stressful life events, were also related to greater spousal response.

Although many of these comparisons were statistically significant, the most substantive differences were observed for a fairly small number of demographic and administrative variables. For example, the single strongest predictor of spousal response was whether the service member referred him or her for participation, with 64.5% of referred spouses participating compared to only 22% of non-referred spouses. Other factors associated with at least a 50% difference in spousal response (Table 1) included gender, with spouses of male service members more than twice as likely to respond as spouses of female service members (37.1% vs. 16.6%); race/ethnicity (with spouses of minority service members much less likely to respond than those of white, non-Hispanic service members); education (with spousal response rates increasing steadily from 19.9% for service members not completing high school to 46.7% for those with an advanced degree); and income (with those in middle income groups much more likely to respond than those in the highest income group). Notably, for several service member measures, spousal response was considerably lower for those with missing data on the given measure; there was less variation among the non-missing categories. Further, missingness was positively correlated across variables, as evidenced by the much greater spousal response for service members completing the entire survey (35.1%) compared with partial completers (20.8%).

Family Study non-response models

Table 3 presents results from the weighted logistic regression models. In the first model, only the service member sociodemographic, military, and administrative variables were included. The second model shows other characteristics that were selected in the subsequent stepwise regression because they were significantly associated with response, above and beyond the variance accounted for by the sociodemographic, administrative, and military variables. These included physical health indices (i.e., amount of sleep, activity level, body mass index, work days missed due to illness or injury), overall mental health, social isolation, financial and social distress, attitude toward military service, and number of recent stressful life events. As in the unadjusted analyses, spouse referral (adjusted odds ratio (AOR) 6.53, 95% confidence interval (CI) 6.16–6.93), female versus male service member (AOR 0.33, 95% CI 0.31–0.37), and minority race/ethnicity (AOR 0.55, 95% CI 0.50–0.61 for black compared with white non-Hispanic service members) were the strongest correlates of spousal response.

Table 3 Multiple logistic regression for participation in the Millennium Cohort Family Study

Model testing and calculation of final family study weights

The C-statistics for both models (0.764 and 0.768, respectively) indicated acceptable discrimination [53]. The overall mean predicted probability of response for the models was 0.343. For both models, the mean predicted probability of response was markedly higher for responders (0.482–0.485) than for non-responders (0.269–0.270). To choose the optimal non-response model from which to derive the response propensity adjustment for the Family Study weights, we estimated additional logistic regression models predicting each of the selected primary service member measures withheld from the response propensity models (i.e., PHQ potential alcohol dependence, current smoker, former smoker, PHQ major depression, VR-36 physical component, PTSD symptoms using specific criteria, and difficulties with partner), using the same sets of independent variables as the two non-response models. These measures were chosen because they paralleled family member measures considered key outcomes for the Family Study, and each service member measure (proxy) was at least modestly correlated with the corresponding family member measure (correlations ranged from 0.109 to 0.492 among responders). For most of the dependent variables, there was a marked improvement in fit between Model 1 and Model 2 as evaluated by the C-statistic, whose values ranged from 0.629 to 0.958 across these models. In particular, for major depression, poor physical health, PTSD symptoms, and partner difficulties, the C-statistic improved by more than 20%. For example, for major depression, the C-statistics for Models 1 and 2 were 0.743 and 0.958, respectively. There was little improvement in fit for smoking status (for current smoking, the C-statistic went from 0.730 to 0.746 between Model 1 and Model 2; for former smoking, the C-statistics were 0.629 and 0.634, respectively).
Based on these analyses and the higher C-statistic for Model 2, we selected this model to estimate response propensity for adjusting the Family Study weights. Adding the withheld service member outcome measures did not substantively improve the model fit.

The MilCo weights were directly adjusted using the predicted probabilities from Model 2. Next, we again applied raking ratio estimation to rebalance the weights to the known population totals. As a final step, the weights were trimmed to eliminate extreme weights and mitigate the variance inflation associated with applying them. Trimming reduced the largest weight by 41% (270.2 to 159.4) and increased the lowest weight by 153% (1.75 to 4.44). After raking and trimming, weighted estimates for all control margins remained very close to population estimates. As expected, the coefficient of variation (CV) of the weights increased with each successive adjustment, reflecting the increase in variance that is often associated with non-response bias reduction. For Family Study responders, the CV was 0.197 for the design weight, 0.487 for the weight adjusted for MilCo non-response, and 0.973 for the final weight adjusted for Family Study non-response.
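The trimming and CV diagnostics above can be sketched as follows. This is a generic illustration: the bounds are arguments we invented, the study's actual trimming rules may differ, and the rescaling step (preserving the weighted total) is one common convention, not necessarily the one used here.

```python
# Hypothetical weight trimming: cap weights at chosen lower/upper bounds,
# then rescale so the overall weighted total is preserved.
def trim_weights(weights, lo, hi):
    trimmed = [min(max(w, lo), hi) for w in weights]
    scale = sum(weights) / sum(trimmed)  # keep the estimated total fixed
    return [w * scale for w in trimmed]

# Coefficient of variation of the weights: the diagnostic reported above
# for tracking variance inflation across successive adjustments.
def coefficient_of_variation(weights):
    mean = sum(weights) / len(weights)
    var = sum((w - mean) ** 2 for w in weights) / len(weights)
    return var ** 0.5 / mean
```

Trimming trades a small amount of bias (weights no longer exactly match the propensity adjustment) for a reduction in the variance inflation that extreme weights would otherwise cause.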

Sociodemographic estimates and impact of weights

Table 4 describes sociodemographic characteristics of the Family Study respondents estimated in three ways: unweighted, applying the MilCo weight, and applying the final Family Study weight. We also present the population estimates derived from military records for the entire sampling frame.

Table 4 Weighted and unweighted estimates for Family Study respondent characteristics (n = 9872)

The Family Study sample included mostly white females, over 90% of whom were under the age of 34. The majority of spouses were partnered with enlisted personnel versus warrant/commissioned officers (91.1% and 9.0%, respectively), and the most commonly represented branches were the Army (50.8%), Air Force (17.4%), and Navy (17.1%). Most spouses were married to an active duty service member (78.9%), and the vast majority had experienced at least one combat-related deployment separation (80.9%) at the time of the survey. Nearly 20% of spouses were currently serving or had served in the military.

The difference between unweighted estimates and those weighted with the MilCo weight showed the effect of stratified sampling and Millennium Cohort non-response, while the comparison of unweighted data and those weighted with the Family Study weights showed the combined effects of study design and both stages of non-response. In all cases, final weighted estimates closely mirrored the population estimates, which showed the weighting achieved the intended result. For some measures, the combined effects of sampling design and non-response produced unweighted estimates that were very close to the final weighted estimates (as well as to the population); this was particularly true for gender. Although gender was associated with response at both stages, the combination of non-response bias, along with the oversampling of female service members, resulted in unweighted estimates being very close to those in the targeted population. Similarly, the prevalence of spouses of active duty and Reserve/National Guard service members in the sample was quite similar to their representation in the population. Differences were more striking for age, race/ethnicity, and pay grade, where the unweighted estimates differed quite substantially from the weighted and population values. For example, non-Hispanic blacks were present in the Family Study sample at less than half their prevalence in the population, while those aged 35 and older were nearly twice as prevalent in the sample as in the population. For service branch and component, the effects were intermediate. The standard errors of the estimates generally did not increase dramatically with the adjustment for second-stage non-response.


This study examined correlates of second stage non-response at the time of baseline data collection for the Millennium Cohort Family Study in order to provide important context for future study results and to establish a foundation for non-response adjustment. A total of 9872 spouses participated in the Family Study for an overall response rate of 34.5% (34.3% using the MilCo weight). The probability sample of military spouses included spouses of Reserve/National Guard and dual-military couples, as well as an oversampling of male spouses of service members, all of whom are often under-represented subgroups in military family research. Consistent with the objectives of the Family Study, the majority of couples had experienced at least one combat-deployed separation.

The results of our initial unadjusted non-response analyses showed the majority of service member sociodemographic, military, and administrative variables were significantly associated with greater spousal response, along with various health behaviours, mental health indices, and financial and social issues. Similar to other survey research studies, we found that sociodemographic characteristics and survey administrative factors were most strongly associated with response propensity. In multivariate analyses, the most important correlates of spousal response were service member gender, age, race/ethnicity, education, income, number of dependent children, service branch, and Millennium Cohort completion status. The single strongest predictor of spousal response was whether the spouse was referred by the service member for participation. Finally, as is commonly the case in health survey research studies, we found that Family Study respondents were slightly better adjusted, with fewer physical and mental health symptoms, than non-respondents. However, effects associated with health and well-being were small.

The Family Study developed weights to adjust for non-response using a propensity modeling and raking strategy. In support of this, one of the important objectives of this analysis was to choose the optimal logistic regression model from which to derive the response propensity adjustment. We made this selection by applying two alternative models to predict several key service member measures that had been withheld from the models for purposes of validation. For most of these dependent variables, there was a substantial improvement in fit between Model 1 (sociodemographic/administrative variables only) and Model 2 (sociodemographic/administrative, physical, mental health, and stressful life event variables); thus Model 2 was ultimately selected. This ensured the variables in the response propensity model met the two critical criteria for reducing non-response bias: association both with the likelihood of response and with the study outcomes of greatest interest. It is also important to note that adding these reserved study outcomes to the response propensity model as a final check did not appreciably improve the fit.

Differential non-response occurs in a military survey when the response rate varies for sample subgroups, typically defined by demographic and service member characteristics. This can often be corrected by making non-response adjustments. However, non-response adjustments can only account for factors that have been measured in administrative records or in the course of the study, and most spousal surveys have limited information on the dyad to assess non-response. The Family Study is uniquely positioned to offer insights into spousal response because there is a wealth of information from the service member to utilize for both spousal responders and non-responders, including physical and mental health indices. It is important to note that despite the inclusion of a multitude of variables in the non-response models in the current study, sociodemographic factors alone performed nearly as well in predicting non-response. Further, although the breadth of variables examined in this study exceeds the typically narrow set of sociodemographic characteristics used to assess non-response in military studies, additional military-related variables could be of interest in future studies, such as whether the family lives on base and time at the current duty station.

Our results suggest future studies should combine targeted oversampling with comprehensive response bias adjustment strategies to address non-response in population subgroups that are difficult to engage, and should assess sociodemographic factors to help examine and adjust for non-response. For example, since male spouses are a group known to be difficult to recruit, the Millennium Cohort Panel 4 study design oversampled married and female service members in order to better facilitate Family Study enrollment. Due to this oversampling, the final proportion of male spouses in the sample was almost the same as their proportion in the actual population of young spouses, despite a substantially lower response rate among males. Ultimately, the combination of oversampling and comprehensive non-response adjustment techniques used in the Family Study provided the opportunity to represent the subgroup of military couples involving a female service member and male spouse more comprehensively than any previous research effort. This type of combined approach would be helpful in future military family research, not only for male spouses but also for spouses of service members who are enlisted, younger (17–24 years old), or black or Hispanic; these groups are less likely to respond and may require additional or enhanced recruitment strategies.
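The interplay between oversampling and differential response rates can be illustrated with back-of-the-envelope arithmetic. The population counts and response rates below are invented for the illustration and are not the study's actual figures: if a subgroup responds at half the reference rate, sampling it at twice its base rate restores its expected share among responders.

```python
def oversample_factor(rr_subgroup, rr_reference):
    """Factor by which to inflate a subgroup's sampling rate so that its
    expected share among responders matches its share of the population,
    given its lower anticipated response rate."""
    return rr_reference / rr_subgroup


# Illustrative population: 10% male spouses, with male spouses
# responding at half the female rate.
n_male, n_female = 1000, 9000
rr_male, rr_female = 0.20, 0.40

factor = oversample_factor(rr_male, rr_female)      # sample males at 2x base rate
expected_male = n_male * factor * rr_male           # expected male responders
expected_female = n_female * rr_female              # expected female responders
male_share = expected_male / (expected_male + expected_female)
```

With these invented numbers, the male share among responders equals the 10% population share despite the lower male response rate, which is the logic behind oversampling married and female service members in Panel 4.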

In addition to oversampling, the Family Study designed a range of targeted recruitment materials to engage participants who may be difficult to enroll. A good match between survey topic and participant characteristics has been found to influence response in prior research [54]; therefore, our study team was concerned about the difficulty of recruiting spouses who did not identify strongly with the “military spouse” or “military family” community. We were particularly concerned that male spouses may not have a strong military spouse self-schema; dual-military partners also may not identify with this role. The team had further concerns that couples without children may not strongly identify with the “military family” schema. Despite the inclusion of a broad definition of military families in study recruitment materials, results indicated some spouses in the sampling frame may have been influenced by a more narrowly defined military family schema when choosing whether or not to respond. Furthermore, an examination of the Family Study service member referral process suggested Millennium Cohort participants may have been influenced by similarly narrow military family schemas in deciding whether to encourage their marital partners to engage in the Family Study (McMaster H, Stander V, O’Malley C, Williams C, Woodall K, Bauer L; unpublished observations, manuscript in development). Unfortunately, Millennium Cohort participants were given minimal information regarding the breadth of targeted participation for the Family Study in their introduction to the project. Given that referral was a strong correlate of response likelihood in this study, future dyadic research investigations may improve subgroup representation and overall participant response by enhancing both partners’ understanding of the intended recruitment population.
Future military family research should consider that some military spouses may not feel as well integrated into the military community as others, and this may influence response rates.


In summary, our results indicated the Family Study was significantly impacted by a number of non-response factors that commonly affect survey research. In particular, recruiting young, male, and minority populations as well as junior ranking personnel was challenging. Despite this, the Family Study successfully employed multiple strategies to minimize the impact of non-response bias, including a comprehensive propensity modeling approach to develop weights. Our results suggest the success of representative population sampling can be effectively augmented through targeted oversampling and recruitment, as well as more comprehensive survey weighting strategies. In military populations where more extensive information is available documenting characteristics of the total population, future studies could take advantage of available data in adjusting for non-response bias.

Ultimately, the Family Study enrolled a uniquely large dyadic cohort with extensive self-report and military archival data available for all 9872 couples. This is a substantial representation from subgroups that are difficult to engage and frequently excluded in military family research (e.g., male spouses, reservists, dual-military couples). A particular goal was to engage a study panel of junior military personnel and their families. These relatively new members of the military community are an at-risk group, and this young cohort could be followed over the course of their military careers and beyond, capturing critical life events such as divorce as well as separation from the military and associated outcomes. Further, the Family Study participants entered the military community at a time when they likely would be maximally impacted by operations in Afghanistan and Iraq. As such, this program of research presents a critical opportunity to understand the impact of deployment and military life stress on family well-being. This study demonstrates that by conducting comprehensive non-response bias analyses, accounting for a myriad of constructs potentially associated with non-response, and applying weights to adjust for those factors, the data are much more representative of the target population. Although the methodology is not novel in and of itself, this study clearly demonstrates the benefits of non-response modeling and weighting in bias minimization. Ultimately, these weighted data provide the opportunity to generalize to military spouses whose partners had two to five years of military experience as of 2011.

Currently, the study team is exploring deployment-related stressors, as well as mediating factors, influencing outcomes, such as spousal depression, substance use, and marital satisfaction. In future work, we plan to address a full spectrum of issues related to spouse and child physical and mental health, as well as marital and family adjustment. This study will be able to investigate areas that previous military family research has never examined, such as long-term outcomes for families impacted by deployment separations, and longitudinal trajectories of family adjustment over the course of career military service and beyond.



AOR: Adjusted odds ratio

CI: Confidence interval

AUC: Area under the receiver operating characteristic curve

Family Study: The Department of Defense Millennium Cohort Family Study

MilCo weight: Millennium Cohort Study weight

PHQ: Patient Health Questionnaire

PTSD: Posttraumatic stress disorder

VR-36: Veterans RAND 36-Item Health Survey


1. Denning LA, Meisnere M, Warner KE, editors. Preventing psychological disorders in service members and their families: an assessment of programs. Washington, DC: National Academies Press; 2014.

2. Eaton KM, Hoge CW, Messer SC, Whitt AA, Cabrera OA, McGurk D, et al. Prevalence of mental health problems, treatment need, and barriers to care among primary care-seeking spouses of military service members involved in Iraq and Afghanistan deployments. Mil Med. 2008;173(11):1051–6.

3. Flake EM, Davis BE, Johnson PL, Middleton LS. The psychosocial effects of deployment on military children. J Dev Behav Pediatr. 2009;30(4):271–8.

4. Lester P, Peterson K, Reeves J, Knauss L, Glover D, Mogil C, et al. The long war and parental combat deployment: effects on military children and at-home spouses. J Am Acad Child Adolesc Psychiatry. 2010;49(4):310–20.

5. MacDermid Wadsworth SM. Family risk and resilience in the context of war and terrorism. J Marriage Fam. 2010;72(3):537–56.

6. Spera C. Spouses' ability to cope with deployment and adjust to Air Force family demands: identification of risk and protective factors. Armed Forces Soc. 2009;35(2):286–306.

7. Obama B. Executive Order 13625. Improving access to mental health services for veterans, service members, and military families. Washington, DC: Office of the Press Secretary, The White House; 2012.

8. Obama B. Strengthening our military families: meeting America's commitment. Darby: DIANE Publishing; 2011.

9. United States Department of Defense. The Department of Defense plan to achieve the vision of the DoD task force on mental health: report to Congress. Washington, DC: United States Department of Defense; 2007.

10. White CJ, de Burgh HT, Fear NT, Iversen AC. The impact of deployment to Iraq or Afghanistan on military children: a review of the literature. Int Rev Psychiatry. 2011;23(2):210–7.

11. Palmer C. A theory of risk and resilience factors in military families. Mil Psychol. 2008;20(3):205–17.

12. Park N. Military children and families: strengths and challenges during peace and war. Am Psychol. 2011;66(1):65–72.

13. Tanielian T, Karney BR, Chandra A, Meadows SO; The Arroyo Center and The National Defense Research Institute (US). The Deployment Life Study: methodological overview and baseline sample description. Santa Monica: RAND Corporation; 2014.

14. Crum-Cianflone NF, Fairbank JA, Marmar CR, Schlenger W. The Millennium Cohort Family Study: a prospective evaluation of the health and well-being of military service members and their families. Int J Methods Psychiatr Res. 2014;23(3):320–30.

15. Lester P, Aralis H, Sinclair M, Kiff C, Lee K-H, Mustillo S, et al. The impact of deployment on parental, family and child adjustment in military families. Child Psychiatry Hum Dev. 2016; doi:10.1007/s10578-016-0624-9.

16. Defense Manpower Data Center. 2010 Military Family Life Project: tabulations of responses. Arlington: Human Resources Strategic Assessment Program; 2011. Accession Number: ADA609601.

17. Defense Manpower Data Center. Military Family Life Project: Active Duty Spouse Study longitudinal analyses 2010–2012 project report. Arlington: Department of Defense, Office of the Deputy Assistant Secretary of Defense for Military Community and Family Policy; 2015.

18. Defense Manpower Data Center. 2006 Survey of Active-Duty Spouses: administration, datasets, and codebook. Arlington: Defense Manpower Data Center; 2007.

19. Groves RM, Peytcheva E. The impact of nonresponse rates on nonresponse bias: a meta-analysis. Public Opin Q. 2008;72(2):167–89.

20. Groves RM, Dillman DA, Eltinge JL, Little RJA, editors. Survey nonresponse. New York: John Wiley & Sons; 2001.

21. Eagan TM, Eide GE, Gulsvik A, Bakke PS. Nonresponse in a community cohort study: predictors and consequences for exposure–disease associations. J Clin Epidemiol. 2002;55(8):775–81.

22. Pietilä A-M, Rantakallio P, Läärä E. Background factors predicting non-response in a health survey of northern Finnish young men. Scand J Soc Med. 1995;23(2):129–36.

23. Maclennan B, Kypri K, Langley J, Room R. Non-response bias in a community survey of drinking, alcohol-related experiences and public opinion on alcohol policy. Drug Alcohol Depend. 2012;126(1–2):189–94.

24. Ekholm O, Gundgaard J, Rasmussen NK, Hansen EH. The effect of health, socio-economic position, and mode of data collection on non-response in health interview surveys. Scand J Public Health. 2010;38(7):699–706.

25. Fischer EH, Dornelas EA, Goethe JW. Characteristics of people lost to attrition in psychiatric follow-up studies. J Nerv Ment Dis. 2001;189(1):49–55.

26. Dunne MP, Martin NG, Bailey JM, Heath AC, Bucholz KK, Madden PA, et al. Participation bias in a sexuality survey: psychological and behavioural characteristics of responders and non-responders. Int J Epidemiol. 1997;26(4):844–54.

27. Littman AJ, Boyko EJ, Jacobson IG, Horton J, Gackstetter GD, Smith B, et al.; the Millennium Cohort Study Team. Assessing nonresponse bias at follow-up in a large prospective cohort of relatively young and mobile military service members. BMC Med Res Methodol. 2010;10:99.

28. Fear NT, Jones M, Murphy D, Hull L, Iversen AC, Coker B, et al. What are the consequences of deployment to Iraq and Afghanistan on the mental health of the UK armed forces? A cohort study. Lancet. 2010;375(9728):1783–97.

29. Iversen A, Liddell K, Fear N, Hotopf M, Wessely S. Consent, confidentiality, and the Data Protection Act. BMJ. 2006;332(7534):165–9.

30. Iversen AC, van Staden L, Hughes JH, Browne T, Hull L, Hall J, et al. The prevalence of common mental disorders and PTSD in the UK military: using data from a clinical interview-based study. BMC Psychiatry. 2009;9:68.

31. Kulka RA, Schlenger WE, Fairbank JA, Hough RL, Jordan BK, Marmar CR, et al. Contractual report of findings from the National Vietnam Veterans Readjustment Study, volume II: table of findings. Durham: Research Triangle Institute; 1988. Accession Number: PB90164211.

32. Zunzunegui M, Beland F, Gutiérrez-Cuadra P. Loss to follow-up in a longitudinal study on aging in Spain. J Clin Epidemiol. 2001;54(5):501–10.

33. Cheung P, Schweitzer I, Yastrubetskaya O, Crowley K, Tuckwell V. Studies of aggressive behaviour in schizophrenia: is there a response bias? Med Sci Law. 1997;37(4):345–8.

34. Vanable PA, Carey MP, Carey KB, Maisto SA. Predictors of participation and attrition in a health promotion study involving psychiatric outpatients. J Consult Clin Psychol. 2002;70(2):362–8.

35. Crum-Cianflone NF. The Millennium Cohort Study: answering long-term health concerns of US military service members by integrating longitudinal survey data with Military Health System records. In: Amara J, Hendricks AM, editors. Military health care: from pre-deployment to post separation. New York: Routledge; 2013. p. 55–77. Accession Number: ADA620647.

36. Kroenke K, Spitzer RL, Williams JBW. The PHQ-9: validity of a brief depression severity measure. J Gen Intern Med. 2001;16(9):606–13.

37. Kazis LE, Lee A, Spiro A, Rogers W, Ren XS, Miller DR, et al. Measurement comparisons of the Medical Outcomes Study and Veterans SF-36® Health Survey. Health Care Financ Rev. 2004;25(4):43–58.

38. Kazis LE, Miller DR, Clark JA, Skinner KM, Lee A, Ren XS, et al. Improving the response choices on the Veterans SF-36 Health Survey role functioning scales: results from the Veterans Health Study. J Ambul Care Manage. 2004;27(3):263–80.

39. Holmes TH, Rahe RH. The social readjustment rating scale. J Psychosom Res. 1967;11(2):213–8.

40. Hobson CJ, Kamen J, Szostek J, Nethercut CM, Tiedmann JW, Wojnarowicz S. Stressful life events: a revision and update of the social readjustment rating scale. Int J Stress Manag. 1998;5(1):1–23.

41. Weathers FW, Litz BT, Herman D, Huska J, Keane T. The PTSD Checklist - civilian version (PCL-C). Boston: National Center for PTSD; 1994.

42. Tedeschi RG, Calhoun LG. The Posttraumatic Growth Inventory: measuring the positive legacy of trauma. J Trauma Stress. 1996;9(3):455–71.

43. Pearlin LI, Schooler C. The structure of coping. J Health Soc Behav. 1978;19(1):2–21.

44. Morin CM, Belleville G, Bélanger L, Ivers H. The Insomnia Severity Index: psychometric indicators to detect insomnia cases and evaluate treatment response. Sleep. 2011;34(5):601–8.

45. Ewing JA. Detecting alcoholism: the CAGE questionnaire. JAMA. 1984;252(14):1905–7.

46. Centers for Disease Control and Prevention. National Health and Nutrition Examination Survey, 1999–2000 data documentation, codebook, and frequencies. Hyattsville: US Department of Health and Human Services, Centers for Disease Control and Prevention; 2002.

47. Centers for Disease Control and Prevention, National Center for Health Statistics. National Health Interview Survey and Health Evaluation Assessment Review. Hyattsville: US Department of Health and Human Services, Centers for Disease Control and Prevention; 2001.

48. Kreuter F, Olson K. Multiple auxiliary variables in nonresponse adjustment. Sociol Methods Res. 2011;40(2):311–32.

49. Izrael D, Battaglia MP, Frankel MR. Extreme survey weight adjustment as a component of sample balancing (a.k.a. raking). In: Proceedings from the 34th Annual SAS Users Group International Conference, March 22–25, 2009, Washington, DC. 2009. p. 247.

50. Izrael D, Hoaglin DC, Battaglia MP. A SAS macro for balancing a weighted sample. In: Proceedings from the 25th Annual SAS Users Group International Conference, April 9–12, 2000, Indianapolis, IN. 2000. p. 1350–5.

51. David MH, Little R, Samuhel ME, Triest RK. Nonrandom nonresponse models based on the propensity to respond. In: Proceedings of the Business and Economic Statistics Section. Alexandria: American Statistical Association; 1983. p. 168–73.

52. Smith PJ, Rao JNK, Battaglia MP, Ezzati-Rice TM, Daniels D, Khare M; Centers for Disease Control and Prevention, National Center for Health Statistics. Compensating for provider nonresponse using response propensities to form adjustment cells: the National Immunization Survey. Vital Health Stat. 2001;2(133):1–17.

53. Hosmer DW, Lemeshow S. Introduction to the logistic regression model. In: Applied logistic regression. 2nd ed. Hoboken: John Wiley & Sons; 2000. p. 1–30.

54. Groves RM. Nonresponse rates and nonresponse bias in household surveys. Public Opin Q. 2006;70(5):646–75.



The authors express gratitude to the other contributing members of the Millennium Cohort Family Study Team from Abt Associates, including Christopher Spera, PhD, Alicia Sparks, PhD, and Mariel McLeod as well as members from the Deployment Health Research Department, Naval Health Research Center, including Cynthia LeardMann, MPH; Jackie Pflieger, PhD; Carlos Carballo, MPH; Teresa Powell, MS; Evelyn Sun, MPH; Lauren Bauer, MPH; William Lee; and Steven Speigle. The authors gratefully acknowledge the members of the Millennium Cohort Family Study Team from the Center for Child and Family Health including Robert Murphy, PhD; John Fairbank, PhD; Ernestine Briggs-King, PhD; Ellen Gerrity, PhD; and Robert Lee, MS. The authors would also like to acknowledge members of the Millennium Cohort Family Study Team from New York University, including Maria Steenkamp, PhD and Charles Marmar, MD. In addition, the authors want to express their gratitude to the Family Study participants, without whom this study would not be possible.

Disclaimer: I am a military service member (or employee of the US Government). This work was prepared as part of my official duties. Title 17, U.S.C. §105 provides that “Copyright protection under this title is not available for any work of the United States Government.” Title 17, U.S.C. §101 defines a US Government work as work prepared by a military service member or employee of the US Government as part of that person’s official duties.

Report No. 16-XX supported by the U.S. Army Medical Research and Materiel Command under work unit no. N1240. The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Department of the Navy, Department of the Army, Department of the Air Force, Department of Veterans Affairs, Department of Defense, or the U.S. Government. Approved for public release; distribution unlimited.

Human subjects participated in this study after giving their free and informed consent. This research has been conducted in compliance with all applicable federal regulations governing the protection of human subjects in research (Protocol NHRC.2015.0019).


This work was funded under Contract #W911QY-15-C-0002, supported by the Naval Health Research Center.

Availability of data and materials

The datasets analysed during the current study are not publicly available; deidentified data are available upon the establishment of a Department of Defense data use agreement.

Authors’ contributions

NC assisted in the development of the analytic plan, led the drafting of the manuscript, coordinated the study, and helped interpret findings. CW led the development of the analytic plan, analysed and interpreted the non-response and weighting data and was a major contributor in writing the manuscript, particularly the Results section. MB provided subject matter expertise on complex survey design non-response and weighting analyses to help develop the analytic plan and interpret the data and reviewed the manuscript. HM provided input on the analytic plan, helped interpret findings, and contributed to drafting manuscript sections, including the Methods and Discussion. VS provided input on the analytic plan, helped interpret findings, and provided critical feedback and revisions to the manuscript. All authors read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.

Consent for publication

Not applicable.

Ethics approval and consent to participate

The study was overseen and approved by the Naval Health Research Center’s Institutional Review Board (Protocol 2000.0007) and the Office of Management and Budget (approval number 0720–0029). Informed consent was obtained for all participants. This research has been conducted in compliance with all applicable federal regulations governing the protection of human subjects in research (Protocol NHRC.2015.0019).

Author information



Corresponding author

Correspondence to Nida H. Corry.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Corry, N.H., Williams, C.S., Battaglia, M. et al. Assessing and adjusting for non-response in the Millennium Cohort Family Study. BMC Med Res Methodol 17, 16 (2017).



  • Nonresponse bias
  • US military
  • Dyadic recruitment
  • Propensity score modeling
  • Survey weighting