  • Research article
  • Open access

How can we get Iraq- and Afghanistan-deployed US Veterans to participate in health-related research? Findings from a national focus group study

Abstract

Background

Research participant recruitment is often fraught with obstacles. Poor response rates can reduce statistical power, threaten both internal and external validity, and increase study costs and duration. Military personnel are socialized to a specific set of laws, norms, traditions, and values; their willingness to participate in research may differ from that of civilians. The aims of this study were to better understand the views of United States (US) Veterans who served in Operation Enduring Freedom (OEF)/Operation Iraqi Freedom (OIF) on research and on motivators for participating in research, to inform recruitment for a planned observational study of respiratory health in OEF/OIF Veterans.

Methods

We conducted 10 focus groups in a purposive sample of OEF/OIF Veterans (n = 89) in five US cities in 2015. Key topics included: reasons for participating or declining to participate in health-related research, logistics around study recruitment and conduct, compensation, written materials, and information sharing preferences for study results. Two authors independently coded the data using template analysis.

Results

Participants identified three criteria that motivated a decision to participate in health-related research: 1) adequate compensation, 2) desire to help other Veterans, and 3) significance and relevance of the research topic. For many, both sufficient compensation and a sense that the study would help other Veterans were critical. The importance of transparency arose as a key theme; Veterans communicated that vague language about study aims or procedures engendered distrust. Lastly, participants expressed a desire for studies to communicate results of their specific health tests, as well as overall study findings, back to research participants.

Conclusions

OEF/OIF Veterans described trust, transparent communication, and respect as essential characteristics of research in which they would be willing to participate. Additional studies are needed to determine whether our results generalize to other US Veterans; nevertheless, our results highlight precepts that have been reported as important for recruitment in other populations. Researchers may benefit from using community-engaged research methods to seek feedback on recruitment materials and strategies prior to initiating research. For costly studies targeting a large sample (i.e. in the thousands), it may be important to test a variety of recruitment strategies.


Background

Recruitment of research participants is often fraught with obstacles [1] and is one of the largest costs associated with conducting trials and observational studies [2]. Prior studies have indicated that 65–70% of trials do not meet their original sample size target, and more than half extend the length of the recruitment period to reach targets [3,4,5,6]. Failure to enroll the pre-specified sample size is likely to result in insufficient statistical power. Even when sample size goals are met, participation must be independent of the exposure and outcome under study for the research to obtain unbiased results (i.e., high internal validity). High response rates are also important in terms of generalizability (i.e., external validity). Lastly, high response rates are necessary to keep studies on time and on budget.

United States (US) military personnel are socialized to a specific set of laws, norms, traditions, and values [7, 8]. Defining characteristics include honor, bravery, and personal sacrifice, as well as commitment to duty, mission, and fellow unit members [7]. Fellow unit members often form strong bonds with each other because of their shared experiences. Authors have noted that active duty service members and Veterans constitute a distinct subculture because of these differences [9]. Furthermore, those who serve in the military are predominantly male -- though the number of women serving has substantially increased since 2001 -- and are more likely to come from the working class than the civilian population [10]. All US Veterans who were discharged from the military under any condition other than ‘dishonorable’ may qualify for Department of Veterans Affairs (VA) benefits; those who were injured during service or fall below a certain income are given the highest priority and may receive healthcare at no charge [11]. Notably, VA reserves the right to reassess disability claims and adjust ratings and benefits for conditions that are not considered permanent, such as posttraumatic stress disorder and back pain. Additionally, unlike US civilians, for whom the primary research funding agency (i.e., the National Institutes of Health) is separate from their health care provider and insurer, determination of eligibility for benefits, clinical care for Veterans, and much of the research on Veterans’ health are all housed within and funded by the US Department of Veterans Affairs.

Consequently, US Veterans’ willingness to participate in research may differ from that of civilians, and different approaches to study recruitment may be needed. There is a paucity of research into best practices for recruiting Veterans for research studies [12,13,14,15,16,17], particularly with respect to those who served in OEF/OIF, observational cohort studies (vs. clinical trials), and Veterans who do not use VA for their health care. Prior studies in Veterans have noted altruism [17] as an important factor for recruitment, whereas difficulty contacting target participants (possibly because younger Veterans are highly mobile and military contact information may be outdated) [13], lack of interest [13], and research “burn-out” due to multiple requests [16] were barriers to participation. Bayley et al. hypothesized that participation in their study may have been low because of a low perceived compensation rate ($200 for a full day of assessments) [13].

Prior to the launch of a study of the long-term impacts of open-air waste burning (“burn pits” [18]) on the pulmonary health of military service members, we undertook a formative study using focus group methodology (described in Methods) to maximize the likelihood of successful and timely recruitment. The parent study is investigating associations between land-based deployment and pulmonary health among Veterans who served in OEF/OIF recruited from six metropolitan areas, with a target sample of 4800 participants. Participants attend in-person assessments to measure pulmonary function, height, and weight, and to complete self- and interviewer-administered questionnaires (estimated to take 2–3 h). While the primary outcome (pulmonary function) and exposure (particulate matter) are measured objectively via lung function testing and satellite-based methods, respectively, the investigators were concerned about disproportionate recruitment of individuals who had pulmonary symptoms. Thus, the planned approach (and that tested in the focus groups) was to describe the study with broad aims to elicit interest from a range of Veterans.

The aims of this formative focus group study were to better understand the views on research of Veterans eligible for the parent study -- US Veterans who served in OEF/OIF -- and to identify motivators for participating in research studies. The current paper reports findings that generalize to other studies targeting US OEF/OIF Veterans and that may also be relevant to research among Veterans of other eras and non-Veterans.

Methods

Between September and November 2015, we held 10 focus groups in 5 cities (see Table 1). Cities in four of the five VA regions (Southeast, West, Midwest, and Continental) were selected. We aimed to include 6–12 participants per group, an optimal size to encourage discussion and ensure everyone’s voice is heard [19]. To identify potential participants, we used the Defense Manpower Data Center, which collates personnel and other data for the Department of Defense and includes a complete list of all discharges after 1972 [20].

Table 1 Focus group locations, number of attempted contacts per location, and characteristics of each group

Inclusion/exclusion criteria

To be eligible for inclusion, a Veteran had to have served in the US Armed Forces (including Guard or Reserves) between October 1, 2001 and December 31, 2012; have had at least one deployment to Iraq, Afghanistan, Kuwait, Qatar, Djibouti, United Arab Emirates, or Kyrgyzstan; and have completed their active duty military service by December 31, 2012. Additionally, we contacted only those who lived within 50 miles of the zip code where the focus groups took place, based on address information obtained from a commercial vendor. We excluded those who served only in the Navy or Coast Guard because the parent study focused on land exposures during deployments.

Recruitment of study participants

A random sample of eligible individuals were mailed a packet, including an introductory letter inviting them to participate in the study, approximately 3–4 weeks before each focus group. The letter included the names and phone numbers of investigators and was mailed in a large manila envelope. Non-responders were phoned approximately 1 week later unless the group was already filled or the Veteran had notified staff that they were not interested in participating. Two days prior to the focus groups, staff telephoned participants to remind them of the time and location and to answer questions.

Focus group methodology

In collaboration with the focus group moderator, the investigators and other research staff members developed a focus group guide (Additional file 1). Key topics included: reasons for participating or declining to participate in health-related research, logistics around study recruitment (e.g., preferred mode of contact), acceptable time commitment for surveys and in-person visits, compensation, study logistics (e.g., location), written materials, and sharing information about study results.

Sessions were led by a professional, doctoral-level moderator (TW) who was not a VA employee. To increase the comfort of participants and encourage safe and open dialogue, we held some women-only and enlisted-only focus groups. Some of the enlisted-only groups included only men; mixed-gender groups included at least four women. Focus groups were held in a conference room in a hotel in a central location that was easy to access via car and that had ample, free parking. Total on-site time commitment was approximately 2 h. This included approximately 15 min for informed consent procedures before the focus group, 90 min for the focus group, and approximately 10 min to complete a 7-page survey immediately following the focus group. The survey collected information on demographics, health status, receipt of VA services and/or benefits, military service, and perceptions of the VA and VA research.

To help ground the discussion of recruitment materials, participants were provided with examples of a contact letter and key study procedure information (See Additional file 2). All focus groups were audio recorded. TW summarized findings after each focus group. One team member (AL) attended four of the focus groups and took notes during the sessions. Individuals were compensated $150 (in cash, given on site) for their participation. All study procedures were approved by the VA Central Institutional Review Board.

Analysis

We analyzed the data using template analysis [21, 22]. The initial template was organized into domains based on focus group questions, and included space for noting majority opinions, exemplar quotes, and outlier responses. Two team members (AL and EA) listened to audio-recordings to evaluate the template for usability and relevance; the template was modified based on initial use and review. After consistency was established, AL and EA independently coded six of the ten focus groups. Exemplar quotes were transcribed verbatim. Summary points were transferred into a matrix (e.g., organized by domain × focus group [23]). Microsoft Word was used for data management. After coding of the six focus groups, notes taken by AL for the remaining four and by TW for all ten were evaluated for consistency with the six templates created by AL and EA. It was not a study objective to examine differences in views and opinions based on gender, officer status, or location; thus, findings were analyzed without respect to these characteristics.

Results

A total of 89 individuals participated in focus groups. On average, participants were 38 years of age (range: 26–66 years), 29% were female, 23% were Hispanic, 55% had a Bachelor’s degree or a professional degree, 21% were officers, and 56% reported their health to be excellent or very good (Table 2). Less than half (46%) used VA health care benefits; over 75% thought that VA benefit services were excellent, very good, or good.

Table 2 Characteristics of Veterans who participated in 10 focus groups conducted in 5 US cities (n = 89)

Factors affecting participation in research

Table 3 details key factors, comments, and quotes related to factors affecting participation in research. Key findings are briefly highlighted here. The key determinants of participating in health-related research were three-fold: 1) receipt of adequate compensation for participation, 2) “duty, honor, and doing the right thing” -- a desire to fulfill an obligation to help other Veterans, and 3) perception of the research topic as relevant and important. For many, both sufficient compensation and a sense that the study would help other Veterans were critical.

Table 3 Key findings related to factors affecting participation in research

Considerations regarding the relative costs (e.g., inconvenience, time away from work and family) and risks related to privacy and information security, losing VA benefits, and study participation (e.g., experimental drugs) were important potential deterrents to participating in a research study. Before considering participation, Veterans needed assurance that the study was legitimate and not a “scam.” Several participants stated that they conducted an internet search of the names and phone numbers of investigators listed in the invitation letter. While not a sufficient reason for participation, interacting with study personnel who were professional, courteous, and knowledgeable was also noted as being helpful.

Initial approach by postal mail using a large envelope followed by a phone call is best

Participants generally endorsed the method used to recruit them into the focus group study (a mailed letter followed by a phone call) over an initial approach via email or phone. Letters delivered via postal mail were preferred because military materials were easy to recognize and a letter was perceived as more “official” than email and less intrusive than a phone call. One participant stated, “I don’t like phone calls. Especially not survey phone calls.” Many individuals stated that they opened the letter for the current study because the envelope “stuck out” and seemed important and “official” (because of the emblem). Legitimacy concerns were heightened with email; words such as “phishing” and “scams” were used to describe reactions to an initial study approach via email. Overall, Veterans said they “wouldn’t have paid much attention” to an email invitation.

Whereas initial contact by phone was not recommended, follow up via phone was acceptable and appreciated by many. The participants noted, however, that they were unlikely to answer the phone, so it was essential that the study recruiter leave a message. One person stated, “When I see a number and I don’t know it, I won’t answer it. But if you leave a message, I’ll call back.”

Introductory letter should contain certain information and wording

During the focus groups, participants were asked to review and provide feedback on an example letter informing them about a study and inviting them to participate (Additional file 2). Table 4 summarizes the findings regarding key issues raised and recommendations about how to address them in written communication. Key overarching themes related to transparency and trust. Vague language was off-putting. Many participants felt that the letter did not specify why the study was being done, which heightened their suspicion. One participant stated, “Tell us what you are looking for. It feels like it is hidden.” Another participant added, “What the hell did I pick up that they know about that I don’t?” In response to how the letter could be rewritten, one participant said, “Saying something like, ‘Troops are coming back with respiratory problems; we want to know why’ might be helpful.” Vague terms were also unpopular. The letter mentioned that the in-person examination would require that participants take a “bronchodilator.” One participant said, “Bronchodilator? It sounds invasive. Why? What is the reason behind it? If they gave me a valid reason for doing it, I’d be on board.” Other individuals worried that it was an experimental drug.

Table 4 Problems/concerns and recommendations about how to address them in writing or orally to potential participants

Individuals generally preferred a less wordy style, as explained by one person: “People in the military like concise information. Brief.” Another participant added, “Keep it to Who? What? When? Where? Why? and How?” Several participants read the section of the letter explaining that the hypothetical study involved a 2–3-h study visit and could not understand how the activities listed (filling out some questionnaires, “blowing into a tube”, and getting height, weight, and pulse measured) could take that long. They assumed that the 2–3 h would involve waiting. However, when the moderator clarified that the 2–3 h would not involve waiting, the feedback we received about the time commitment was generally positive; most participants said that 2–3 h was acceptable provided that they felt that the study team was well-prepared and their time was well spent.

Important logistics for face-to-face study visits

The investigators were concerned that locating study procedures at a VA medical center would dissuade Veterans who do not get their health care from the VA from participating. This concern was not borne out, though a few participants mentioned that they would not want to go to a VA medical center because “you don’t know what you’re going to walk into in the waiting room,” referring to the potentially disturbing experience of seeing disabled or sick Veterans. Nevertheless, only one person stated that she would not participate if required to go to a VA. Many saw locating study activities at a VA medical center as an advantage because it was familiar, they knew how to get around, it established legitimacy, and they assumed that they would be taken care of by people who understand Veterans. The cons included difficulty finding parking, concerns about inefficiencies (the need to wait during their study visit), and “taking appointments from sicker Vets.” The third point revealed an important misunderstanding: some Veterans do not distinguish research from clinical activities. While some found the idea of preferred parking at a VA medical center attractive, others were concerned that they would be taking parking away from Veterans who were older or more disabled than they were.

Questionnaire content, length, administration

Individuals’ initial responses suggested that an acceptable questionnaire length was 15–30 min, because with longer questionnaires, “you get bored” and “lose interest.” One participant noted that “the longer the questionnaire, the less accurate you become. Focus drifts away.” While for some, compensation would provide motivation to fill out longer or boring questionnaires, others stated that it would need to be clear how the information was relevant. Many expressed a preference for completing a survey on the computer, while a few expressed concerns about data security if done on the web. Others expressed sentiments implying that they thought that we (VA researchers) should already have the information, because it was in their Department of Defense or VA record. To illustrate this perspective, one Veteran explained, “Nothing frustrates me more than having to enter duplicate info. All of that stuff is in the system already. Why are they asking us?” Participants stated that they would be more willing to complete a longer survey if they had a few weeks to do so, so that they could choose a convenient time to work on it, spreading out the work over more than one session.

Compensation for participation

Expectations for compensation for future studies mirrored that offered for participation in the focus group (e.g., $50–75/h, depending on whether individuals included travel time). For some, compensation was not important. One person said, “If it was considered valuable, I would do it for free.” Nevertheless, this was a minority opinion, and for many, the compensation was important. Generally, participants preferred cash or check on site rather than having to wait for a mailed check, because of concerns that the paperwork or the check might be lost in the mail/system. If the compensation could not be given in person at the time of participation, it was important that it be provided within a reasonable amount of time (2–3 weeks). While a debit card with no fees was considered acceptable, a gift card to a specific store was the least preferred mode. Alternative methods proposed by the participants included direct deposit and in-kind compensation, for example, priority processing of claims.

Data sharing

Individuals were interested in obtaining reports of overall study results because it would “give us the value of what we invested our time in.” Furthermore, it might increase the likelihood of individuals participating in future studies. One participant explained, “A lot of research studies don’t follow back up with you and say, ‘this is what we got from the research.’ That would be very helpful. That may make me want to do another one.” In addition to knowing how the study may have helped others, participants wanted information about their own health. As when visiting their doctor, individuals expected to get a copy of their results so that they could better understand their health status and risk factors. They also expressed a desire to share the study information with their doctor. Participants wanted the report to be easy to understand quickly and to provide basic information like “Are you in the normal or abnormal range?” They were also interested in seeing how their measures compared to those of others who were like them (e.g., in terms of age, service era, deployments, etc.).

Discussion

Trust, transparency, and respect were themes that ran through our findings on why OEF/OIF US Veterans participate in research and how they wish to be treated as research participants. Singer and Ye concluded that there are three primary reasons why individuals participate in research: 1) altruism/norms of cooperation, 2) egoistic/self-interest (e.g., enjoys surveys, would benefit, interested in learning something new, and “the money”), and 3) responses mentioning one or more survey characteristics (e.g., interest in the topic, respect for organizations, length of survey) [24]. While these reasons generally align with those observed in the current study, US Veterans’ military socialization, shared experiences, close bonds with fellow Veterans, and relationships with the VA differentiate them from civilians and may impact their perspective on research participation. Findings from this study reinforce the importance of working to establish trust. As with vulnerable populations such as indigenous people [25], other minority populations [26,27,28,29], and frail elderly adults [30], trust and safety concerns -- often stemming from a history of harassment, assault, or other trauma [7] -- are more prevalent among military Veterans, potentially making recruitment more challenging. For US Veterans to participate, they must perceive that the researchers are trustworthy and that the research is of value. Trust will be enhanced by being transparent -- clearly stating the study’s purpose and intended impact(s) of the research -- consistent with previous research in other populations [31]. Vague purpose statements were perceived as disingenuous and diminished trust. Veterans also wanted to be reassured that the research was legitimate and that the research team would be good stewards of their data. Providing information on the study team can facilitate information gathering efforts by potential participants.

Respect is also a core military value [7]; several barriers and facilitators to participation related to displays of respect. To demonstrate respect, researchers should consider providing generous compensation; preparing an agenda for face-to-face study visits; and being efficient, professional, and on time. Also, as observed in prior studies [32,33,34], US Veterans reported wanting to receive results of study-related tests and study findings. While data sharing may not directly impact willingness to participate in the short term, it demonstrates respect and appreciation for the study participants and might motivate participation in future studies.

Our study yielded other interesting findings that we wish to highlight. Many Veterans misunderstood how the Department of Defense, Veterans Benefits Administration, clinical services managed by the VA, and VA-funded research were connected. Some assumed that they had already signed waivers for data sharing, making it possible for researchers to obtain military records. Conversely, they feared that information given in a research study might be shared with other entities and affect their benefits. Memoranda of understanding regarding data sharing are difficult to understand and often have broad provisions such that misunderstandings are likely [http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.473.3931&rep=rep1&type=pdf]. Likewise, data sharing clauses in research consent documents are often wide-ranging and provisional. Determining ways to more clearly and simply communicate this information may help to avoid misunderstandings in the future. Furthermore, unlike a prior study [16] that emphasized the value of employing military members as research staff (who have an “insider” perspective), participants responded positively to our staff, none of whom were Veterans.

Lastly, our study revealed that relatively small changes, like using a large envelope instead of a small one, may help prevent mail from being tossed without being read. We are aware of only one study [35] that examined response rates for a large vs. small envelope; that study found no difference, though the subjects were physicians.

Several limitations should be considered when evaluating our results. First, we were unable to obtain information about perceptions of research among those who did not volunteer to participate in our research study. Nevertheless, most of our study participants had never participated in a health research study, suggesting that we may have included individuals who had previously declined to participate in research. It is possible that the generous compensation motivated Veterans who might otherwise have been unlikely to participate. In support of this hypothesis, some studies indicate that higher incentives result in higher response rates [36], though a recent study in the Netherlands in people with type 2 diabetes observed that response rates were lower among those offered a 12.5-euro incentive compared to a 7.5-euro incentive [37]. Furthermore, some [38, 39] but not all [37] studies indicate that incentives may improve representativeness of the study sample because incentives have a relatively stronger effect in socio-demographic groups with a relatively lower response rate. Second, we are unable to calculate a response rate because we closed the focus groups once the target sample size was reached. Though not a limitation, we oversampled from certain groups (e.g., women and officers); thus, it is not possible to compare demographic characteristics of our participants to those of the military in general. Furthermore, as a goal of qualitative research is to obtain a variety of perspectives, samples are often not a simple cross-section of the population of interest; this was true of participants in our study. Third, opinions about how best to recruit individuals into a research study and about compensation may have been influenced by the fact that we recruited the sample by mail and provided the compensation we did. Lastly, we recruited those living relatively close to urban centers, so findings may not generalize to rural Veterans.

Conclusions

Our study provides practical information to aid investigators in avoiding pitfalls that may hinder recruitment and incorporating information that may help motivate individuals to participate. Many of our findings align with principles for patient-centered outcomes research, which includes trust, honesty, co-learning, transparency, and respect [40]. We encourage researchers to partner with members of their target population to get feedback on recruitment materials and methods. Engaging patients and other stakeholders in the conduct of research can ensure that the research and its results are patient-centered, relevant to the intended users of the research findings, and that the findings can be effectively disseminated [40].

Abbreviations

OEF/OIF:

Operation Enduring Freedom/Operation Iraqi Freedom

US:

United States of America

VA:

Department of Veterans Affairs

VHA:

Veterans Health Administration

References

  1. Preloran HM, Browner CH, Lieber E. Strategies for motivating Latino couples' participation in qualitative health research and their effects on sample construction. Am J Public Health. 2001;91(11):1832–41.

  2. Emanuel EJ, Schnipper LE, Kamin DY, Levinson J, Lichter AS. The costs of conducting clinical research. J Clin Oncol. 2003;21(22):4145–50.

  3. McDonald AM, Knight RC, Campbell MK, Entwistle VA, Grant AM, Cook JA, Elbourne DR, Francis D, Garcia J, Roberts I, et al. What influences recruitment to randomised controlled trials? A review of trials funded by two UK funding agencies. Trials. 2006;7:9.

  4. Charlson ME, Horwitz RI. Applying results of randomised trials to clinical practice: impact of losses before randomisation. Br Med J (Clin Res Ed). 1984;289(6454):1281–4.

  5. Haidich AB, Ioannidis JP. Patterns of patient enrollment in randomized controlled trials. J Clin Epidemiol. 2001;54(9):877–83.

  6. Foy R, Parry J, Duggan A, Delaney B, Wilson S, Lewin-van Den Broek NT, Lassen A, Vickers L, Myres P. How evidence based are recruitment strategies to randomized controlled trials in primary care? Experience from seven studies. Fam Pract. 2003;20(1):83–92.

  7. Coll JE, Weiss EL, Yarvis JS. No one leaves unchanged: insights for civilian mental health care professionals into the military experience and culture. Soc Work Health Care. 2011;50(7):487–500.

  8. Abraham T, Cheney AM, Curran GM. A Bourdieusian analysis of U.S. military culture ground in the mental help-seeking literature. Am J Mens Health. 2017;11(5):1358–65.

  9. Reger MA, Etherage JR, Reger GM, Gahm GA. Civilian psychologists in an Army culture: the ethical challenge of cultural competence. Mil Psychol. 2008;20(1):21–35.

    Article  Google Scholar 

  10. Lutz A. Who joins the military? A look at race, class, and immigration status. Journal of Political and Military Sociology. 2008;36(2):167–88.

    Google Scholar 

  11. Strom TQ, Leskela J, Gavian ME, Possis E, Loughlin J, Bui T, Linardatos E, Siegel W. Cultural and ethical considerations when worknig with military personnel and veterans: a primer for VA training programs. Training and Education in Professional Psychology. 2012;6(2):67–75.

    Article  Google Scholar 

  12. Pedersen ER, Helmuth ED, Marshall GN, Schell TL, PunKay M, Kurz J. Using facebook to recruit young adult veterans: online mental health research. JMIR Res Protoc. 2015;4(2):e63.

    Article  PubMed  PubMed Central  Google Scholar 

  13. Bayley PJ, Kong JY, Helmer DA, Schneiderman A, Roselli LA, Rosse SM, Jackson JA, Baldwin J, Isaac L, Nolasco M, et al. Challenges to be overcome using population-based sampling methods to recruit veterans for a study of post-traumatic stress disorder and traumatic brain injury. BMC Med Res Methodol. 2014;14:48.

    Article  PubMed  PubMed Central  Google Scholar 

  14. Finley EP, Mader M, Bollinger MJ, Haro EK, Garcia HA, Huynh AK, Pugh JA, Pugh MJ. Characteristics associated with utilization of VA and non-VA care among Iraq and Afghanistan veterans with post-traumatic stress disorder. Mil Med. 2017;182(11):e1892–903.

    Article  PubMed  Google Scholar 

  15. Roberge E, Benedek D, Marx C, Rasmusson A, Lang A. Analysis of recruitment strategies: enrolling veterans with PTSD into a clinical trial. Mil Psychol. 2017;29(5):407–17.

    Article  Google Scholar 

  16. Bush NE, Sheppard SC, Fantelli E, Bell KR, Reger MA. Recruitment and attrition issues in military clinical trials and health research studies. Mil Med. 2013;178(11):1157–63.

    Article  PubMed  Google Scholar 

  17. Campbell HM, Raisch DW, Sather MR, Warren SR, Segal AR. A comparison of veteran and nonveteran motivations and reasons for participating in clinical trials. Mil Med. 2007;172(1):27–30.

    Article  PubMed  Google Scholar 

  18. Institute of Medicine. Long-term Health Consequences of Exposure to Burn Pits in Iraq and Afghanistan. Washington: The National Academies Press; 2011.

    Google Scholar 

  19. Morgan D. The focus group guidebook (focus group kit). 1st edition edn. Thousand Oaks: SAGE Publications, Inc; 1997.

    Google Scholar 

  20. Defense Manpower Data Center [https://www.dmdc.osd.mil/appj/dwp/index.jsp].

  21. Patton M. Qualitative Research & Evaluation Methods. 3rd edition edn. Thousand Oaks: Sage Publications; 2002.

    Google Scholar 

  22. Miles MB, Huberman AM. Qualitative data analysis : an expanded sourcebook. 2nd edn. Thousand Oaks: Sage Publications; 1994.

    Google Scholar 

  23. Averill JB. Matrix analysis as a complementary analytic strategy in qualitative inquiry. Qual Health Res. 2002;12(6):855–66.

    Article  PubMed  Google Scholar 

  24. Singer E, Ye C. The use and effects of incentives in surveys. The ANNALS of the American Academy of Politicial and Social Science. 2013;645:112–41.

    Article  Google Scholar 

  25. Glover M, Kira A, Johnston V, Walker N, Thomas D, Chang AB, Bullen C, Segan CJ, Brown N. A systematic review of barriers and facilitators to participation in randomized controlled trials by indigenous people from New Zealand, Australia, Canada and the United States. Glob Health Promot. 2015;22(1):21–31.

    Article  PubMed  Google Scholar 

  26. Barrett NJ, Ingraham KL, Vann Hawkins T, Moorman PG. Engaging African Americans in research: the Recruiter's perspective. Ethn Dis. 2017;27(4):453–62.

    Article  PubMed  PubMed Central  Google Scholar 

  27. Martinez P, Cummings C, Karriker-Jaffe KJ, Chartier KG. Learning from Latino voices: focus Groups' insights on participation in genetic research. The American journal on addictions / American Academy of Psychiatrists in Alcoholism and Addictions. 2017;26(5):477–85.

    Google Scholar 

  28. Hughes TB, Varma VR, Pettigrew C, Albert MS. African Americans and clinical research: evidence concerning barriers and facilitators to participation and recruitment recommendations. Gerontologist. 2017;57(2):348–58.

    Article  PubMed  Google Scholar 

  29. Otado J, Kwagyan J, Edwards D, Ukaegbu A, Rockcliffe F, Osafo N. Culturally competent strategies for recruitment and retention of African American populations into clinical trials. Clinical and translational science. 2015;8(5):460–6.

    Article  PubMed  PubMed Central  Google Scholar 

  30. Provencher V, Mortenson WB, Tanguay-Garneau L, Belanger K, Dagenais M. Challenges and strategies pertaining to recruitment and retention of frail elderly in research studies: a systematic review. Arch Gerontol Geriatr. 2014;59(1):18–24.

    Article  PubMed  Google Scholar 

  31. Glass DC, Kelsall HL, Slegers C, Forbes AB, Loff B, Zion D, Fritschi L. A telephone survey of factors affecting willingness to participate in health research surveys. BMC Public Health. 2015;15:1017.

    Article  PubMed  PubMed Central  CAS  Google Scholar 

  32. Shalowitz DI, Miller FG. Disclosing individual results of clinical research: implications of respect for participants. JAMA. 2005;294(6):737–40.

    Article  PubMed  CAS  Google Scholar 

  33. Fernandez CV, Ruccione K, Wells RJ, Long JB, Pelletier W, Hooke MC, Pentz RD, Noll RB, Baker JN, O'Leary M, et al. Recommendations for the return of research results to study participants and guardians: a report from the Children's oncology group. J Clin Oncol. 2012;30(36):4573–9.

    Article  PubMed  PubMed Central  Google Scholar 

  34. Long CR, Stewart MK, Cunningham TV, Warmack TS, McElfish PA. Health research participants' preferences for receiving research results. Clin Trials. 2016;13(6):582–91.

    Article  PubMed  PubMed Central  Google Scholar 

  35. Halpern SD, Ubel PA, Berlin JA, Asch DA. Randomized trial of 5 dollars versus 10 dollars monetary incentives, envelope size, and candy to increase physician response rates to mailed questionnaires. Med Care. 2002;40(9):834–9.

    Article  PubMed  Google Scholar 

  36. Edwards PJ, Roberts I, Clarke MJ, Diguiseppi C, Wentz R, Kwan I, Cooper R, Felix LM, Pratap S. Methods to increase response to postal and electronic questionnaires. The Cochrane database of systematic reviews. 2009;3:MR000008.

    Google Scholar 

  37. Koetsenruijter J, van Lieshout J, Wensing M. Higher monetary incentives led to a lowered response rate in ambulatory patients: a randomized trial. J Clin Epidemiol. 2015;68(11):1380–2.

    Article  PubMed  Google Scholar 

  38. Olsen F, Abelsen B, Olsen JA. Improving response rate and quality of survey data with a scratch lottery ticket incentive. BMC Med Res Methodol. 2012;12:52.

    Article  PubMed  PubMed Central  Google Scholar 

  39. Ryu E, Couper MP, Marans RW. Survey incentives: cash vs. in-kind; face-to-face vs. mail; respones rate vs. nonresponse. International Journal of Public Opinion Research. 2005;18(1):89–106.

    Article  Google Scholar 

  40. Frank L, Forsythe L, Ellis L, Schrandt S, Sheridan S, Gerson J, Konopka K, Daugherty S. Conceptual and practical foundations of patient engagement in research at the patient-centered outcomes research institute. Qual Life Res. 2015;24(5):1033–41.

    Article  PubMed  PubMed Central  Google Scholar 

Download references

Acknowledgements

The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Department of Veterans Affairs or the US Government.

Funding

This work was supported by the United States Department of Veterans Affairs Cooperative Studies Program (CSP #595 Pilot – Respiratory Health and Deployment to Iraq and Afghanistan Pilot).

Availability of data and materials

The datasets generated and/or analyzed during the current study are not publicly available because they contain information that could compromise research participant privacy and consent, but they are available from the corresponding author (AJL) on reasonable request.

Author information

Contributions

AJL had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. Concept and design: AJL, TW, NLS. Acquisition, analysis, or interpretation of data: AJL, GT, EA, TW, NLS. Drafting of the manuscript: AJL. Critical revision of the manuscript for important intellectual content: AJL, GT, EA, TW, NLS. Obtained funding: NLS. All authors have read and approved the final manuscript.

Corresponding author

Correspondence to Alyson J. Littman.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the Institutional Review Board of VA Puget Sound Health Care System (MIBR #00676). All participants provided written informed consent.

Consent for publication

Not applicable

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

Focus group guide. Focus group session timing, welcome/ground rules/introductions, and topic introductory text and prompts. (PDF 326 kb)

Additional file 2:

Example letter given to participants at focus groups. Example of a contact letter and key study procedure information, annotated in red font to reflect how participants perceived the letter. (PDF 97 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Littman, A.J., True, G., Ashmore, E. et al. How can we get Iraq- and Afghanistan-deployed US Veterans to participate in health-related research? Findings from a national focus group study. BMC Med Res Methodol 18, 88 (2018). https://doi.org/10.1186/s12874-018-0546-2
