Research article | Open Access | Open Peer Review
Practical application of opt-out recruitment methods in two health services research studies
BMC Medical Research Methodology, volume 17, Article number: 57 (2017)
Participant recruitment is an ongoing challenge in health research. Recruitment may be especially difficult for studies of access to health care because, even among those who are in care, people using services least often may also be the hardest to contact and recruit. Opt-out recruitment methods can be used for such studies: potential participants are given the opportunity to decline further contact about the study (opt out) following an initial mailing, and are then contacted directly if they have not opted out within a specified period. However, there is a dearth of literature on the effort needed for effective opt-out recruitment.
In this paper we describe opt-out recruitment procedures for two studies of access to health care within the U.S. Department of Veterans Affairs. We report the resource requirements of recruitment efforts (number of opt-out packets mailed and number of phone calls made). We also compare the characteristics of study participants to those of potential participants using t-tests, Fisher’s exact tests, and chi-squared tests.
Recruitment rates for our two studies were 12 and 21%, respectively. Across multiple study sites, we had to send between 4.3 and 9.2 opt-out packets to recruit one participant. The average number of phone calls required to arrive at a final status for each potentially eligible Veteran (i.e., study participation or termination of recruitment efforts) was 2.9 and 6.1 in the two studies, respectively. Study participants differed as expected from the population of potentially eligible Veterans because of planned oversampling of certain subpopulations. The final samples of participants did not differ statistically from those who were mailed opt-out packets, with one exception: in one of our two studies, participants had a higher rate of mental health service use in the past year than did those mailed opt-out packets (64 vs. 47%).
Our results emphasize the practicality of using opt-out methods for studies of access to health care. Despite the benefits of these methods, opt-out alone may be insufficient to eliminate non-response bias on key variables. Researchers will need to balance considerations of sample representativeness and feasibility when designing studies investigating access to care.
Participant recruitment is a significant challenge in health services and health-related research [1, 2]. It may be particularly challenging when a representative sample is desired, as is often the case in survey research, or when constructing purposeful samples with specific characteristics, as is often the case in qualitative research. Timely recruitment of such samples can be especially difficult for studies of access to health care because, even among those who are currently in care, people who use services least often also may be the hardest to contact and recruit. Some of the most common recruitment methods, like convenience sampling via self- or clinician-referral, may be least appropriate for such studies, as these sampling strategies are likely to draw primarily on frequent service users. This is problematic because data from this population may not capture the full range of challenges to access experienced by individuals who rarely use services.
Many recruitment methods are commonly used in health services research. These include—but are not limited to—snowball sampling, in which participants are asked to name other potential participants [3, 4]; and respondent-driven sampling, which combines snowball sampling with statistical weighting to account for non-random sampling [5–7]. Other studies have used advertisements in the form of postings, online ads, radio, or TV, or direct outreach, for example “cold-calling” potential participants. Both opt-in and opt-out methods have also been used. Opt-in methods require potential participants to respond to an initial mailing before being contacted directly by study staff, while opt-out methods involve sending potential participants an informational packet about the study, giving them an opportunity to opt out (i.e., decline participation), and then contacting them only if they have not opted out within a predetermined period.
One key design consideration helps determine which of the above methods may be appropriate for any given study—namely, whether or not the study aims to draw from a pre-defined list of potential participants (a “sampling frame”). Such a sampling frame may, for example, be compiled from a healthcare system’s electronic medical record, or a health plan’s database of users. In such situations, direct outreach, opt-in, and opt-out strategies are all technically feasible. However, following adoption of privacy legislation like the Health Insurance Portability and Accountability Act (HIPAA) in the United States and the Data Protection Act in the United Kingdom, institutional review boards (IRBs) may restrict or prohibit use of opt-out methods in health-related research. Subsequent research has consistently demonstrated, however, that opt-out methods result in substantially higher recruitment rates than opt-in methods, and generate more representative samples [2, 11–19]. In this context, researchers often prefer opt-out methods, at least for low-risk studies [20, 21].
Published findings on the superiority of opt-out methods for recruitment are consistent with psychological and behavioral theories of motivation [22, 23], especially when opt-out methods are being compared to opt-in approaches. Opt-out methods may allow for more robust and unbiased recruitment by placing the burden for outreach on the research team rather than on potential participants or clinicians. Opt-in methods, in contrast, require potential participants to take the initiative by reaching out to study staff after receiving an initial mailing. That is something only individuals who are highly interested in the study are likely to do. In contrast, when opt-out approaches are used, only those potential participants who are highly disinterested in the study are likely to respond to the initial mailing to decline further contact (and thereby, study participation). Opt-out methods allow research staff to directly contact individuals who may be willing to participate, but not willing to take the initiative required for opt-in methods, maximizing chances of recruiting and obtaining information from this generally underrepresented subgroup of potential research participants. At the same time, by providing an opportunity for potential participants to decline further contact, opt-out methods may address concerns regarding intrusiveness and coercion.
Much of the existing literature on opt-out methods has focused on clinical observational studies or epidemiologic studies. Few studies have described recruitment to health service-oriented studies, and none of the studies we located provided comprehensive data on the recruitment efforts and resources needed to obtain samples from administrative databases. In this paper, we therefore seek to enhance the literature informing researchers on the use of opt-out strategies by describing our experiences using these methods in two studies of access to mental health care among U.S. military Veterans (one featuring mixed quantitative and qualitative methods, and the other focusing on a quantitative telephone survey). In practical terms, we aimed to (1) compare our participant samples to the larger pools of potential participants mailed an opt-out letter, and (2) provide practical details on the resource demands (time and cost) required to apply our opt-out procedures. Researchers designing, budgeting, and executing these types of studies should be able to use our results to make more realistic assessments of sampling outcomes. Based on the literature reviewed above, our a priori hypothesis was that, across both studies described herein, the sample of study participants would be statistically similar to the larger pool of potential participants who were mailed an opt-out packet.
The sampling frames for our two studies (referred to below as “Access” and “Tailoring”) were identified through administrative data from the U.S. Department of Veterans Affairs (VA) Corporate Data Warehouse (CDW), the national clinical administrative database that consolidates data from the VA electronic medical record. Data were collected between 2014 and 2016.
Study 1: Access
This study, referred to as “Access,” was the initial component of a multi-site, mixed methods study designed to develop a self-report measure of perceived access to mental health care for Veterans. Participants were recruited to take part in a semi-structured qualitative interview, results from which informed the composition of a quantitative measure to assess perceived access to mental health services.
To be eligible to participate in this study, participants needed to be U.S. military Veterans between the ages of 18 and 70 years. They also needed to have had at least one positive screen for posttraumatic stress disorder (PTSD), alcohol use disorder, or depression documented in their VA medical record in the previous year. We excluded Veterans with documented psychosis or dementia due to concerns that these conditions could limit the recall and cognitive function necessary to complete a meaningful qualitative interview. To ensure geographic diversity within our sample, we recruited participants from three separate clinics within each of three Veterans Integrated Service Networks (VISNs): VISN 1 in the Northeast, VISN 16 in the South Central region, and VISN 21 in the West. The total population of Veterans meeting inclusion criteria for the study was 3,461.
Using CDW-generated lists of potentially eligible Veterans, we mailed successive waves of opt-out packets, in batches of 30–60 packets each, to Veterans within each of the three study VISNs. Successive waves of opt-out packets for each study VISN were not mailed simultaneously, but were instead mailed when study staff were nearing the end of the previous wave’s call list for that VISN. Opt-out packets included a letter briefly describing the study and stating that recipients could receive a call from the study team unless they opted out by returning an enclosed, self-addressed and stamped response form. See Additional file 1 for the text of the opt-out letter. Potential participants were also able to opt out by calling a toll-free phone number included in the mailing. Those who did not opt out within two weeks were called by trained study staff to discuss study participation. Our study protocol allowed us to leave up to three unanswered messages (or make up to five calls when it was not possible to leave a message) before outreach efforts ceased. When eligible Veterans called back and left messages for study staff indicating interest in the study, the protocol allowed staff to make additional calls beyond these limits. Once study staff reached potential participants, they explained the study in detail, assessed interest, and confirmed eligibility.
Recipients for each wave of opt-out packets were selected to ensure enrollment from specific sub-populations in each VISN: women, members of racial/ethnic minorities, a balance of rural and urban residents, and a balance of Veterans with and without a history of mental health service use. For example, some waves of opt-out packets were mailed exclusively to female Veterans to attain our target of 25% female participants in the final sample (while the population served by VA is about 10% female). Some waves were mailed exclusively to Veterans below age 40 to attain our target of 50% Veterans below age 40. In total, we mailed 230 opt-out packets to Veterans in VISN 1 over six waves, 140 packets to Veterans in VISN 16 over three waves, and 215 packets to Veterans in VISN 21 over six waves. Our target sample size was 90 Veterans (30 per VISN), but our protocol allowed recruitment efforts to cease if saturation was achieved—that is, if additional interviews added no new information or topics.
Institutional review board (IRB) issues
As a multi-site study, Access sought approval from the VA Central IRB. Obtaining Central IRB approval for the opt-out recruitment procedures required several steps beyond those we have encountered when using more traditional, clinic-based recruitment methods. First, a Waiver of Health Insurance Portability and Accountability Act (HIPAA) Authorization was obtained to allow the research team to access the CDW data needed to identify potentially eligible Veterans and obtain their contact information. Second, because “cold calling” Veterans is not allowed under VA research guidelines, we specified that we would only call potential participants once they had had a minimum of two weeks to return the opt-out response form. Third, to maximize recruitment rates (including recruitment of as diverse a sample across all of our sub-population strata as possible), we wanted to offer potential participants the option of completing the study entirely by phone, without an in-person visit. This required obtaining a Waiver of Written Informed Consent to allow us to elicit informed consent verbally over the phone, rather than obtaining written informed consent by mail. The latter is a time-consuming and burdensome process for participants and researchers alike; paradoxically, it also increases the risk of a breach of confidentiality, because documents containing personal health information may be lost in the mail. Telephone interviews also required that the Waiver of HIPAA Authorization (referenced above) cover not only access to CDW data for recruitment purposes, but also specifically allow telephone interviews without a signed HIPAA Authorization.
While the focus of this manuscript is on the utility of opt-out recruitment methods in applied research settings, we include brief descriptions of study elements to provide context, especially with respect to what potential participants were asked to do. Participants in Access completed a set of tasks during a 1.5–2 h session (primarily face-to-face, although sessions for six participants were conducted by telephone). All sessions included obtaining written (for in-person interactions) or verbal (for phone interactions) informed consent and completing a semi-structured qualitative interview regarding factors that can affect access to mental health services by Veterans. All participants also completed a packet of self-report questionnaires (via pen-and-paper for in-person sessions, and verbally for phone sessions). Lastly, each session also involved structured discussions of barriers to seeking mental health care.
Research assistant call logs
In addition to the formal data collection described above, research assistants for Access kept call logs with brief notes regarding the reasons provided by potential participants for their interest or disinterest in study participation.
Study 2: Tailoring
The second study (referred to as “Tailoring”) was the quantitative telephone survey component of a mixed methods study designed to assess the influence of rural/urban differences in attitudes toward mental health care on rates of mental health service use. Participants for Tailoring completed an interviewer-administered, structured questionnaire.
This study also enrolled U.S. military Veterans aged 18–70 years. To be eligible, Veterans also had to have had at least one CDW-documented positive screen for depression or PTSD between October 2009 and September 2012, and reside in VISN 1 (Northeast), 16 (South Central region), 19 (Rocky Mountain region), or 23 (North Central region). Veterans with a diagnosis of dementia or a psychotic disorder were excluded from the sampling frame. The total number of Veterans meeting inclusion criteria for the study was 70,119.
Using the contact data obtained from CDW, we mailed successive waves of opt-out packets to eligible Veterans (approximately 100 every 2 weeks). As with Access, successive waves of opt-out packets for each study VISN were not necessarily mailed simultaneously, but were instead mailed when study staff were nearing the end of the previous wave’s call list for that VISN. Opt-out packets included a letter explaining the study and informing recipients that they would receive a call from study personnel unless they opted out, either by calling the indicated toll-free number or by mailing back the opt-out response form in the stamped, addressed envelope included in the packet. The packet also included blank copies of informed consent and HIPAA authorization forms for potential participants’ consideration. See Additional file 2 for the text of the opt-out letter for Tailoring.
We used stratified sampling to select Veterans from the sampling frame to receive recruitment letters. We aimed to recruit Veterans across a range of distances from their residence to the nearest VA Medical Center (VAMC), evenly distributed among ≤15 miles, 16–49 miles, and ≥50 miles. We also aimed to have 25% of our final sample be female (while the population served by VA is about 10% female). As with Access, we determined the recipients of later waves of opt-out packets based on these target numbers (e.g. by sending later waves only to female Veterans, or only to Veterans living a certain distance from their nearest VAMC, as needed).
Two weeks after the mailing date, trained interviewers began calling those Veterans who had not opted out. Up to nine calls (three during working hours, three during weekday evening hours, three on weekends) were made over at least a two-week period to each telephone number available in CDW. As with the Access study, additional calls could be made to potential participants who left a voicemail message for study staff indicating interest. Once an interviewer reached a potential participant, the interviewer described the study purpose and procedures in detail, assessed interest and, if appropriate, verified the Veteran’s eligibility. A total of 3,703 opt-out packets were mailed to Veterans in 42 waves (849–1,041 packets per VISN), with a target sample size of 750 participants.
Institutional review board issues
Tailoring obtained a Waiver of Informed Consent and Waiver of HIPAA Authorization from the Central Arkansas Veterans Healthcare System IRB to permit identification of eligible Veterans from CDW data and to obtain their contact information (name, address[es] and telephone number[s]). Tailoring also received a Waiver of Written Informed Consent for telephone survey participation. Trained interviewers verbally reviewed a HIPAA Authorization form and the elements of informed consent with participants prior to starting the telephone survey. The review of the elements of informed consent culminated in study staff obtaining verbal informed consent from all participants.
Following informed consent, interviewers administered a structured questionnaire that covered individual characteristics, individual attitudes and beliefs about mental health care, and patterns and pathways of mental health service use (if any).
Analyses were performed for each study separately. We used independent samples t-tests, Fisher’s exact tests, and chi-squared tests to compare the characteristics of participants in a study to those of (a) all Veterans who met the study’s inclusion criteria, (b) all Veterans who were mailed an opt-out packet for the study, and (c) all Veterans mailed an opt-out packet, excluding those whose packet was marked as “return to sender” by the postal service due to an incorrect address. The variables on which these groups were compared included age, gender, race/ethnicity, past history of mental health service use, and rurality, all drawn from administrative data in the CDW. To examine the resource demands of opt-out approaches, we also calculated the average number of opt-out packets and the average number of calls required to obtain one study participant. Substantive findings from the Access and Tailoring studies will be reported elsewhere.
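As a hedged illustration of the group comparisons just described, the 2 × 2 case can be sketched in a few lines of Python. The cell counts below are approximate back-calculations from percentages reported in the Results (64% of 72 Access participants and 47% of 585 Veterans mailed packets), not the actual study data; in practice a statistics package (e.g. scipy.stats) would be used on the real counts.

```python
import math

# Hypothetical 2x2 table: past-year mental health service use,
# Access participants vs. Veterans mailed opt-out packets.
# Counts are reconstructed from reported percentages and are approximate.
a, b = 46, 26    # participants: used services, did not (46/72 ~ 64%)
c, d = 275, 310  # mailed pool:  used services, did not (275/585 ~ 47%)

n = a + b + c + d
# Pearson chi-squared statistic for a 2x2 table (no continuity correction)
chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
# Survival function of chi-squared with 1 df, via the complementary error function
p = math.erfc(math.sqrt(chi2 / 2))
print(f"chi-squared(1) = {chi2:.2f}, p = {p:.4f}")
```

With these reconstructed counts the test yields p < .01, in line with the significant difference on mental health service use reported below.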
Figures 1 and 2 are recruitment flow diagrams for the Access and Tailoring studies. Relatively few potential participants who received opt-out packets chose to opt out of the study prior to phone contact from staff: 15.2% (Access) and 10.6% (Tailoring). An additional 27.7% (Access) and 29.2% (Tailoring) of potential participants could not be reached due either to incorrect address information (in which case attempting phone contact would have involved “cold calling” because study packets had not been delivered) or unsuccessful phone contact despite multiple attempts. Among those potential participants who could be reached by phone, 27.9% (Access) and 37.8% (Tailoring) agreed to participate.
Comparisons of the participant samples
Table 1 shows key demographics for all potential Access study participants (n = 3,461), for those who were mailed opt-out packets (n = 585), for those who received opt-out packets (i.e., packets were not returned to sender; n = 557), and for the final sample of Veterans who participated after undergoing the opt-out recruitment process (n = 72). Table 2 presents corresponding information for the Tailoring study (n’s of 70,119; 3,703; 3,533 and 762, respectively).
While Veterans who participated in the Access study differed from the overall population of eligible Veterans on many variables, this was expected because we had intentionally oversampled specific groups (e.g. female Veterans, racial/ethnic minority Veterans). Data in Table 1 indicate that the final sample of participants from the Access study did not differ significantly from those who were mailed opt-out packets on age, gender, race, Hispanic ethnicity, or residence (rural/urban). They did differ significantly on use of mental health care: participants were more likely to have used mental health services than the pool of Veterans receiving opt-out packets (64 vs. 47%, p = .01). The only statistically significant differences in any of the comparisons for the Tailoring study (Table 2) reflect the purposeful oversampling of women.
Resource demands — packets sent per participant recruited
As shown in Figs. 1 and 2, only a minority of potential participants who were sent opt-out invitation packets ultimately participated in the studies: 12.3% of potential Access participants (72/585) and 20.6% of potential Tailoring participants (762/3,703). A small part of this shortfall is due to packets that were sent out but never fully followed up, because recruitment efforts ceased once enrollment goals were achieved (76 packets [13%] for Access; 214 packets [5.8%] for Tailoring). For Access, the number of opt-out packets sent per participant recruited ranged from 6.7 to 9.2 across its three study VISNs (chi-squared = 1.47, df = 2, p = .48). For Tailoring, it ranged from 4.3 to 5.7 packets per participant across four study VISNs (chi-squared = 6.11, df = 3, p = .11).
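The packets-per-participant arithmetic above can be sketched as follows. The per-VISN packet counts (230, 140, 215) and the total of 72 Access participants come from the text; the per-VISN participant counts are hypothetical values chosen only to illustrate the calculation (they reproduce the reported 6.7 and 9.2 endpoints).

```python
# Opt-out packets mailed per Access VISN (from the text)
packets = {"VISN 1": 230, "VISN 16": 140, "VISN 21": 215}
# Hypothetical split of the 72 participants across VISNs (illustrative only)
participants = {"VISN 1": 25, "VISN 16": 21, "VISN 21": 26}

for visn, sent in packets.items():
    print(f"{visn}: {sent / participants[visn]:.1f} packets mailed per participant")

# Overall recruitment rate: 72 participants / 585 packets mailed
overall_rate = sum(participants.values()) / sum(packets.values())
print(f"Overall recruitment rate: {overall_rate:.1%}")  # 12.3%
```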
The opt-out packets for Access weighed 1.7 oz, while those for Tailoring weighed 2.3 oz. Postage costs per opt-out packet for Access were approximately $0.90 U.S. (one stamp for the mailing itself, and one additional stamp inside on the self-addressed opt-out card so that respondents could opt out free of charge). Those for the heavier Tailoring packet were approximately $1.35 U.S. (two stamps for the packet itself, and one for the enclosed self-addressed opt-out card).
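Combining these per-packet postage costs with the packet counts reported earlier gives a rough total postage budget for each study. This is only an implied estimate (stationery, printing, and staff time are excluded, and the per-packet costs above are approximate).

```python
# Implied total postage, from reported packet counts and per-packet costs.
# These figures are approximations, not budget numbers from the studies.
mailings = {
    "Access":    {"packets": 585,  "postage_per_packet": 0.90},
    "Tailoring": {"packets": 3703, "postage_per_packet": 1.35},
}

for name, m in mailings.items():
    total = m["packets"] * m["postage_per_packet"]
    print(f"{name}: ~${total:,.2f} in postage")
```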
Resource demands — calls required to recruit one participant
On average, 2.9 ± 1.9 (SD) phone calls were needed to arrive at a final status (i.e., interview scheduled, Veteran declined to participate, or study staff abandoned unsuccessful outreach efforts) for Veterans we attempted to contact for the Access study. Veterans who participated in the study averaged more phone calls from study staff (4.5 ± 1.8), due primarily to the additional calls needed to schedule the study session itself after initial contact was made. For Tailoring, on average, 6.1 ± 5.1 calls were needed per Veteran to arrive at a final status; an average of 4.7 ± 3.7 calls were needed for the subset of Veterans who participated in the study. The larger number of calls required overall in Tailoring reflects the more intensive calling procedures followed in that study (i.e., up to nine calls to each of the Veteran’s available telephone numbers before efforts to contact the potential participant were suspended).
Study staff estimated that calls to potential participants resulting in no answer or a voicemail took, on average, 1 minute each. Calls that involved some discussion with potential participants, but did not result in consent, took about 5 minutes on average across both studies. Calls that involved consent to participate in the study took about 15 minutes for Access, and about 7 minutes for Tailoring. The extra time needed for consent calls for Access compared to Tailoring was due, in part, to the need to coordinate in-person participation for Access (including describing the physical address and check-in procedures for the clinic). Based on these estimates, it took about 40 h of phone time to recruit 72 Access participants, and about 500 h to recruit 762 Tailoring participants. These estimates include only time on the telephone itself, but not other tasks associated with calling (e.g. looking up phone numbers or documenting phone contact).
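A back-of-envelope check of phone time per recruited participant can be made from the aggregate hours just reported (the underlying mix of no-answer, discussion, and consent calls is not broken out, so we work only from the totals).

```python
# Aggregate phone-time figures reported in the text
studies = {
    "Access":    {"phone_hours": 40,  "participants": 72},
    "Tailoring": {"phone_hours": 500, "participants": 762},
}

for name, s in studies.items():
    minutes = s["phone_hours"] * 60 / s["participants"]
    print(f"{name}: ~{minutes:.0f} min of phone time per participant recruited")
```

This works out to roughly 33 minutes (Access) and 39 minutes (Tailoring) of telephone time per participant recruited.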
Research assistant call logs
Informal notes taken by research assistants during the recruitment process of the Access study indicated two broad trends related to study interest and participation. First, these notes indicated that many participants endorsed either very good or very bad experiences using VA treatment. Second, several Veterans noted that the potential to improve healthcare delivery to other Veterans was a strong motivator for them to participate in the study.
In this paper, we have described successful recruitment efforts using opt-out methods, with sampling frames drawn from administrative databases, in two studies of access to mental health care among Veterans.
Both the Access and Tailoring studies achieved their target sample sizes. Studies with a prominent qualitative component, like Access, continue enrollment until saturation is achieved, i.e., until additional interviews add no new information and no new topics emerge. Thus, when saturation had been achieved with 80 participants (72 of whom had been recruited via opt-out procedures), Access suspended recruitment. Tailoring interviewers enrolled 762 Veterans, exceeding the target of 750. Enrollment (as a percentage of opt-out letters sent) differed by study, with about 12% of opt-out letters for Access leading to research participation and nearly double that rate (21%) for Tailoring. This difference in recruitment rates was likely driven, at least in part, by differences in the nature of participation and sampling in the two studies. Most participants in the Access study needed to present to a VA clinic for a face-to-face research session, while participation in Tailoring took place entirely by telephone; the former places a substantially greater burden on potential participants. Another likely explanation for this finding is that Tailoring interviewers were required to make more calls to each available telephone number before suspending efforts than was the case for Access interviewers.
Neither Access nor Tailoring included an opt-in comparison arm, as both studies were funded to study Veterans’ experiences with mental health care rather than recruitment methods per se. Unfortunately, opt-in studies of military Veterans to which to compare our results are scarce, and the few we could find focused on biobank research or involved enriched samples that had already participated in previous studies. Opt-out response rates for the Access and Tailoring studies can also be compared to published rates from national surveys that recruited via random digit dialing. Past surveys have addressed sensitive topics such as sexual violence, alcohol use and gambling, racial discrimination and workplace harassment by gender, pharmacogenetic test result preferences, and care coordination for the chronically ill. Response rates for these studies varied dramatically, from 27.6% to 65.4%. Variation in response rates may reflect differences in the sensitivity of the topic covered by the survey, the length of the telephone survey commitment, and the intensity with which the protocol allowed recruiters to pursue responses. For example, one study on workplace harassment allowed twenty contact attempts, plus two calls from interviewers for refusal conversion in the case of a refusal. Regardless of response rates, for ethical reasons, the VA does not permit investigators to use cold-calling strategies like random digit dialing.
While the recent literature on non-Veteran samples consistently reports higher rates of recruitment using opt-out rather than opt-in methods [2, 11–18, 32–36], actual recruitment rates achieved with opt-out methods also varied widely, from lows of about 15% [32, 33] to highs above 70% [35, 36]. Some of the variation undoubtedly arises from differences in what participants are being asked to do. For example, the highest recruitment rate (83%) was reported by Vellinga and colleagues, who simply asked patients in Irish general practices for permission to review their medical records as part of a study of urinary tract infections. In this case, completing the study required no effort from participants other than providing consent. Studies focused on prevention or treatment of specific clinical conditions, such as angina or colorectal cancer, from which participants might reasonably expect clinical benefit, also reported recruitment rates at the higher end of the range (50 and 67%, respectively). On the other hand, observational studies with a non-clinical focus, such as quality of end-of-life care or road safety research, reported lower recruitment rates (40 and 17%, respectively).
Given the non-clinical focus of the Access and Tailoring studies, it is not surprising that our recruitment rates of 12 and 21% were at the lower end of rates reported for opt-out recruitment. Further, both studies required 1.5–2 h of active participation in-person or by phone, putting a heavier demand on participants than was the case for many of the studies found in the literature.
Comparisons of the participant samples
Both Access and Tailoring were generally successful in recruiting the samples they had targeted. The Access study purposefully oversampled women and racial/ethnic minorities, and endeavored to recruit 50% rural residents. The oversampling of female Veterans (who are younger, on average, than male Veterans) contributed to a disproportionately younger sample compared to the overall population of potentially eligible Veterans. The participant sample for Access also had a significantly higher rate of past-year mental health service utilization than did the population of Veterans who were mailed opt-out packets (64 vs. 47%). This pattern was seen in Tailoring as well, although the difference was smaller and not statistically significant. It is plausible that several factors associated with non-participation in mental health treatment might also be associated with non-participation in mental health research, including physical disability, transportation barriers, paranoia, or other mental health symptoms. Thus, findings from Access and Tailoring are generally consistent with previous reports that opt-out methods generate samples with characteristics that reasonably (but not completely) mirror those in their reference populations [11, 13, 36].
Among the strengths of this effort is the practical information it affords regarding the resource demands of opt-out recruitment. For Access, it took between 6.7 and 9.2 mailed opt-out packets to recruit one study participant across the study’s three VISNs. For Tailoring, it took between 4.3 and 5.7 packets to recruit one participant across four study VISNs. The differences among VISNs did not reach statistical significance in either study.
The average number of phone calls required to reach a final status for each potentially eligible Veteran ranged from about 3 to 6 in our two studies. For Access, arriving at a final status required more calls for eventual participants, due to the additional calls required to schedule in-person study participation. In contrast, for Tailoring, non-participants required more calls before final status was achieved; this likely reflects the larger number of calls that were allowed under the Tailoring protocol. Across the two studies, it took about 30–40 min of phone time to successfully recruit one participant. Taken together, these findings emphasize the critical importance of seemingly small design decisions (such as the maximum number of calls allowed to potential participants before outreach efforts cease) in determining how much effort is spent on different recruitment activities. It is our hope that other researchers will find the information on recruitment design decisions for Access and Tailoring (and the corresponding resource demands and recruitment outcomes) useful in informing the design of their future studies.
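The resource arithmetic above can be turned into a simple back-of-envelope planning calculation. The sketch below is illustrative only: the per-recruit figures are the upper-end Access values reported in this paper (9.2 mailed packets and roughly 40 min of phone time per enrolled participant), while the function name and the target sample size of 100 are hypothetical.

```python
def recruitment_effort(target_n, packets_per_recruit, minutes_per_recruit):
    """Estimate total mailings and phone time needed to enroll target_n participants.

    Both per-recruit rates are assumed constants taken from a prior study's
    reported effort; real yields will vary by site and protocol.
    """
    packets = target_n * packets_per_recruit        # opt-out packets to mail
    phone_minutes = target_n * minutes_per_recruit  # total recruiter phone time
    return {"packets": round(packets),
            "phone_hours": round(phone_minutes / 60, 1)}

# Example: planning for 100 participants at the upper-end Access effort levels.
print(recruitment_effort(100, packets_per_recruit=9.2, minutes_per_recruit=40))
# → {'packets': 920, 'phone_hours': 66.7}
```

A calculation of this kind makes the budgeting consequences of design decisions (e.g., a higher call cap raising minutes per recruit) concrete at the proposal stage.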
Possible reasons for participation
While neither Access nor Tailoring explicitly queried Veterans regarding their reasons for participating, informal comments from Access participants suggested two contributing factors. First, several Veterans said that they were excited to participate because they had either had very good or very bad experiences accessing VA mental health care, and this study was an opportunity to have their experience be heard. This is consistent with previous literature that has found personal salience to be a key factor contributing to study participation [24, 37]. Second, other Veterans noted that participating was a way for them to “give back” to VA and possibly improve the mental health services that their fellow Veterans might need. This strong orientation toward service and community participation has also been the focus of previous research on study participation [24, 38, 39]. Such altruism may be especially strong in a Veteran population with its sense of camaraderie, but may not generalize to other populations.
Several limitations should be taken into account when considering study findings, especially for other researchers who are considering using similar methods in their own studies. First, there are several challenges to the generalizability of findings to other populations and settings. Both studies recruited Veterans already in touch with the VA health care system, and both depended on administrative data to generate a sampling frame. Response rates could be lower, and resource demands higher, for studies attempting to recruit Veterans eligible for but not using VA care. It would be reasonable to expect that such Veterans would be even less likely to participate in a study on access to mental health services than those who have received at least some VA health care. In addition, our results may be most applicable to other large health systems with administrative databases similar to those in the VA (e.g. Kaiser Permanente). Second, neither Access nor Tailoring included formal evaluations of why Veterans chose to participate or not participate in the study—although we do report brief findings from informal call notes. Third, our studies were not designed to compare recruitment rates or resource demands across recruitment strategies. Thus, while our results will be useful to others in designing, budgeting, and executing studies employing opt-out recruitment strategies, they do not shed light on the relative utility of opt-out versus other strategies.
Studies examining access to care must focus carefully on recruitment and design issues, lest researchers unwittingly fail to sample those people who never or rarely access care. The two studies described in this paper used opt-out strategies to achieve ultimate recruitment rates of 12 and 21% of those sent an opt-out packet (for in-person and telephone-based participation, respectively). While the final samples of participants were broadly similar to those who were mailed opt-out packets, in the Access study, opt-out alone was not sufficient to eliminate non-response bias reflected in higher rates of mental health service use among study participants. The sampling frame generated for Tailoring, which took history of service use into account, was able to reduce that problem substantially. This suggests that future studies of mental health access may want to focus outreach efforts specifically toward potential participants who have not used the services in question and, when sampling frames can be constructed, use sampling strategies that increase the likelihood of recruiting such participants.
Our opt-out recruitment efforts required a relatively high number of calls per potential participant. Had we used opt-in methods instead, it is plausible that recruitment would have required fewer calls, but at the expense of sending many more recruitment packages to potential participants. Researchers will need to balance these considerations when designing studies investigating access to care that rely on administrative databases for potential selection of participants.
Corporate Data Warehouse
Health Insurance Portability and Accountability Act
Institutional review board
Posttraumatic stress disorder
Department of Veterans Affairs
Veterans Integrated Service Network
Johnson LC, Beaton R, Murphy S, Pike K. Sampling bias and other methodological threats to the validity of health survey research. Int J Stress Mgmt. 2000;7(4):247–67.
Treweek S, Pitkethly M, Cook J, Kjeldstrom M, Taskila T, Johansen M, Mitchell E. Strategies to improve recruitment to randomised controlled trials. Cochrane Database Syst Rev. 2011;(10):MR000013. doi:10.1002/14651858.MR000013.pub5.
Sadler GR, Lee HC, Lim RSH, Fullerton J. Recruitment of hard‐to‐reach population subgroups via adaptations of the snowball sampling strategy. Nurs Health Sci. 2010;12(3):369–74.
Atkinson R, Flint J. Accessing hidden and hard-to-reach populations: snowball research strategies. Soc Res Update. 2001;33(1):1–4.
Salganik MJ. Variance estimation, design effects, and sample size calculations for respondent-driven sampling. J Urban Health. 2006;83(1):98–112.
Heckathorn DD. Respondent-driven sampling: a new approach to the study of hidden populations. Soc Probl. 1997;44:174–99.
Heckathorn DD. Respondent-driven sampling II: deriving valid population estimates from chain-referral samples of hidden populations. Soc Probl. 2002;49(1):11–34.
Funkhouser E, Macaluso M, Wang X. Alternative strategies for selecting population controls: comparison of random digit dialing and targeted telephone calls. Ann Epidemiol. 2000;10(1):59–67.
Annas GJ. HIPAA regulations-a new era of medical-record privacy? N Engl J Med. 2003;348(15):1486–90.
Redsell SA, Cheater FM. The data protection act (1998): implications for health researchers. J Adv Nurs. 2001;35(4):508–13.
Junghans C, Feder G, Hemingway H, Timmis A, Jones M. Recruiting patients to medical research: double blind randomised trial of “opt-in” versus “opt-out” strategies. BMJ. 2005;331(7522):940.
Hunt KJ, Shlomo N, Addington-Hall J. Participant recruitment in sensitive surveys: a comparative trial of ‘opt in’ versus ‘opt out’ approaches. BMC Med Res Methodol. 2013;13(1):3.
Krousel-Wood M, Muntner P, Jannu A, Hyre A, Breault J. Does waiver of written informed consent from the institutional review board affect response rate in a low-risk research study? J Investig Med. 2006;54(4):174–9.
Kaufman D, Bollinger J, Dvoskin R, Scott J. Preferences for opt-in and opt-out enrollment and consent models in biobank research: a national survey of Veterans Administration patients. Genet Med. 2012;14(9):787–94.
Schroy PC, Glick JT, Robinson P, Lydotes MA, Heeren TC, Prout M, Davidson P, Wong JB. A cost-effectiveness analysis of subject recruitment strategies in the HIPAA era: results from a colorectal cancer screening adherence trial. Clin Trials. 2009;6(6):597–609.
Bray I, Noble S, Boyd A, Brown L, Hayes P, Malcolm J, Robinson R, Williams R, Burston K, Macleod J. A randomised controlled trial comparing opt-in and opt-out home visits for tracing lost participants in a prospective birth cohort study. BMC Med Res Methodol. 2015;15(1):52.
Boland J, Currow DC, Wilcock A, Tieman J, Hussain JA, Pitsillides C, Abernethy AP, Johnson MJ. A systematic review of strategies used to increase recruitment of people with cancer or organ failure into clinical trials: implications for palliative care research. J Pain Symptom Manage. 2015;49(4):762–72.e5.
Parry O, Bancroft A, Gnich W, Amos A. Nobody home? issues of respondent recruitment in areas of deprivation. Critical Public Health. 2001;11(4):305–17.
Nattinger AB, Pezzin LE, Sparapani RA, Neuner JM, King TK, Laud PW. Heightened attention to medical privacy: challenges for unbiased sample recruitment and a possible solution. Am J Epidemiol. 2010;172(6):637–44.
Hewison J, Haines A. Confidentiality and consent in medical research: overcoming barriers to recruitment in health research. BMJ. 2006;333(7562):300.
Fischer EP, Kelly AP, Perry D, Cheney A, Drummond K, Han X, Hudson TJ, et al. Making first contact with hard-to-reach study populations. HSR&D Research Briefs. 2012. https://www.hsrd.research.va.gov/publications/research_briefs/default.cfm?Issue=04/01/2012&Copy=publications/research_briefs/ResBrfOnline_0312.cfm.
Kahneman D, Tversky A. Prospect theory: an analysis of decision under risk. Econometrica. 1979;47(2):263–91.
Tversky A, Kahneman D. Loss aversion in riskless choice: a reference-dependent model. Q J Econ. 1991;106(4):1039–61.
Galea S, Tracy M. Participation rates in epidemiologic studies. Ann Epidemiol. 2007;17(9):643–53.
Bernard HR, Ryan GW. Analyzing qualitative data: systematic approaches. Washington: SAGE Publications; 2009.
Fear NT, Van Staden L, Iversen A, Hall J, Wessely S. Fifty ways to trace your veteran: increasing response rates can be cheap and effective. Eur J Psychotraumatol. 2010;1:5516.
Breiding MJ. Prevalence and characteristics of sexual violence, stalking, and intimate partner violence victimization—National Intimate Partner and Sexual Violence Survey, United States, 2011. MMWR Surveill Summ. 2014;63(8):1.
Welte J, Barnes G, Wieczorek W, Tidwell M-C, Parker J. Alcohol and gambling pathology among US adults: prevalence, demographic patterns and comorbidity. J Stud Alcohol. 2001;62(5):706–12.
Shannon CA, Rospenda KM, Richman JA, Minich LM. Race, racial discrimination, and the risk of work-related illness, injury or assault: findings from a national study. J Occup Env Med. 2009;51(4):441.
Haga SB, Kawamoto K, Agans R, Ginsburg GS. Consideration of patient preferences and challenges in storage and access of pharmacogenetic test results. Genet Med. 2011;13(10):887.
Maeng DD, Martsolf GR, Scanlon DP, Christianson JB. Care coordination for the chronically ill: understanding the patient’s perspective. Health Serv Res. 2012;47(5):1960–79.
Elkington J, Stevenson M, Haworth N, Sharwood L. Using police crash databases for injury prevention research–a comparison of opt‐out and opt‐in approaches to study recruitment. Aust N Z J Public Health. 2014;38(3):286–9.
Sygna K, Johansen S, Ruland CM. Recruitment challenges in clinical research including cancer patients and caregivers. Trials. 2015;16(1):1.
Trevena L, Irwig L, Barratt A. Impact of privacy legislation on the number and characteristics of people who are recruited for research: a randomised controlled trial. J Med Ethics. 2006;32(8):473–7.
Goodman A, Gatward R. Who are we missing? area deprivation and survey participation. Eur J Epidemiol. 2008;23(6):379–87.
Vellinga A, Cormican M, Hanahoe B, Bennett K, Murphy AW. Opt-out as an acceptable method of obtaining consent in medical research: a short report. BMC Med Res Methodol. 2011;11(1):40.
Dunn KM, Jordan K, Lacey RJ, Shapley M, Jinks C. Patterns of consent in epidemiologic research: evidence from over 25,000 responders. Am J Epidemiol. 2004;159(11):1087–94.
O’Neil MJ. Estimating the nonresponse bias due to refusals in telephone surveys. Public Opin Q. 1979;43(2):218–32.
Newington L, Metcalfe A. Factors influencing recruitment to research: qualitative study of the experiences and perceptions of research teams. BMC Med Res Methodol. 2014;14(1):1.
The authors would like to acknowledge the contributions of Patricia Wright and Kara Zamora, both of whom contributed to data collection for this manuscript.
Funding for this study was provided by the VA Health Services Research and Development “Collaborative Research to Enhance and Advance Transformation and Excellence” initiative (grant number CREATE CRE #12-300). The funding agency did not play any role in the study design, data collection, analysis, or interpretation of results, nor did it play any role in the writing of this manuscript. The views and opinions expressed in this manuscript are from the authors and do not necessarily reflect the views of the US Department of Veterans Affairs or of the United States government.
Availability of data and materials
The datasets generated and analyzed for this manuscript are not publicly available because the larger study for which these analyses were undertaken is still ongoing. The VA Corporate Data Warehouse (CDW), from which the list of potentially eligible Veterans was drawn, is not publicly available because it contains protected health information on Veterans receiving treatment.
CJM took the lead role in conceptualizing and writing the manuscript, as well as interpreting the data; JFB and EPF assisted with study design and manuscript editing; DH conducted analyses; LB, JML, SE, CJK, RLS, and JMP provided critical intellectual content for the manuscript (including contributions to the interpretation of study data); and JMP took the lead role in designing the study. All authors have read and approved the final manuscript.
The authors declare that they have no competing interests.
Consent for publication
Ethics approval and consent to participate
The two studies described in this manuscript were approved by the VA Central IRB (Access study) and the Central Arkansas Veterans Healthcare System IRB (Tailoring study), respectively. The list of potentially eligible Veterans was originally drawn from the VA Corporate Data Warehouse (CDW), the national clinical administrative database that consolidates data from the VA electronic medical record. A Waiver of Health Insurance Portability and Accountability Act (HIPAA) Authorization was obtained to allow the research team to access CDW data needed to identify potentially eligible Veterans and obtain their contact information.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.