
Comparison of up-front cash cards and checks as incentives for participation in a clinician survey: a study within a trial



Abstract

Background

Evidence is needed regarding effective incentive strategies to increase clinician survey response rates. Cash cards are increasingly used as survey incentives; they are appealing because of their convenience and because in some cases their value can be reclaimed by investigators if not used. However, their effectiveness in clinician surveys is not known. In this study within the BRCA Founder OutReach (BFOR) study, a clinical trial of population-based BRCA1/2 mutation screening, we compared the use of up-front cash cards requiring email activation versus checks as clinician survey incentives.


Methods

Participants receiving BRCA1/2 testing in the BFOR study could elect to receive their results from their primary care provider (PCP, named by the patient) or from a geneticist associated with the study. To understand PCPs' knowledge, attitudes, experiences and willingness to disclose results, we mailed paper surveys to the first 501 PCPs in New York, Boston, Los Angeles and Philadelphia who were nominated by study participants to disclose the BRCA1/2 mutation results obtained through the study. We used alternating assignment stratified by city to assign the first 303 clinicians to receive a $50 up-front incentive as a cash card (N = 155) or check (N = 148). The cash card required PCPs to send an activation email before it could be used. We compared response rates by incentive type, adjusting for PCP characteristics and study site.


Results

In unadjusted analyses, PCPs who received checks were more likely to respond to the survey than those who received cash cards (54.1% versus 41.9%, p = 0.046); this remained true when we adjusted for provider characteristics (OR for checks 1.61, 95% CI 1.01, 2.59). No other clinician characteristics had a statistically significant association with response rates in adjusted analyses. When we included an interaction term for incentive type and city, the favorable impact of checks on response rates was evident only in Los Angeles and Philadelphia.


Conclusions

An up-front cash card incentive requiring email activation may be less effective in eliciting clinician responses than up-front checks. However, the benefit of checks for clinician response rates may depend on clinicians' geographic location.

Trial registration: ClinicalTrials.gov NCT03351803; registered November 24, 2017.



Background

Surveying health care providers is an important means of obtaining information about medical practices and clinician knowledge and attitudes. However, clinician survey response rates in the United States have decreased gradually over time [1,2,3]. A 2013 meta-analysis described an approximately 20% decline in response rates over the preceding two decades [2]. The decline in response rates is thought to reflect increasing demands on clinicians' time that limit participation in research activities [4]. Since low response rates can compromise study findings' internal and external validity [5] and increase research costs, strategies to maximize clinician survey response rates are sorely needed.

The timing, type, and amount of monetary incentives provided to survey recipients are known to influence response rates [6]. A randomized study demonstrated higher clinician survey response rates with $50 versus $20 check incentives [6]. Timing of the incentive also impacts the likelihood of response, with up-front unconditional cash incentives yielding superior response rates compared with conditional cash incentives paid only after providers respond to the survey [7, 8] or lottery-based incentives [9]. Although cash cards and gift cards are increasingly used in survey research, little is known about their impact on clinician survey response rates. Cash cards have several potential advantages over cash or checks: they are increasingly used in day-to-day life as people seek alternatives to cash or paper checks, and, in contrast to cash and similar to checks, some cash cards can be reclaimed by investigators if they are not used, although such cash cards require that unique cards or codes be assigned in advance to a specific survey recipient (i.e. registered) [10]. Because checks and registered cash cards can be tracked more easily than cash, they may be preferable to cash or non-registered cash cards for institutional accounting. Registered cash cards have the additional benefit of being logistically more feasible and efficient than checks, which must be generated individually for each clinician surveyed. Use of registered cash cards (hereafter called "cash cards") as incentives has yielded adequate response rates in some studies [11, 12]. However, the impact of cash card incentives compared with other types of financial incentives on clinician survey response rates is not known.

We conducted a study comparing up-front, unconditional cash card survey incentives to check survey incentives to assess their impact on primary care provider (PCP) survey response rates. The BRCA Founder OutReach (BFOR) study is a clinical trial being conducted in New York, Boston, Philadelphia and Los Angeles examining the implementation of a digital platform and no-cost BRCA1/2 founder mutation testing for individuals of Ashkenazi Jewish descent [13]. Study participants elect to receive BRCA1/2 results from their PCP or a study-affiliated specialist. We surveyed PCPs elected by their patients to disclose results to determine PCPs’ knowledge, attitudes and experience with BRCA1/2 testing and their willingness to disclose their patient’s results. In this substudy of survey incentives, we assigned PCPs to receive an up-front cash card or an up-front check incentive.



Methods

We surveyed the first 125 PCPs from each city who were elected by a BFOR participant to share his or her BRCA1/2 results. Using a combination of questions derived from other surveys [14,15,16] and questions developed specifically for the BFOR project (Supplement), the survey gathered general demographic and practice information, an assessment of BRCA1/2 mutation knowledge, PCPs' opinions on incorporating genetic testing into their existing practices, and willingness to disclose the results of their patients' testing obtained through the BFOR study. We mailed paper surveys, although we also provided PCPs the option to participate via the Internet. Each initial survey mailing included a personalized cover letter, an up-front, unconditional $50 incentive, a four-page survey designed to be completed in less than 10 min, and a pre-paid return envelope. Surveys were prelabeled with each PCP's assigned study identification number to allow study staff to identify which PCPs had already responded; no other identifiers were included in PCPs' responses. First and second reminders were sent via mail approximately three and six weeks, respectively, after the initial mailing. These reminders contained personalized letters, a second copy of the survey, and a pre-paid return envelope.

Incentive assignment and study sample

Whenever a patient requested that their PCP disclose their results, that PCP immediately became eligible for the survey study and was assigned by research staff to receive a $50 cash card or a $50 check incentive (Fig. 1). Assignment was performed as follows: as PCPs were nominated by their patients, they were assigned a study ID based on their region. Within each region, we used an alternating 1:1 allocation strategy to assign newly enrolled PCPs to receive a cash card or a check. The cash cards were reloadable debit cards that required activation by the study managers before use; instructions accompanying each card informed PCPs that, if they wished to activate the cash card, they had to email a study manager with their card number and request activation. In June 2018, in an effort to increase response rates, we additionally mailed third reminders to PCPs. The third reminder included a personalized letter, a third copy of the survey, a pre-paid return envelope, and a second incentive equivalent to the first. We planned to send third reminders to all PCPs who had not yet responded to the survey, but third reminders were subsequently discontinued due to study staffing limitations. The third reminders were sent to the first 42 nonresponding PCPs who had been assigned to receive cash cards (providing them with a second cash card); third reminders were also planned for nonresponding PCPs who had been assigned to receive checks. However, issuing checks took more time than issuing cash cards, and the third reminder initiative was terminated before any third-reminder letters with checks were mailed.
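As an illustration, the per-region alternating 1:1 allocation described above can be sketched in a few lines of Python. This is a hypothetical sketch of the scheme, not the study's actual tooling; the function and region names are ours.

```python
from itertools import cycle

def make_allocator(regions, arms=("cash card", "check")):
    """Return a function that assigns each newly nominated PCP the next
    arm in a 1:1 alternating sequence, maintained separately per region."""
    cycles = {region: cycle(arms) for region in regions}
    def assign(region):
        return next(cycles[region])
    return assign

assign = make_allocator(["Boston", "New York", "Philadelphia", "Los Angeles"])
# Within a region the arms simply alternate as PCPs are enrolled:
# Boston -> cash card, Boston -> check, Boston -> cash card, ...
```

Because the sequence is deterministic within each stratum, this scheme guarantees near-exact 1:1 balance per region but, unlike true randomization, is predictable to anyone who knows the enrollment order.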
Our target sample size for the PCP survey study was 500 (125 per region), based on a sample size calculation derived from anticipated differences in factors influencing PCPs' willingness to disclose their patients' BRCA1/2 results, and on a response rate of 50% derived from rates in other provider surveys [6, 17, 18]. After June 2018, because overall response rates were below 50% and early findings indicated that checks yielded higher response rates, the incentive comparison was stopped and all further survey mailings to newly enrolled PCPs included checks. This analysis includes the 303 PCPs enrolled in the alternating-assignment portion of the study between December 2017 and June 2018. Among these, 155 were assigned to receive their incentive in the form of a cash card and 148 were assigned to receive a check.

Fig. 1

Enrollment of PCPs in the study and incentive assignment

Outcome and PCP characteristics

Our pre-specified primary outcome was response to the survey; staff determining this outcome were not blinded to the exposure. Our primary independent variable was receipt of a cash card versus check. Covariates were clinician city, specialty and sex; these were the demographic and practice data that were available for both responding and non-responding PCPs.


We used univariate chi-square tests to examine response rates according to whether PCPs received checks or cash cards and according to PCP sex, city and specialty. Because a higher proportion of PCPs assigned to checks were female, we used multivariable logistic regression to adjust analyses for demographic characteristics. We also conducted stratified analyses of the impact of checks versus cash cards according to demographic characteristics, and we noted that the impact of checks versus cash cards appeared to vary by city. To explore this further, we incorporated interaction terms into our model. Finally, we conducted sensitivity analyses in which we reclassified as non-responding the 4 PCPs who had responded to the survey only after receiving a third survey reminder letter, since only a subset of PCPs received this third reminder. Statistical analyses were performed using SAS version 9.3 (SAS Institute, Cary, North Carolina).
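The primary unadjusted comparison reduces to a continuity-corrected chi-square test on the 2×2 table of incentive type by response. The sketch below reproduces it in plain Python; note that the cell counts are our reconstruction from the reported percentages (54.1% of 148 check recipients ≈ 80 responders; 41.9% of 155 cash card recipients ≈ 65 responders), not figures taken directly from the study dataset.

```python
import math

def yates_chi2_2x2(a, b, c, d):
    """Continuity-corrected (Yates) chi-square test for the 2x2 table
    [[a, b], [c, d]]; returns (statistic, two-sided p-value)."""
    n = a + b + c + d
    row1, row2, col1, col2 = a + b, c + d, a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        exp = row * col / n  # expected count under independence
        stat += (abs(obs - exp) - 0.5) ** 2 / exp
    # For 1 degree of freedom, P(chi2 > x) = erfc(sqrt(x / 2))
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Reconstructed counts (assumption): checks 80 responded / 68 did not;
# cash cards 65 responded / 90 did not.
stat, p = yates_chi2_2x2(80, 68, 65, 90)
odds_ratio = (80 * 90) / (68 * 65)  # unadjusted OR for checks, ~1.63
print(round(p, 3))  # ~0.046, matching the reported unadjusted p-value
```

The unadjusted odds ratio of about 1.63 is close to the adjusted OR of 1.61 reported in the paper, consistent with the modest confounding by PCP characteristics.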

This study adhered to the TREND checklist for reporting of nonrandomized studies. The BFOR study protocol is registered on ClinicalTrials.gov (NCT03351803). The protocol for the survey incentives study described here was developed a priori (after the overall BFOR study protocol but before participant enrollment) and submitted to the institutional review board prior to enrollment.


Results

Characteristics of surveyed PCPs are included in Table 1. Characteristics of PCPs who received checks versus cash cards did not differ by city, provider type, or specialty (p = 0.33, p = 0.23, and p = 0.09, respectively); however, a higher proportion of PCPs receiving checks were female (61.5% of check recipients versus 51.0% of cash card recipients; p = 0.03) (Table 1).

Table 1 Characteristics of primary care providers who received checks versus cash card incentives

Overall, 145 PCPs (47.9%) responded to the survey. Factors associated with survey response in unadjusted and adjusted analyses are shown in Table 2. In unadjusted analyses, survey response rates were higher among check recipients than cash card recipients (54.1% versus 41.9%, p = 0.046) and among female providers compared with males (53.5% versus 40.6%, p = 0.04). Advanced practice providers were more likely to respond than physicians (80.0% versus 46.8%, p = 0.04), although only 10 advanced practice providers were surveyed. Response rates varied somewhat by city, with the highest rates among providers from Boston (55.2%) and Philadelphia (48.1%), but differences across cities were not significant. Of the 42 PCPs who received third reminder letters enclosing a second cash card, only 4 (9.5%) responded to the survey. After adjustment for city, sex, provider type and specialty, PCPs receiving checks remained more likely to respond to surveys than those receiving cash cards (OR 1.61, 95% CI 1.01–2.59, p = 0.047; Table 2). No other provider characteristics were significantly associated with likelihood of response in the adjusted analyses. In the sensitivity analysis excluding the survey responses of the 4 PCPs who responded only after receiving third reminders, the multivariable association between receipt of a check and odds of survey response was stronger than in the main analysis (OR 1.82, 95% CI 1.13–2.92, p = 0.01), and advanced practice providers had greater adjusted odds of responding, though the confidence interval for the odds ratio was wide (OR 5.69, 95% CI 1.11–29.11, p = 0.04).

Table 2 Unadjusted and adjusted response rates based on provider characteristics

Table 3 shows response rates among PCPs receiving checks versus cash cards stratified by city, together with the results of the multivariable logistic regression model that included interaction terms for city and incentive type. The impact of incentive type on response rate varied notably by city (p = 0.02, Wald chi-square test). In Boston and New York, the relationship between incentive type and survey response was not statistically significant. In Los Angeles and Philadelphia, checks were associated with a significantly higher likelihood of survey response: in Los Angeles, 63.6% of check recipients responded versus 25.7% of cash card recipients (OR 4.73, 95% CI 1.64–13.50), and in Philadelphia, 62.5% of check recipients responded versus 35.7% of cash card recipients (OR 3.61, 95% CI 1.11–11.72). A sensitivity analysis removing the survey responses of the 4 PCPs who responded to the third reminders yielded similar results.

Table 3 Impact of providers’ city on the association between incentive type and likelihood of survey response


Discussion

In an era of declining clinician survey response rates, understanding the most successful and cost-effective strategies to optimize response rates is important for maximizing studies' validity and feasibility. Evidence about the effectiveness of cash cards for clinician surveys is very limited. In this study, the overall PCP response rate was less than our goal of at least 50%, underscoring the persistent challenge of eliciting provider responses. However, among PCPs receiving checks, the overall response rate was 54.1%, compared with 41.9% among those receiving cash cards. The benefit of check incentives persisted when we adjusted for provider characteristics, suggesting that an up-front cash card incentive requiring email activation may be less effective in eliciting provider responses than an up-front check. However, the benefit of checks appears to be regionally specific: checks were associated with increased response rates in Los Angeles and Philadelphia, but not in Boston and New York. Another notable finding from our study was that advanced practice providers such as nurse practitioners were more likely to respond to the survey, though the number of advanced practice providers in our study was small and this finding needs to be confirmed by additional research.

There are several potential explanations for our finding of a benefit of check incentives. PCPs may be more familiar with checks and feel that they are more straightforward to deposit and thus use. The need to email a study manager to activate the cash card may have also limited enthusiasm for this type of incentive. The regional differences may suggest that behaviors regarding cash cards, and familiarity with them, vary geographically. However, it is also possible that factors not related to incentive type contributed to our findings about differences between cities. For example, overall survey response rates were highest in Boston, likely at least partly because the principal investigators for the PCP survey component of BFOR (who signed the survey cover letter) were Boston-based. This difference may have attenuated some of the differential impact of checks versus cash cards in Boston, although it seems unlikely to fully explain the regional differences seen. Our findings suggest that investigators conducting local or regional surveys should consider local context when they choose survey incentives. For national studies, checks (or cash) may be a safer option to maximize survey responses.

Strengths of this study include its prospective enrollment of PCPs from four different cities. It also has some limitations. First, we used an alternating assignment strategy rather than randomization to assign providers to check versus cash card as providers were nominated by their patients to disclose results. We are not aware of any ways that this would have biased our findings in this unblinded study, however, and we used multivariable logistic regression to balance known confounders. Second, we had relatively limited covariates for non-responding PCPs, which limited the comparisons of responding and non-responding providers. Third, 42 PCPs, all in the cash card arm, received an intensified survey reminder approach, with a third reminder mailing enclosing a second incentive; our sensitivity analyses demonstrated that this biased our findings somewhat towards the null. Fourth, we did not enroll as many PCPs as planned into this study of incentives because of our decision to use checks only for all PCPs after the 303rd PCP enrolled, in order to maximize the response rate and the corresponding robustness of our findings from the survey (which we will report separately). Lastly, our findings may not be generalizable to PCPs practicing outside major cities, or to other cities in the U.S. Nonetheless, we believe that these findings provide valuable information for researchers who are considering what types of incentives to use for provider surveys.


Conclusions

Monetary incentives in the form of up-front checks may increase clinician survey response rates more than up-front cash card incentives. However, the differential impact of these incentives appears to be region-specific, and further research is needed to explore these differences. In addition, further research on cash card incentives for clinician surveys should explore whether removing the email-activation requirement increases the effectiveness of the incentive, despite the possibility of added cost. Clinician surveys remain critical for understanding health care service delivery, and continued investigation is needed to identify the most effective and cost-effective strategies to optimize clinician survey response rates.

Availability of data and materials

The datasets used and analyzed during the current study are available from the corresponding author on reasonable request.



Abbreviations

BRCA: BReast CAncer gene

PCP: Primary Care Provider

BFOR: BRCA Founder OutReach


References

1. Brtnikova M, Crane LA, Allison MA, Hurley LP, Beaty BL, Kempe A. A method for achieving high response rates in national surveys of U.S. primary care physicians. PLoS One. 2018;13(8):e0202755.

2. Cho YI, Johnson TP, Vangeest JB. Enhancing surveys of health care professionals: a meta-analysis of techniques to improve response. Eval Health Prof. 2013;36(3):382–407.

3. McLeod CC, Klabunde CN, Willis GB, Stark D. Health care provider surveys in the United States, 2000–2010: a review. Eval Health Prof. 2013;36(1):106–26.

4. Cook JV, Dickinson HO, Eccles MP. Response rates in postal surveys of healthcare professionals between 1996 and 2005: an observational study. BMC Health Serv Res. 2009;9:160.

5. Pit SW, Vo T, Pyakurel S. The effectiveness of recruitment strategies on general practitioner's survey response rates - a systematic review. BMC Med Res Methodol. 2014;14:76.

6. Keating NL, Zaslavsky AM, Goldstein J, West DW, Ayanian JZ. Randomized trial of $20 versus $50 incentives to increase physician survey response rates. Med Care. 2008;46(8):878–81.

7. Leung GM, Johnston JM, Saing H, Tin KY, Wong IO, Ho LM. Prepayment was superior to postpayment cash incentives in a randomized postal survey among physicians. J Clin Epidemiol. 2004;57(8):777–84.

8. Wiant K, Geisen E, Creel D, et al. Risks and rewards of using prepaid vs. postpaid incentive checks on a survey of physicians. BMC Med Res Methodol. 2018;18(1):104.

9. Halpern SD, Kohn R, Dornbrand-Lo A, Metkus T, Asch DA, Volpp KG. Lottery-based versus fixed incentives to increase clinicians' response to surveys. Health Serv Res. 2011;46(5):1663–74.

10. Chen JS, Sprague BL, Klabunde CN, et al. Take the money and run? Redemption of a gift card incentive in a clinician survey. BMC Med Res Methodol. 2016;16:25.

11. Van Otterloo J, Richards JL, Seib K, Weiss P, Omer SB. Gift card incentives and non-response bias in a survey of vaccine providers: the role of geographic and demographic factors. PLoS One. 2011;6(11):e28108.

12. Delnevo CD, Abatemarco D, Steinberg MB. Physician response rates to a mail survey by specialty and timing of incentive. Am J Prev Med. 2004;26(3):234–6.

13. Morgan K, Gabriel C, Symecko H, et al. Early results from the BRCA Founder OutReach (BFOR) study: population screening using a medical model. American Society of Clinical Oncology Annual Meeting, Chicago; May 31–June 4, 2019.

14. Wideroff L, Freedman AN, Olson L, et al. Physician use of genetic testing for cancer susceptibility: results of a national survey. Cancer Epidemiol Biomark Prev. 2003;12:295–303.

15. Wideroff L, Vadaparampil ST, Greene MH, Taplin S, Olson L, Freedman AN. Hereditary breast/ovarian and colorectal cancer genetics knowledge in a national sample of US physicians. J Med Genet. 2005;42(10):749–55.

16. Freedman AN, Wideroff L, Olson L, et al. US physicians' attitudes toward genetic testing for cancer susceptibility. Am J Med Genet A. 2003;120A(1):63–71.

17. Keating NL, Stoeckert KA, Regan MM, DiGianni L, Garber JE. Physicians' experiences with BRCA1/2 testing in community settings. J Clin Oncol. 2008;26(35):5789–96.

18. Nair N, Bellcross C, Haddad L, et al. Georgia primary care providers' knowledge of hereditary breast and ovarian cancer syndrome. J Cancer Educ. 2017;32(1):119–24.



Acknowledgements

Not applicable.

Funding


This study was funded by an American Cancer Society Cancer Control Career Development Award for Primary Care Physicians (LEP, grant number 130741-CCCDA-17-072-01-CCCDA); Breast Cancer Research Foundation BCRF-19-039 (SMD, HS, KS); Basser Center for BRCA (SMD, HS, KS); American Cancer Society Mentored Research Scholar Grants in Applied and Clinical Research (JGH, grant number MRSG-16-020-01-CPPB); National Cancer Institute P30 CA008748 (JGH, JL2, KM, KO), Sharon Levine Corzine Foundation (KO, KM, JL2). None of the funding bodies played a role in the design of the study or collection, analysis, or interpretation of data, or in writing the manuscript.

Author information




LEP and NLK conceived of and designed the provider incentive study. JL1, JL2, KM, AB, CJ, SMD, NT, BK, SCR, CG, YSL, JG, JGH, KO, DK, HS, KS contributed to the acquisition of the data. LEP, YSL and NLK analyzed and interpreted the data. LEP, YSL and NLK drafted the manuscript. BK, JG, KO, SMD, KM, JGH, HS, DK, JL1 substantively revised it. All authors have approved the submitted version and have agreed both to be personally accountable for the author’s own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature.

Corresponding author

Correspondence to Lydia E. Pace.

Ethics declarations

Ethics approval and consent to participate

Ethics approval was obtained from Advarra, the independent Institutional Review Board utilized for all study sites, with additional oversight by study institutions. Study participants undergoing BRCA1/2 testing provided written informed consent. Consent was implied for providers responding to the surveys.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Pace, L.E., Lee, Y.S., Tung, N. et al. Comparison of up-front cash cards and checks as incentives for participation in a clinician survey: a study within a trial. BMC Med Res Methodol 20, 210 (2020).



Keywords

  • Clinician survey
  • Survey incentives
  • Response rate
  • Cash cards