Evaluating strategies to recruit health researchers to participate in online survey research

Abstract

Background

Engaging researchers as research subjects is key to informing the development of effective and relevant research practices. It is important to understand how best to engage researchers as research subjects.

Methods

A 2⁴ factorial experiment, performed as part of a Multiphase Optimization Strategy, evaluated the effects of four recruitment strategy components on participants' opening of an emailed survey link and on survey completion. Participants were members of three US-based national health research consortia. A stratified simple random sample was used to assign potential survey participants to one of 16 recruitment scenarios. Recruitment strategy components were intended to address both intrinsic and extrinsic sources of motivation, including: a $50 gift, a $1,000 raffle, altruistic messaging, and egoistic messaging. Multivariable generalized linear regression analyses, adjusting for consortium, estimated component effects on outcomes. Potential interactions among components were tested. Results are reported as adjusted odds ratios (aOR) with 95% confidence intervals (95% CI).

Results

Surveys were collected from June to December 2023. A total of 418 participants were identified from the consortia, yielding a final analytical sample of 400 eligible participants. Of the final sample, 82% (341) opened the survey link and 35% (147) completed the survey. Altruistic messaging increased the odds of opening the survey (aOR 2.02, 95% CI: 1.35–2.69, p = 0.033), while egoistic messaging reduced the odds of opening the survey (aOR 0.56, 95% CI: 0.38–0.75, p = 0.08). The receipt of egoistic messaging increased the odds of completing the survey once opened (aOR 1.81, 95% CI: 1.39–2.23, p = 0.010). There was a significant negative interaction between the altruistic appeal and egoistic messaging strategies for the survey completion outcome. Monetary incentives did not have a significant impact on survey completion.

Conclusion

Intrinsic motivation is likely to be a greater driver of health researcher participation in survey research than extrinsic motivation. Altruistic and egoistic messaging may differentially impact initial interest and survey completion and when combined may lead to improved rates of recruitment, but not survey completion. Further research is needed to determine how to best optimize message content and whether the effects observed are modified by survey burden.

Background

Engaging researchers as research subjects is key to advancing the development of effective and relevant research practices. To promote robust science, meta-research (“research-on-research”) uses an interdisciplinary approach to examine research practices with the same scientific rigor given to other areas of scientific inquiry [1]. Within many meta-research themes, data on—and the perspectives of—researchers themselves have an important role to play for both identifying potential areas for improvement, as well as developing solutions to the problems identified [2]. Online survey research is a popular and valuable means of collecting research data [3]. However, successful recruitment of researchers to be participants in research can be a challenge, with participation rates often lower than among the general population [4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19].

Researchers often must balance many competing priorities with an overarching goal of research impact [20], which can produce significant mental workload and contribute to burnout [21,22,23,24,25,26,27,28]. These activities may include teaching, funding acquisition, administrative duties, scholarly activities, and—for scholar clinicians—patient care [21,22,23,24,25,26,27,28]. Thus, the high workload and a pro-social (i.e. an intent to benefit others) motivation to ‘make a difference’ may indicate that researchers have different incentives for, and barriers to, participating in research than the general population. Although survey recruitment has been the focus of much investigation in general [29], and among health care providers [30,31,32,33,34,35,36], there has been limited research on how best to recruit researchers—and health researchers more specifically—to engage in research as participants themselves. To increase representation and continue to advance meta-research, it is important to understand how best to engage researchers as research subjects.

To address this gap, the objective of this study was to identify the recruitment strategy components that impact health researcher participation rates in online survey research. In general populations, financial incentives have been found to enhance recruitment beyond that of altruistic appeals [37, 38]; however, educated professionals have been shown to be less influenced by small monetary incentives [39], and more pro-social groups are potentially more incentivized by altruistic appeals than the general population [39, 40]. Due to their high workload, pro-social nature, and anecdotal reports that researchers will often leave even high-value monetary rewards ‘on the table,’ we hypothesized that appealing to a researcher’s altruism to help a fellow researcher, and to their personal interest in the survey topic, would have a greater impact on survey participation than monetary incentives.

To test this hypothesis, we performed a randomized factorial experiment as part of the recruitment for the first wave of a longitudinal survey study of health researchers exploring collaborative research behaviors. The goal of using this factorial study approach was to delineate the impact of different engagement strategies on survey response to inform best messaging practice in subsequent follow-up surveys.

Methods

We used a 2⁴ factorial experiment as part of a Multiphase Optimization Strategy (MOST) [41] to determine the recruitment strategy components most effective for engaging health researchers in an online survey among members of three research consortia. The use of a factorial experiment within the MOST framework allows investigators to assess the effect of each component of a program simultaneously, with a much smaller sample size than would be required by separate randomized controlled trials for each strategy [41]. The experiment tested the effect of four distinct recruitment strategy components, as well as their interactions, on the opening and completion of the survey within the study population of health researchers. All study procedures were approved by the NYU Langone Health institutional review board (Study# 22-01099), which provided a waiver of informed consent for this study.

Study setting and participants

Participants were members of three national health research consortia based in the United States. Research consortia are structured networks that bring together researchers from multiple institutions to participate in cooperative research efforts on particular focus areas. Health researchers were broadly defined as anyone who designated themselves as a researcher, and were not limited by academic degree (e.g. PhD, MD, etc.) or by whether or not they were practicing clinicians. Participating consortia were selected based on convenience and consortium topical focus. Two consortia were based within the Veterans Health Administration, one focused on telehealth expansion (Consortium A) and the other focused on telehealth for cancer (Consortium B). The third (Consortium C) was funded by the National Institutes of Health and focused on older adults with multiple chronic conditions. To improve survey relevance at the participant level, the Consortium C scholars’ program members were selected as a subset of the total Consortium C membership for participation. Potential survey participants were identified from the consortium email distribution lists provided by the leadership of each consortium. To be included in the study an individual needed to: (1) have a valid email address on a consortium distribution list, (2) be a current consortium member, and (3) perform research activities. Eighteen individuals who did not meet these criteria (e.g. an administrative assistant included on the mailing list only to manage a calendar) were excluded from analyses, rendering an analytical sample of n = 400. Excepting consortium affiliation, no demographic information was available regarding consortium participants prior to the survey. The online survey was administered using Qualtrics’ XM platform [42] and took approximately 20 min on average to complete.

Study design

In our 2⁴ factorial experiment, a stratified simple random sample was used to assign potential survey participants, based on the email distribution lists, to one of 16 recruitment scenarios that included a combination of standard messaging (as required by the institutional review board) and the four recruitment strategy components. Randomization was stratified by consortium; the randomization strategy was designed to ensure an equal number of participants assigned to each recruitment scenario within each participating consortium.
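Stratified assignment of this kind can be sketched as follows. This is an illustrative Python sketch, not the study's actual randomization code; the roster sizes mirror the consortium counts reported in the Results, and all names are hypothetical:

```python
import random
from collections import defaultdict

def assign_scenarios(members, n_scenarios=16, seed=0):
    """Stratified simple random assignment: within each consortium
    (stratum), shuffle members and deal them out across the scenarios
    so per-scenario counts within a stratum differ by at most one."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for name, consortium in members:
        strata[consortium].append(name)
    assignments = {}
    for consortium in sorted(strata):
        names = strata[consortium]
        rng.shuffle(names)
        for i, name in enumerate(names):
            assignments[name] = i % n_scenarios
    return assignments

# Hypothetical roster mirroring the three consortia (269, 93, 56 members).
roster = ([(f"A{i}", "A") for i in range(269)]
          + [(f"B{i}", "B") for i in range(93)]
          + [(f"C{i}", "C") for i in range(56)])
assignment = assign_scenarios(roster)
```

Shuffling before dealing preserves simple random sampling within each stratum while keeping scenario sizes near-equal, which is the balance property the stratified design is after.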

Potential participants were sent a survey recruitment email followed by up to three reminder emails spaced approximately two weeks apart, or until the survey was completed, for a maximum of four email communications. All recruitment emails included components required by the ethics approval board, including a study description, key participant information (e.g. the voluntary nature of the study), and a personalized survey link, as well as the appropriate text for each recruitment scenario. The initial and follow-up email text was identical apart from an introductory sentence indicating that the email was a follow-up to a previous email. It was not possible to determine which email wave led to clicking on the survey link.

Recruitment strategy components

Potential survey participants received a randomly selected recruitment scenario that included a combination of standard messaging and up to four recruitment strategy components addressing messaging content and incentives. Each recruitment strategy component had two levels, present or absent, producing a total of 16 combinations of recruitment components, including one scenario of standard messaging alone. All potential participants received the standard messaging as part of the recruitment email. The four recruitment strategy components were intended to address both intrinsic and extrinsic sources of motivation: (1) a monetary gift ($50) incentive (extrinsic motivation), (2) a monetary raffle ($1,000) incentive (extrinsic motivation), (3) personal appeal messaging (altruistic messaging), and (4) potential gain messaging (egoistic messaging). One-sixteenth (N ~ 26) of potential participants were randomly assigned to each of the 16 possible messaging combinations: one scenario with the standard messaging alone (i.e., the core component sent to all participants) and 15 scenarios with a combination of at least one, and as many as four, recruitment components. Each recruitment component was present in eight of the 16 recruitment scenarios. Details of the component combinations included in the 16 recruitment strategy scenarios can be found in Supplement Table S1.
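The 2⁴ on/off structure above can be enumerated directly. A minimal Python sketch (component labels are paraphrased from the text; the enumeration itself is standard):

```python
from itertools import product

# The four present/absent recruitment components described above.
COMPONENTS = ("$50 gift", "$1,000 raffle", "altruistic appeal", "egoistic messaging")

# Every present (1) / absent (0) combination: 2**4 = 16 recruitment scenarios.
scenarios = [dict(zip(COMPONENTS, levels)) for levels in product((0, 1), repeat=4)]

# The all-absent combination is the standard-messaging-only control scenario.
standard_only = scenarios[0]
```

This enumeration also makes the balance property explicit: each component is present in exactly half (eight) of the 16 scenarios, which is what lets the factorial design estimate each component's main effect from the full sample.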

Standard messaging included an email with basic required study information including the study purpose and ethics disclosures, as well as one-time consortium leadership promotional messaging (i.e. an official notice from consortium leadership via separate email or newsletter that the survey will be occurring). The stated study purpose was for the survey more generally, and did not disclose the factorial study recruitment randomization. With the purpose of eliciting extrinsic motivation, two recruitment components provided monetary incentive for survey participation. The monetary gift recruitment component included the promise of a $50 gift certificate upon survey completion. The monetary raffle recruitment component included the opportunity to be entered in a drawing for a $1,000 check upon survey completion. The altruistic appeal messaging was designed to highlight the personal importance of survey participation to the survey investigator. The egoistic messaging was designed to highlight the potential (non-monetary) benefits to the participant for completing the survey. The recruitment email wording was held constant throughout each follow-up email, excluding the greeting, which was varied to inform potential participants of the follow-up nature of the email. See Supplement Table S2 for recruitment email wording associated with each recruitment strategy component.

Outcome measures

The two outcome measures were survey opened and survey completed. An opened survey was defined as the personalized survey link having been used to open the survey; this was captured via Qualtrics distribution analytics, which record survey progress when personalized survey links are used. A completed survey was defined as a survey with at least 75% of questions answered. Survey completion was only calculated among participants with an opened survey.
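The two outcome definitions are simple thresholds. A hypothetical sketch of how they could be operationalized (the study derived these from Qualtrics analytics, not from code like this; the function names are illustrative):

```python
def survey_opened(progress_pct):
    """Opened = the personalized link was used at all (any nonzero progress)."""
    return progress_pct > 0

def survey_completed(answered, total_questions, threshold=0.75):
    """Completed = at least 75% of questions answered, per the study definition."""
    return answered / total_questions >= threshold
```

Note that completion is evaluated only among openers, so the two outcomes describe distinct stages: initial engagement and follow-through.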

Statistical analysis

Descriptive statistics (mean, median, frequencies, percentages) were calculated to characterize the study sample overall and by recruitment strategy component assignment. We used the Chi-square test to compare the proportion of individuals by consortium membership in each group. To compare the effect of each recruitment component on outcome measures, we performed multivariable logistic regression analyses controlling for consortium membership. The reference group for these models was all scenarios that did not include the component of interest. The control scenario (absence of all components) was tested as a separate model. Potential interactions between components were tested for all components and secondary regression analyses were reported stratified by the presence or absence of a component. Results are reported as adjusted odds ratios (aOR) with 95% confidence intervals (95% CI). All analyses were performed in R [43]. R package emmeans was used for assessing conditional effects when interactions were found [44, 45].
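To illustrate the analytic approach, logistic regression of a binary outcome on a component indicator while adjusting for consortium, here is a self-contained Python sketch on simulated data. The study's actual analyses used R (glm and emmeans); the simulated effect sizes, data, and function names below are assumptions for illustration only:

```python
import math
import random

def fit_logistic(X, y, lr=0.5, epochs=3000):
    """Minimal logistic regression via batch gradient descent.
    Returns coefficients [intercept, b1, b2, ...] on the log-odds scale."""
    n, p = len(X), len(X[0])
    w = [0.0] * (p + 1)
    for _ in range(epochs):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = 1.0 / (1.0 + math.exp(-z)) - yi  # predicted prob - label
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

# Simulated data: does a recruitment component raise the odds of opening
# the survey, adjusting for a consortium dummy (a potential confounder)?
rng = random.Random(1)
X, y = [], []
for _ in range(400):
    comp = 1.0 if rng.random() < 0.5 else 0.0  # component present?
    cons = 1.0 if rng.random() < 0.5 else 0.0  # consortium indicator
    p_open = 0.5 + 0.2 * comp - 0.1 * cons     # made-up true effects
    X.append([comp, cons])
    y.append(1 if rng.random() < p_open else 0)

w = fit_logistic(X, y)
aOR_component = math.exp(w[1])  # adjusted odds ratio for the component
```

Exponentiating a fitted coefficient yields the adjusted odds ratio reported throughout the Results; here the simulated component has a genuine positive effect, so its aOR lands above 1.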

Results

Surveys were collected from June to December 2023. A total of 418 (consortium A = 269; consortium B = 93; consortium C = 56) potential participants were included on distribution lists provided by consortium leadership. Eighteen entries were considered invalid and removed from analyses due to a bounced email (10) or personal communication indicating non-research participation in the consortium (8). Of the remaining 400 participants, 82% (341) opened the survey link and 35% (147) completed the survey (Table 1). Those who completed the survey received an average of 1.5 ± 0.8 recruitment emails and had a median age of 43 (range 24–70) years; 66.2% were female, 58.8% non-Hispanic White, 23.7% Asian, 6.8% Hispanic, 4.7% non-Hispanic Black, and 6% other race. The median number of years of research experience was 12 (range 1–45); 60.9% had a PhD, 33.8% an MD, and 3.4% an MD/PhD.

Overall, members of Consortium A were significantly less likely to open the survey link (OR 0.23, 95% CI: 0.08–0.54, p = 0.002) than members of the other consortia, whereas members of Consortium B were significantly more likely to complete the survey (OR 2.16, 95% CI: 1.10–4.30, p = 0.027). There were no statistically significant differences in the distribution of consortium membership or email validity between recruitment strategy assignments.

Table 1 Participant characteristics and outcomes by strategy assignment

The effect of each recruitment strategy on opening and completing the survey, and the effects when considering interactions with other recruitment strategies, can be found in Table 2. Considering all strategies and their interactions, an altruistic appeal increased the odds of opening the survey (aOR 2.02, 95% CI: 1.35–2.69, p = 0.033), while egoistic messaging decreased the odds of opening the survey, with a 95% CI that did not embrace the null (aOR 0.56, 95% CI: 0.38–0.75, p = 0.08). Significant interactions were identified. The presence of the $50 gift decreased the effect of the altruistic messaging on the odds of opening the survey (gift present: aOR 1.90, 95% CI: 1.01–2.79, p = 0.170 vs. gift absent: aOR 2.15, 95% CI: 1.15–3.15, p = 0.098). The $1,000 raffle likewise decreased the effect of altruistic messaging on the odds of opening the survey (raffle present: aOR 1.48, 95% CI: 0.81–2.16, p = 0.391 vs. raffle absent: aOR 2.77, 95% CI: 1.45–4.09, p = 0.032), and further strengthened the negative effect of egoistic messaging on the odds of opening the survey (raffle present: aOR 0.34, 95% CI: 0.19–0.50, p = 0.019 vs. raffle absent: aOR 0.92, 95% CI: 0.48–1.36, p = 0.862). In contrast, the presence of egoistic messaging increased the effect of altruistic appeal messaging on the odds of opening the survey (egoistic present: aOR 2.74, 95% CI: 1.52–3.96, p = 0.023 vs. egoistic absent: aOR 1.49, 95% CI: 0.76–2.22, p = 0.409).

Table 2 Effect of recruitment strategy on opening and completing survey

When considering completing the survey once opened, receiving only the control messaging decreased the odds (aOR 0.27, 95% CI: 0.12–0.43, p = 0.022) and the receipt of egoistic messaging increased the odds (aOR 1.81, 95% CI: 1.39–2.23, p = 0.010) of completing the survey. Significant interaction effects were observed. The presence of the $50 gift increased the effect of egoistic messaging on the odds of completing the survey (gift present: aOR 2.06, 95% CI: 1.40–2.72, p = 0.023 vs. gift absent: aOR 1.58, 95% CI: 1.05–2.11, p = 0.167). There was a significant interaction between the altruistic appeal and egoistic messaging strategies for the survey completion outcome: the presence of both strategies diminished the effect of the altruistic appeal, with a confidence interval that did not embrace the null (egoistic present: aOR 0.53, 95% CI: 0.35–0.68, p = 0.053 vs. egoistic absent: aOR 1.73, 95% CI: 1.14–2.21, p = 0.089), and the effect of the egoistic messaging (altruistic present: aOR 1.00, 95% CI: 0.68–1.32, p = 0.995 vs. altruistic absent: aOR 3.29, 95% CI: 2.19–4.44, p < 0.001).

There were no significant three- or four-way interactions. The effect of recruitment strategy on opening or completing the survey did not significantly vary by consortium membership.

Discussion

Using the MOST framework, this is one of the first studies to evaluate the effectiveness of recruitment strategy components for engaging health researchers in an online survey. At approximately 37%, this study had an approximately average response rate for an online survey [8], and an above-average rate compared with multiple studies recently performed among researchers [9,10,11,12,13,14,15,16,17,18,19, 46]. As hypothesized, we found that recruitment strategies including intrinsic motivational messaging were more likely to impact health researchers’ likelihood of opening and completing an online survey than extrinsic motivators. The inclusion of small-to-moderate-sized monetary incentives had minimal impact on participation, possibly indicating that health researchers, given their pro-social motivations and high number of competing priorities, are not primarily motivated to participate in research by extrinsic factors. Egoistic and altruistic appeal messaging, however, did not have consistent effects on both the opening and the completion of the online survey, suggesting additional complexity in health researcher participation motivations that requires further investigation. This research informs best messaging practice when engaging researchers as research participants, adding an important yet overlooked component to meta-research evaluation.

The recruitment strategies using messaging intended to generate intrinsic motivation had a greater impact on survey participation than extrinsic motivation strategies, which is not entirely consistent with research in the general population [37]. Nevertheless, incentives have been found to undermine motivation in survey response in studies where intrinsic motivation is already high [47]. The two types of intrinsic motivational messaging tested, altruistic and egoistic, however, had disparate and inconsistent impacts on the likelihood of opening and of completing the survey. This highlights the potentially complicated nature of motivation in survey response and emphasizes the need for more in-depth investigation of intrinsic motivation within the health researcher population. Future qualitative inquiry may help explain how these domains impact engagement in survey research differently.

As observed among other college-educated professionals [39], an altruistic appeal had a positive effect on the likelihood of health researchers opening the survey link. This effect, however, did not remain significant for survey completion among those who had opened the link. The egoistic messaging—which has also previously been observed to have a positive impact on response rates [48]—showed the opposite pattern: it significantly decreased the likelihood of opening the survey, yet, among those who did open it, increased the likelihood of survey completion. The survey included in this study was relatively high burden, taking approximately 20 min to complete, which may account for the high drop-off between opening and completing the survey. This may have created a dynamic in which an initial altruistic motivation was not strong enough to overcome the high time burden, while the egoistic messaging initially filtered out individuals not interested in the survey topic, leaving those with an interest more likely to complete it. This suggests that different intrinsic motivational types may be more appropriate for low- and high-burden studies. More research is needed to understand the circumstances under which each motivational type is more impactful and whether alternative message phrasing would alter the observed associations.

When both were included as components of the recruitment strategy, the altruistic and egoistic messaging created a unique set of interactions. With regard to opening the survey link, the positive effect of the altruistic messaging was strengthened by the presence of the egoistic messaging, and the negative effect of the egoistic messaging was rendered no longer significant when paired with the altruistic messaging. This suggests that including both types of motivation together may increase the likelihood of initial interest in opening the survey. The two messaging types, however, did not interact similarly for survey completion: including both together lessened the impact of egoistic messaging on survey completion and generated a negative impact of altruistic messaging on completion. Interestingly, under the control strategy, in which no direct motivation (intrinsic or extrinsic) was presented, individuals were no less likely to open the survey but were significantly less likely to complete it. Given the antagonistic effects of egoistic and altruistic messaging on survey completion, it may be appropriate to initiate recruitment with an altruistic appeal. Engagement driven by the different motivation types may interact with other unmeasured factors, such as tolerance for survey burden or personal interests. Thus, future research may consider adaptive recruitment designs that provide a different recruitment strategy if a participant is a non-responder or does not complete the survey.

The minimal response to monetary incentives is consistent with prior research among health researchers [46], but contrary to prior research in other populations, which has shown a positive correlation between survey participation and the provision of larger monetary incentives in the general population [37]. Even small monetary incentives have been found effective in, for example, mailed physician surveys [49]. However, slightly less than half of our study population were physicians (i.e. held an MD), which may indicate that health researchers (whether MD or PhD) have different work-related responsibilities and different motivations for participation than their clinician-only counterparts. At $50 and $1,000, the monetary incentives offered in this study could be considered “larger” for many populations; however, it is possible that within this potentially higher-earning population (i.e. clinicians, university faculty, etc.)—who may also be more pressed for time—a higher monetary reward may be required to incentivize participation, or monetary rewards may simply not impact their engagement. While additional research is needed to identify the value at which a monetary incentive becomes motivating, such a threshold also raises concerns about the budgetary feasibility of relying on ever-higher monetary incentives to recruit researchers. Indeed, even strategies that offer the potential for a substantially larger monetary reward at a feasible cost to the study (i.e. raffles) have not been successful among similar populations, with raffles as large as $5,000 found to have minimal effect on survey participation among clinicians and college-educated professionals [39, 50]. Similarly, although up-front unconditional cash incentives have been shown to yield superior response rates compared with conditional incentives paid after physicians respond to a survey [51, 52], they were not tested in this analysis due to resource and logistical constraints. Further research comparing varying levels, and different modalities, of monetary incentives is needed to understand their full impact on engagement in this population.

Furthermore, it is possible that providing monetary incentives could actually undermine motivation to open a survey link among some health researchers when included alongside intrinsic motivation strategies. Concordant with previously observed phenomena in behavioral research [47, 48, 53], the inclusion of a $1,000 raffle in the recruitment strategy eliminated the significant positive impact of the altruism messaging and further increased the negative effect of the egoistic messaging on the odds of opening the survey link. This effect, however, was not similarly observed for the rate of survey completion after the survey link was opened. Evidence on the impact of raffles has been mixed in the literature; however, direct incentives are generally more effective than the sweepstakes approach for boosting survey response [54].

Limitations

This study had several limitations. First, the choice of email distribution led to some messages being flagged by spam filters, depending on organizational settings. While corrective measures were pursued (i.e. resending messages individually rather than from the automated distribution list), the impact of spam filtering is unknown; it likely contributed to lower rates of survey receipt and likely did not occur at random. Similarly, it was not possible to track who saw the consortium-leadership-provided prenotification, and therefore the interaction of that notification with each strategy component is not known. Furthermore, excluding the greeting, the text of the recruitment email was held constant across follow-ups. Varying messages across contacts has been shown to increase response rates [55]; the lack of message variation may therefore have diminished the observed effect of the messaging scenarios on response rates as compared to other studies that vary their messages. Second, this analysis did not assess the impact of recruitment strategy components on survey efficiency. Future research may benefit from examining their impact on time to response. Additionally, the study did not examine multiple levels of financial incentives or alternate variations of the altruistic/egoistic messages; the effect sizes observed may therefore not reflect all possible types of extrinsic and intrinsic motivational recruitment strategies. Third, beyond consortium membership, demographics of non-responders were unknown, limiting the ability to generalize about the types of health researcher to whom these findings apply and making survey nonresponse bias analyses infeasible. Fourth, as national entities, the consortia included in this study were not fully mutually exclusive and may have had some overlap in members, artificially reducing survey response or providing one individual with two different recruitment strategies. While overlap was rare (< 15 known cases), it may have reduced observed effects. Finally, this study only used online surveys, which tend to have lower response rates and potentially more response bias than other survey methodologies [8]; caution should therefore be used when comparing our findings with the existing literature on other modalities of survey research among similar populations, such as mailed surveys.

Conclusion

When using online surveys among researchers, intrinsic motivation is likely to be a greater driver of health researcher participation in survey research than extrinsic motivation. Altruistic and egoistic messaging may differentially impact initial interest and ultimate survey completion and when combined may lead to improved rates of recruitment, but not survey completion. Further research is needed to determine how to best optimize message content and whether the effects observed are modified by survey burden.

Data availability

The datasets used and analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

MOST:

Multiphase Optimization Strategy

aOR:

Adjusted odds ratio

95%CI:

95% confidence interval


Acknowledgements

Not applicable.

Funding

Funding for this research was provided by the National Institute on Aging (1 K01 AG075169-01A1, NIA). The funder had no role in the preparation of this manuscript.

Author information

Authors and Affiliations

Authors

Contributions

ERS conceived of the study, wrote the manuscript, and performed data analyses. CMC contributed to statistical analyses and manuscript revision. AS aided in software usage and data acquisition, and contributed to manuscript revision. OES contributed to manuscript writing and revisions. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Elizabeth R. Stevens.

Ethics declarations

Ethics approval and consent to participate

All study procedures were approved by the NYU Langone Health institutional review board (Study# 22-01099), which provided a waiver of informed consent for this study.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Stevens, E.R., Cleland, C.M., Shunk, A. et al. Evaluating strategies to recruit health researchers to participate in online survey research. BMC Med Res Methodol 24, 153 (2024). https://doi.org/10.1186/s12874-024-02275-6

