
Participants who were difficult to recruit at baseline are less likely to complete a follow-up questionnaire – results from the German National Cohort



With declining response proportions in population-based research, evaluating the effectiveness of measures aimed at improving response becomes increasingly important. We investigated whether an additional flyer with information about the study influences participation in a follow-up questionnaire and the time participants take to return the completed questionnaire.


In a trial embedded within the German National Cohort we compared responses to invitations for a follow-up questionnaire either including a flyer with information about the cohort study or not including it. Outcomes of interest were participation in the follow-up (yes vs. no) and time to response (in days). We analyzed paradata from baseline recruitment to account for differences in recruitment history between participants.


Adding a flyer to invitations influenced neither the likelihood of participation in the follow-up (OR 0.94, 95% CI: 0.80, 1.11) nor the time it took participants to return completed questionnaires (β̂ = 1.71, 95% CI: −1.01, 4.44). Subjects who, at baseline, needed to be reminded before eventually participating in examinations, and subjects who scheduled three or more appointments before eventually completing baseline examinations, were less likely to complete the follow-up questionnaire and, if they did, took more time to return it.


Evaluating the effectiveness of measures aimed at increasing response can help to improve the allocation of usually limited resources. Characteristics of baseline recruitment can influence response to follow-up studies and therefore information about recruitment history (i.e., paradata) might prove useful to tailor follow-up recruitments to those who were difficult to recruit during baseline. To this end, however, it is necessary to routinely and meticulously collect paradata during recruitment.



Population-based research is increasingly challenged by decreasing response proportions [1,2,3,4], threatening the generalizability of studies due to possibly biased estimates [5, 6]. It has been shown that even technical details of the delivery can influence participation [6] and, consequently, it is important to evaluate whether measures taken to increase response indeed accomplish the desired effect and/or whether previous results transfer to other cultural contexts or across social changes over time. In this trial conducted within the German National Cohort (GNC, German: NAKO Gesundheitsstudie [7]), we investigated whether an additional flyer with information about the study influences participation in a follow-up questionnaire and the time participants take to return the completed questionnaire.

An additional challenge for longitudinal studies is that participants who were more difficult to recruit initially (so-called late respondents) are more likely to drop out in later stages of the study [8,9,10,11], that is, additional efforts spent at baseline to increase response and representativeness are possibly not rewarded at follow-ups. To investigate whether characteristics of recruitment during baseline influenced participation in the follow-up questionnaire, our analyses also included explanatory variables derived from paradata, i.e., detailed information about the recruitment process [12] (see methods for details).


The GNC is a cohort study investigating causes of major chronic diseases that is conducted in 18 regional study centers across Germany [7]. Baseline examinations conducted from 2014 to 2019 included a total of 205,217 participants aged 20–69 randomly drawn from regional registries of residents. In the study center of Bremen, where this trial was conducted, 10,486 participants were examined. Examinations included computer-assisted personal face-to-face interviews, a number of standardized physical and medical examinations, the collection of various biomaterials, and self-completion questionnaires (“Level 1” protocol). A random sub-sample of 20% completed an extended protocol including more in-depth physical and medical examinations (“Level 2” protocol). All participants will be re-invited to a re-assessment after 4–5 years. In addition, all participants will be re-contacted every 2–3 years and asked to fill in questionnaires about changes in lifestyle, the occurrence of diseases, and other characteristics. A detailed study protocol can be found elsewhere [7, 13], as well as a detailed description of the recruitment protocol during baseline [14].

All procedures performed in the GNC were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. The study was approved by the Ethics committee of the local chamber of physicians in Bremen (Bremen Medical Association, reference number RE/HR-388). Written informed consent was obtained from all participants included in the GNC.

The current trial ran from April 2018 to February 2019 within the first round of follow-up questionnaires in Bremen’s study center. Participants were invited to fill in a 16-page paper-and-pencil questionnaire inquiring about, for instance, their general health status, height and weight, selected disease symptoms, use of medication, smoking, menopausal status, and the occurrence of diseases (diagnosed by a physician). Invitations were sent by traditional mail and included a pre-stamped return envelope, which participants could use to return the questionnaire. If a person did not respond within three weeks, a reminder letter was sent; after another two weeks, up to five phone attempts followed over a span of four weeks; finally, a second reminder letter was sent. The recruitment was controlled and documented with MODYS, a dedicated software for epidemiological field studies and paradata collection (for a detailed description see [15]). MODYS schedules recruitment tasks according to a predefined recruitment protocol, provides a mail merge system to generate and print study documents (e.g., letters, invitations), and logs and time-stamps all completed actions (e.g., outbound letters and emails, passed waiting periods). In addition, the field staff uses MODYS to log and time-stamp all attempted and successful interactions with potential participants (inbound and outbound phone calls, inbound letters and emails), as well as other recruitment events (e.g., issuing of dropout codes, corrections of contact data, completion of examinations).
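
The staged reminder protocol described above can be sketched as a simple rule keyed to days since the invitation was mailed. The following is an illustrative reading of the protocol only; the function name and the exact day thresholds are our own interpretation, not MODYS code:

```python
# Illustrative sketch of the follow-up reminder schedule described in the text.
# Day thresholds are our reading of the protocol (3 weeks, then 2 weeks,
# then 4 weeks of phone attempts); this is NOT actual MODYS logic.

def next_action(days_since_invitation: int) -> str:
    """Return the recruitment action due for a participant who has not yet responded."""
    if days_since_invitation < 21:            # wait 3 weeks after the invitation
        return "wait"
    if days_since_invitation < 21 + 14:       # first reminder letter, then wait 2 weeks
        return "first reminder letter"
    if days_since_invitation < 21 + 14 + 28:  # up to 5 phone attempts over 4 weeks
        return "phone attempt (max 5)"
    return "second reminder letter"
```

A response at any point would of course stop the schedule; MODYS handles that by time-stamping the scanned return and cancelling pending tasks.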

In addition to the questionnaire, invitations could include a leaflet (flyer) informing about the questionnaire, reporting first empirical results from the baseline examinations (hand grip strength stratified by sex), informing about the olfactory function test conducted at baseline, and explaining the reasons for the recent change of the cohort’s German name from GNC to NAKO (see additional file 1 for the German flyer and an English translation of its contents). The flyer “NAKO update” is regularly published by the public relations office of the GNC study and provided to all study centers with the encouragement to include it in any written communication with participants. The current trial investigated whether this flyer influenced response to the postal questionnaire.

A total of 3275 participants were randomized to receive an invitation either including the flyer (group flyer, N = 1648) or not including it (group no-flyer, N = 1627). Based on a-priori power analyses (two-tailed), the sample size was set to at least N = 2938 to be able to detect a deviation of ±5 percentage points from the assumed base response of 75% with a power of 0.85. Participants were added to the trial in order of their invitation until the predefined sample size was reached.
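
The order of magnitude of this sample size can be reproduced with a standard two-proportion formula. The sketch below uses the normal approximation for a two-sided two-proportion z-test; since the original calculation method is not specified in the text, it will not match the reported N = 2938 exactly:

```python
# Approximate per-group sample size for a two-sided two-proportion z-test.
# Normal-approximation formula; an assumption, since the paper does not
# state which method produced N = 2938.
from math import ceil, sqrt
from statistics import NormalDist

def n_per_group(p1: float, p2: float, alpha: float = 0.05, power: float = 0.85) -> int:
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-tailed critical value
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(num / (p1 - p2) ** 2)

# Detecting a drop from 75% to 70% response (the harder of the ±5 pp cases):
print(n_per_group(0.75, 0.70))  # 1431 per group, i.e., ~2862 in total
```

The slightly larger trial size of 2938 is consistent with this ballpark plus a margin, e.g., for a continuity correction or exclusions.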

Participants were eligible for this trial if they took part in the baseline examination and the examination dated back at least two years at the time of invitation (examinations between April 2014 and February 2017). Invitations were sent out according to the normal mailing schedule of the study center (usually on Tuesdays, Wednesdays, and Thursdays) and the number of invitations sent out per week was pre-determined by the number of examinations completed per week two years earlier (about 50–60 per week). Due to a backlog at the beginning of the trial, the number of invitations per week could be as high as 500 during the first weeks of the trial. Invitation letters were prepackaged by persons not involved in the day-to-day recruitment to keep regular field staff responsible for contact with participants blinded to group assignments.

Participants were excluded from the trial if they had died since participating at baseline (N = 6), if they revoked their consent after receiving the invitation (N = 1), if letters were returned as undeliverable (N = 7), if paradata included follow-up recruitment events before the trial started (e.g., previous follow-up invitations sent to invalid addresses; N = 158), or if additional invitations had mistakenly been sent out that did not match their initial group assignment (N = 45). The final analysis group totaled 3058 participants (flyer: N = 1532; no-flyer: N = 1526; see Fig. 1 for a flow chart).

Fig. 1

Allocation of participants

The outcome of interest was participation in the follow-up questionnaire (yes vs. no) for the main analysis and time to response in days (i.e., the time between mailing of the invitation letter and return of the filled questionnaire) for the secondary analysis. The outcome was assessed by the field staff involved in the day-to-day recruitment by scanning barcodes on returned questionnaires using the MODYS software, resulting in time-stamped database entries. Exposure variables were invitation group (flyer vs. no-flyer), sex (female vs. male), age (categories: 20–29, 30–39, 40–49, 50–59, and ≥ 60 years), and nationality (German vs. non-German, as provided by the registry of residents). Since recruitment intensity during baseline may influence participation in follow-ups [8,9,10,11, 16], additional variables characterizing baseline recruitment were also included: number of reminder letters (0, 1, 2, ≥3), number of appointments made until eventually participating (1, 2, ≥3), availability of a phone number in public phone records before baseline recruitment (yes vs. no), and study protocol completed at baseline (Level 1 vs. Level 2).
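
The secondary outcome is a simple difference between two time stamps. As a minimal illustration (the dates below are invented, not trial data):

```python
# Time to response = days between mailing the invitation and scanning the
# returned questionnaire. Hypothetical time stamps for illustration only.
from datetime import date

invited  = date(2018, 5, 2)   # invitation letter mailed
returned = date(2018, 5, 23)  # barcode of returned questionnaire scanned

time_to_response = (returned - invited).days
print(time_to_response)  # 21 days
```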

Associations were quantified by odds ratios (ORs) and 95% confidence intervals (CIs) estimated with logistic regression models for the main analysis, and by beta coefficients and 95% CIs estimated with linear regression models for the secondary analysis. To control for possible differences during the invitation process, all models were adjusted for the duration between baseline examination and follow-up invitation in years and the day of the week the invitations were sent. All analyses were performed using R version 3.4.3.
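
The trial's ORs come from adjusted logistic regression models; as a minimal illustration of how an odds ratio and its Wald 95% CI are formed, the sketch below computes a crude (unadjusted) OR from a 2×2 table. The cell counts are invented for illustration and are not data from the trial:

```python
# Crude odds ratio with a Wald confidence interval from a 2x2 table.
# This mirrors the unadjusted core of the analysis only; the paper's
# estimates come from adjusted logistic regression models in R.
from math import exp, log, sqrt
from statistics import NormalDist

def odds_ratio_ci(a: int, b: int, c: int, d: int, alpha: float = 0.05):
    """a/b = responders/non-responders, exposed group;
    c/d = responders/non-responders, unexposed group."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)     # standard error of log(OR)
    z = NormalDist().inv_cdf(1 - alpha / 2)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 120/30 responded vs. did not in one group, 100/50 in the other
print(odds_ratio_ci(120, 30, 100, 50))  # OR = 2.0, 95% CI ≈ (1.18, 3.38)
```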


Out of 3058 participants included in this analysis, 2226 completed the follow-up questionnaire, equaling a response proportion of 72.8% (see Table 1 for a descriptive analysis of the main variables used in the study). Female persons were more frequently included in the study sample, as were persons with German nationality, reflecting that these groups are more likely to participate and therefore completed baseline recruitment earlier [17]. Older persons were also more frequently included because the upper two age strata (50–59 and ≥ 60) were oversampled by study design [13].

Table 1 Descriptive analysis of the main variables used in the study

Our analysis (Table 2) did not reveal evidence that adding a flyer to the invitation influenced the likelihood of participation in the follow-up questionnaire (OR 0.94, 95% CI: 0.80, 1.11). Female subjects were more likely to participate (OR 1.48, 95% CI: 1.26, 1.75), as were subjects in the highest age group (≥ 60) compared to subjects aged 40–49 (OR 1.95, 95% CI: 1.50, 2.53). Persons with non-German nationality were less likely to take part (OR 0.33, 95% CI: 0.25, 0.44). Subjects who needed to be reminded before eventually participating in the baseline examination were less likely to complete the follow-up questionnaire (1 reminder: OR 0.68, 95% CI: 0.55, 0.84; 2 reminders: OR 0.59, 95% CI: 0.45, 0.77; ≥3 reminders: OR 0.39, 95% CI: 0.29, 0.53). Subjects who scheduled three or more appointments during baseline were less likely to participate compared to subjects completing examinations at the first scheduled appointment (OR 0.67, 95% CI: 0.49, 0.91). If the phone number of a person could be retrieved in public phone records before baseline recruitment, they were less likely to participate in the follow-up (OR 0.73, 95% CI: 0.57, 0.93).

Table 2 Odds ratios (95% CIs) for participating in the follow-up and β-coefficients (95% CIs) for time to response (in days), as estimated from regression models adjusted for the duration between baseline examination and follow-up invitation in years and the day of the week the invitations were sent

Adding a flyer to the invitation was also not associated with the time to response in days (β̂ = 1.71, 95% CI: −1.01, 4.44; see Table 2). Persons with non-German nationality took more time to send back the questionnaire (β̂ = 6.66, 95% CI: 0.36, 12.96). If a person needed to be reminded before eventually participating in the baseline examination, it also took them longer to return the completed questionnaire (1 reminder: β̂ = 7.80, 95% CI: 4.44, 11.16; 2 reminders: β̂ = 5.94, 95% CI: 1.27, 10.61; ≥3 reminders: β̂ = 18.52, 95% CI: 12.47, 24.58). Persons whose phone numbers had been available before baseline recruitment also responded later (β̂ = 5.21, 95% CI: 1.39, 9.03).


The current trial did not provide evidence that an informational flyer, intended as motivation, influenced the likelihood of participation or the time it took participants to return the completed questionnaire. It is important to note that this result relates only to the effectiveness of this particular flyer and especially does not rule out that, given the multitude of possible variations in content and design, other flyers could be more successful in increasing response. Nevertheless, this study is a good example of how small evaluation trials can be used to assess the effectiveness of new recruitment measures and information materials, thereby providing information to improve the efficient allocation of resources. For instance, this finding will inform our decision should the situation arise that adding the flyer would raise letters into the next postage tier.

It should raise more concern, however, that the intensity of recruitment during baseline was negatively associated with participation in the follow-up. Persons who had to be reminded more often during baseline, or who needed more appointments before eventually attending the examination, completed the follow-up questionnaire less often and did so more slowly. Hence, the long-term benefits of additional efforts to increase response during baseline appear to be limited. And those who responded more slowly at follow-up again caused additional recruitment effort compared to early respondents, because they received additional reminders and/or needed to be called more often according to the recruitment protocol. This finding was corroborated by evidence suggesting that the availability of phone numbers prior to baseline recruitment was associated with a lower likelihood of participation and a slower response. It is known that recruitment by phone results in higher response proportions than sole postal recruitment [17, 18]; that is, some persons who would otherwise not have participated at baseline were convinced to do so during a phone call, and these persons were more reluctant to participate in the follow-up. Note that recruitment by phone is also more time-consuming for the field staff than postal recruitment.

Although our results suggest that additional efforts and resources spent on recruitment during baseline were not rewarded at the follow-up, these findings need careful interpretation. First, it is important to note that the relation between recruitment effort at baseline and participation at follow-up is not causal; rather, both variables depend on a common cause, namely traits of the particular individual to be recruited. Consequently, these findings should not lead to a decision against more intense recruitment during baseline in order to avoid recruiting participants who are more likely to drop out later on. Not only is low participation a problem in itself, this strategy might introduce bias by systematically missing a sub-population that is potentially different from other participants [9, 10, 19,20,21]. On the contrary, these results suggest that information about baseline recruitment (i.e., paradata) might be useful to tailor follow-up recruitment to those who were difficult to recruit during baseline, by, for instance, scheduling more and earlier calls for them, offering them better incentives, or, if scientifically justifiable, providing them with less arduous questionnaires (i.e., short forms) [22].

To this end, however, it is necessary to routinely and meticulously collect paradata during recruitment, which not only depends on suitable software, but also on motivated field staff actually taking advantage of the possibilities offered by such software, as this task requires extra effort and diligence. But with such paradata at hand, it is possible to routinely evaluate recruitment measures [14] or utilize responsive recruitment protocols that can reduce non-response bias or increase response by adapting to special sub-populations or to conditions encountered in the field [22].

Motivating individuals to enroll in cohort studies and to stay enrolled thereafter is one of the main challenges for population-based research [5]. A considerable amount of research has focused on low-level technical details of the delivery of invitations and the use of material incentives [6], but there is also research indicating that some participants are motivated by non-material reasons. In addition to the desire to learn more about their own health status and to receive personal medical advice, people state as their reasons for enrollment their support for scientific progress, the prospect of gaining insights into research practice, and their trust in the institutions that conduct the research [23, 24]. The use of informational flyers is one reasonable way to convey messages on scientific progress, insights into research practice, and an image of trustworthiness, and the flyer evaluated in this trial contained such information. Our results, however, suggest that it did so in an ineffective way and indicate that further research in this area is warranted.

Known limitations of this study include that, for logistical reasons, the trial was conducted in only one of the 18 study centers across Germany. Furthermore, the participants included in this study do not constitute a true random sample because, in order not to disrupt the standardized recruitment protocol for the follow-up questionnaire, they had to be included in order of their invitation. And, as mentioned before, conclusions concerning the effectiveness of informational materials for recruitment are limited to the particular flyer under investigation, which was also designed from the perspective of public relations experts rather than based on scientific theory.


Assessing the effectiveness of new measures and materials utilized for recruitment can provide information to allocate resources efficiently. Paradata collected during baseline recruitment for a cohort study can help identify subgroups that are less likely to participate in follow-up examinations and could therefore be used to tailor follow-up recruitment protocols accordingly.

Availability of data and materials

Data analyzed for the current study are not publicly available due to privacy concerns, but will be made available upon reasonable request.



German National Cohort


1. Galea S, Tracy M. Participation rates in epidemiologic studies. Ann Epidemiol. 2007;17:643–53.

2. Groves RM. Nonresponse rates and nonresponse bias in household surveys. Public Opin Q. 2006;70(5):646–75.

3. Morton LM, Cahill J, Hartge P. Reporting participation in epidemiologic studies: a survey of practice. Am J Epidemiol. 2006;163(3):197–203.

4. Stang A. Nonresponse research – an underdeveloped field in epidemiology. Eur J Epidemiol. 2003;18:929–31.

5. Lacey JV Jr, Savage KE. 50% response rates: half-empty, or half-full? Cancer Causes Control. 2016;27(6):805–8.

6. Edwards PJ, Roberts I, Clarke MJ, Diguiseppi C, Wentz R, Kwan I, et al. Methods to increase response to postal and electronic questionnaires. Cochrane Database Syst Rev. 2009;3:MR000008.

7. German National Cohort (GNC) Consortium. The German National Cohort: aims, study design and organization. Eur J Epidemiol. 2014;29(5):371–82.

8. Lynn P. Methods for longitudinal surveys. In: Lynn P, editor. Methodology of longitudinal surveys. Hoboken: Wiley; 2009. p. 1–20.

9. Cohen SB, Machlin SR, Branscome JM. Patterns of survey attrition and reluctant response in the 1996 Medical Expenditure Panel Survey. Health Serv Outcomes Res Methodol. 2000;1(2):131–48.

10. Haring R, Alte D, Volzke H, Sauer S, Wallaschofski H, John U, et al. Extended recruitment efforts minimize attrition but not necessarily bias. J Clin Epidemiol. 2009;62(3):252–60.

11. Nederhof E, Jörg F, Raven D, Veenstra R, Verhulst FC, Ormel J, et al. Benefits of extensive recruitment effort persist during follow-ups and are consistent across age group and survey method. The TRAILS study. BMC Med Res Methodol. 2012;93:3–15.

12. Groves RM, Heeringa SG. Responsive design for household surveys: tools for actively controlling survey errors and costs. J R Stat Soc Ser A. 2006;169(3):439–57.

13. Schipf S, Schöne G, Schmidt B, Günther K, Stübs G, Greiser KH, et al. The baseline assessment of the German National Cohort (NAKO Gesundheitsstudie): participation in the examination modules, quality assurance, and the use of secondary data. Bundesgesundheitsblatt Gesundheitsforschung Gesundheitsschutz. 2020;63(3):254–66.

14. Langeheine M, Pohlabeln H, Ahrens W, Günther K, Rach S. Study invitations with envelopes made from recycled paper do not increase likelihood of active responses or study participation in the German National Cohort. BMC Res Notes. 2019;12(1):468.

15. Reineke A, Pigeot I, Ahrens W, Rach S. MODYS – a modular control and documentation system for epidemiological studies. In: Bammann K, Lissner L, Pigeot I, Ahrens W, editors. Instruments for health surveys in children and adolescents. Cham: Springer Nature Switzerland; 2018. p. 25–45.

16. Langeheine M, Pohlabeln H, Ahrens W, Rach S. Consequences of an extended recruitment on participation in the follow-up of a child study: results from the German IDEFICS cohort. Paediatr Perinat Epidemiol. 2017;31(1):76–86.

17. Winkler V, Leitzmann M, Obi N, Ahrens W, Edinger T, Giani G, et al. Response in individuals with and without foreign background and application to the National Cohort in Germany: which factors have an effect? Int J Public Health. 2014;59(3):555–63.

18. Stang A, Moebus S, Dragano N, Beck EM, Mohlenkamp S, Schmermund A, et al. Baseline recruitment and analyses of nonresponse of the Heinz Nixdorf Recall study: identifiability of phone numbers as the major determinant of response. Eur J Epidemiol. 2005;20(6):489–96.

19. Hall J, Brown V, Nicolaas G, Lynn P. Extended field efforts to reduce the risk of non-response bias: have the effects changed over time? Can weighting achieve the same effects? Bull Methodol Sociol. 2013;117(5):5–25.

20. Maclennan B, Kypri K, Langley J, Room R. Non-response bias in a community survey of drinking, alcohol-related experiences and public opinion on alcohol policy. Drug Alcohol Depend. 2012;126(1–2):189–94.

21. Studer J, Baggio S, Mohler-Kuo M, Dermota P, Gaume J, Bertholet N, et al. Examining non-response bias in substance use research. Are late respondents proxies for non-respondents? Drug Alcohol Depend. 2013;132(1–2):316–23.

22. Tourangeau R, Michael Brick J, Lohr S, Li J. Adaptive and responsive survey designs: a review and assessment. J R Stat Soc Ser A. 2017;180(1):203–23.

23. Nobile H, Borry P, Pischon T, Steinbrecher A, Boeing H, Vigl M, et al. Participants' decision to enroll in cohort studies with biobanks: quantitative insights from two German studies. Per Med. 2017;14(6):477–85.

24. Nobile H, Bergmann MM, Moldenhauer J, Borry P. Participants' accounts on their decision to join a cohort study with an attached biobank: a qualitative content analysis study within two German studies. J Empir Res Hum Res Ethics. 2016;11(3):237–49.



We thank all participants who took part in the GNC study and the staff in this research program. The authors gratefully acknowledge the work of the interdisciplinary team of computer scientists, information specialists, and study nurses that develops the MODYS software.


This project was conducted with data from the German National Cohort (GNC). The GNC is funded by the Federal Ministry of Education and Research [BMBF, project funding reference numbers: 01ER1301A/B/C and 01ER1511D]; the federal states; and the Helmholtz Association, with additional financial support from the participating universities and the institutes of the Leibniz Association. The funding bodies had no role in the design of the study, in the collection, analysis, and interpretation of data, or in writing the manuscript. The publication of this article was partly funded by the Open Access Fund of the Leibniz Association.

Author information




BH, KG, and SR designed the study; BH and SR conducted the study and analyzed the data; SR wrote the manuscript; BH, KG, and SR edited and approved the manuscript and were responsible for quality assurance and control. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Stefan Rach.

Ethics declarations

Ethics approval and consent to participate

All procedures performed in the GNC were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. The study was approved by the Ethics committee of the local chamber of physicians in Bremen (Bremen Medical Association, reference number RE/HR-388). Written informed consent was obtained from all participants included in the GNC.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Rach, S., Günther, K. & Hadeler, B. Participants who were difficult to recruit at baseline are less likely to complete a follow-up questionnaire – results from the German National Cohort. BMC Med Res Methodol 20, 187 (2020).



  • Epidemiologic methods
  • Surveys and questionnaires
  • Lost to follow-up
  • Follow-up studies
  • German National Cohort