The effect of personalised versus non-personalised study invitations on recruitment within the ENGAGE feasibility trial: an embedded randomised controlled recruitment trial

Abstract

Background

Recruitment into clinical trials is challenging and there is a lack of evidence on effective recruitment strategies. Personalisation of invitation letters is a potentially pragmatic and feasible way of increasing recruitment rates at low cost. However, there is a lack of evidence concerning the effect of personalising study invitation letters on recruitment rates.

Methods

We undertook a Study Within A Trial (SWAT) to investigate the effect of personalised versus non-personalised study invitation letters on recruitment rates into the ENGAGE host trial, a feasibility study of an internet-administered, guided, Low Intensity Cognitive-Behavioural Therapy based self-help intervention for parents of children previously treated for cancer. An intervention group (n = 254) received a personalised study invitation letter and the control group (n = 255) received a non-personalised study invitation letter. The primary outcome was the proportion of participants in the intervention group and the control group enrolled into the ENGAGE host feasibility trial. Secondary outcomes relating to the recruitment and screening process and to retention were examined. Differences in proportions between groups for the primary and secondary outcomes were estimated using logistic regression.

Results

Of the 509 potential participants, 56 (11.0%) were enrolled into the ENGAGE host feasibility trial: personalised: 30/254 (11.8%) and non-personalised: 26/255 (10.2%). No statistically significant effect of personalisation on enrolment was found (OR 1.18, 95% CI 0.68–2.06). No statistically significant differences were found for any secondary outcome.

Conclusions

Personalisation of study invitations had no effect on recruitment. However, given the small study sample size in the present SWAT, and lack of similar embedded recruitment RCTs to enable a meta-analysis, additional SWATs to examine the personalisation of study invitation letters are warranted.

Trial registration

ISRCTN57233429; ISRCTN18404129; SWAT 112, Northern Ireland Hub for Trials Methodology Research SWAT repository (2018 OCT 1 1231) (https://www.qub.ac.uk/sites/TheNorthernIrelandNetworkforTrialsMethodologyResearch/FileStore/Filetoupload,939618,en.pdf).

Background

Randomised controlled trials (RCTs) are generally considered the gold standard for evaluating healthcare interventions, but often face challenges with the recruitment [1, 2] and retention [3] of participants. Extended recruitment periods, failure to reach recruitment targets [2, 4], and poor retention [3] are common, resulting in poor research quality and monetary loss [2, 4]. Low recruitment and retention rates also lead to insufficient statistical power, increasing the risk of either type I (a false positive finding) or type II (a false negative finding) errors [5]. There are also ethical considerations, with participants investing time and energy in trials that might not generate results that can adequately answer the research question [4]. Highlighting these difficulties, recent reviews of trials conducted in the United Kingdom between 2002 and 2016 found only 55–56% reached their original recruitment target [2, 3]. Given these challenges, the need for trial methodology research to improve trial process efficiency is clear [6, 7]. Conducting Studies Within A Trial (SWATs), i.e., studies embedded within a host trial to evaluate trial processes such as recruitment and retention, is one way to establish such an evidence base and, it is hoped, reduce research waste [7, 8]. Emphasising the importance of research to improve trial process efficiency, multiple initiatives for prioritising research to evaluate trial processes have been undertaken, such as the Medical Research Council funded Techniques for Assisting Recruitment to Trials (MRC-START) programme [9]; the Prioritising Recruitment in Randomised Trials (PRioRiTy) study [10]; the Trial Forge Platform [11]; and the Online Resource for Research in Clinical triAls (ORRCA) project [12].

Despite trial recruitment and retention being common challenges, recent Cochrane reviews have identified a lack of evidence concerning strategies to improve recruitment [4] and retention [13]. Previous research has shown that adopting an open trial design (i.e., participants know which intervention they will receive) and using telephone reminders during the recruitment phase increase recruitment rates [4]. However, other strategies have produced varying effects. For example, a recent meta-analysis investigating the effect of user-tested, simplified, and clarified study information sheets on recruitment rates showed no effect [14]. Monetary incentives were found to be effective in one trial [15], whereas access to video clips with study information online [16, 17] was not. However, currently there are too few studies examining each strategy for conclusions to be drawn, and our understanding of how to recruit effectively to trials is limited [4, 18]. One potential way of optimising recruitment is the personalisation of trial documentation, with a systematic review suggesting personalisation can improve questionnaire return rates [19]. However, this review included a wide variety of personalisation strategies, for example, signing letters personally and hand-addressing envelopes, making it difficult to know which personalisation strategy or strategies may have a positive effect on recruitment [20]. Further, the current literature has predominantly focused on returning questionnaires or surveys [21,22,23], and very few studies utilising an RCT design have been conducted within the context of healthcare intervention research to examine the effect of personalising study invitations on recruitment rates [24]. Indeed, the latest Cochrane systematic reviews of strategies to improve recruitment [4] and retention [13] in RCTs did not include personalisation of study invitations. Further, to the best of our knowledge, no RCT has examined the personalisation of study invitations in the context of mental health research, which is of particular importance given that recruitment and retention to mental health trials have been identified as particularly challenging [18, 25].

The ENGAGE host feasibility trial

Childhood cancer is a leading cause of death and disease burden among children, and their parents, often their primary source of support, are actively involved in the child’s care even years after treatment. Sub-groups of parents report mental health difficulties [26, 27], productivity losses [28], daily life restrictions, and an unmet need for psychological support [29] after the end of treatment. However, there remains a lack of evidence-based interventions tailored to parents, with their needs commonly unmet. Additionally, parents report barriers to seeking support such as lack of time, guilt, and putting the child’s health first [30, 31]. In accordance with the Medical Research Council complex interventions framework [32, 33], we have conducted a series of studies informing the development of an internet-administered, guided, self-help programme, EJDeR (a Swedish acronym). EJDeR was co-created with parent research partners and is based on low intensity cognitive behavioural therapy (LICBT) for parents of children treated for cancer [34]. Studies have included reviewing existing evidence [26], exploring negative and positive experiences [35], conceptualising distress [36], participatory action research [37], a cross-sectional survey [38], and professional and public involvement [34]. Depending on the parents’ main presenting difficulties, LICBT behavioural activation or worry management treatment protocols are used for the treatment of depression and generalised anxiety disorder. EJDeR is delivered via the U-CARE-portal (Portal), a web-based platform designed to deliver internet-administered LICBT interventions and support research. EJDeR is guided by e-therapists, with parents receiving an initial assessment via videoconferencing or telephone, weekly written messages via the Portal, and a mid-intervention booster session via videoconferencing or telephone. Participants are located across Sweden.
EJDeR is designed to be accessed from computers and mobile devices, and participants can choose where to use it. All research activities in the ENGAGE host feasibility trial were carried out via the Portal, e-mail, or telephone by staff located at the Department of Women’s and Children’s Health, Uppsala University, Sweden. We have tested EJDeR and intended study procedures for a planned future RCT of the clinical efficacy and cost-effectiveness of EJDeR in the ENGAGE host feasibility trial [39] (ISRCTN 57233429).

Aims and objectives

This study used a SWAT, embedded within the ENGAGE host feasibility trial, with the primary objective of investigating the effect of personalised versus non-personalised study invitation letters on recruitment rates. A number of secondary objectives explored the effect of personalised versus non-personalised study invitation letters on secondary outcomes related to recruitment and retention.

Methods

This SWAT is reported in accordance with guidelines for reporting embedded recruitment trials [40]. The SWAT is registered in the ISRCTN registry (ISRCTN18404129) and the Northern Ireland Hub for Trials Methodology Research SWAT repository (SWAT 112). A full protocol for the SWAT has been published [24].

Design

A parallel group embedded RCT with a 1:1 allocation ratio to investigate the effect of personalised compared with non-personalised study invitation letters on recruitment rates [24].

Participants

Participants eligible for inclusion in the ENGAGE host feasibility trial were parents of children diagnosed with cancer during childhood (0–18 years) who completed cancer treatment 3 months to 5 years previously and had a self-reported need for psychological support. The full eligibility criteria are outlined in the ENGAGE host feasibility trial study protocol [39]. All potential participants who were invited via the Childhood Cancer Registry into the ENGAGE host feasibility trial were eligible for the SWAT.

Recruitment

The ENGAGE host feasibility trial adopted two recruitment strategies: postal study invitation packs via the Swedish Childhood Cancer Registry (National Quality Registry) and advertisements on social media and patient organisation websites. Only participants recruited via the Childhood Cancer Registry were included in the SWAT. Children’s personal identification numbers were gathered from the Childhood Cancer Registry, and subsequently matched with parents’ names and addresses via the Swedish Tax Agency’s registry NAVET. Study invitation packs were sent in blocks of 100, every 30 days, until the target sample size of 50 participants was reached. Invitation packs included a study invitation letter, a study information sheet, a link to the study website (the Portal), and a reply slip with a stamped addressed envelope. Contact details for the research team were provided, and parents were able to opt out of further contact with the research team via post, telephone, e-mail, or the Portal. Opt-out forms included a ‘reasons for non-participation’ questionnaire. The use of an opt-out recruitment strategy was approved by the Regional Ethical Review Board in Uppsala, Sweden (Dnr: 2017/527). As parents were individually invited into the study, there was a possibility for two parents of the same child to be invited and enrolled into the trial.

Interventions

Potential participants to the ENGAGE host feasibility trial were randomised to be invited via: a personalised study invitation letter, including name and address of the parent (intervention group) or a non-personalised study invitation letter not including name and address of the parent (control group). Invitations did not differ in any other aspect and translated versions can be found as a supplement to this paper, see Additional files 1 and 2.

Study procedures

Potential participants could access study information, in text and video format, and provide consent via the Portal. Potential participants who wished to opt out of the study could do so by completing an opt-out form and reasons for non-participation questionnaire via the Portal or on paper via post. Participants could also opt out by telephoning or e-mailing the research team. Potential participants who wanted additional information from the research team before providing consent via the Portal could register interest via a postal reply slip included in the invitation pack, or by telephoning or e-mailing the research team. Those registering interest were provided with study information by the research team and asked to provide consent via the Portal if interested in study participation. The eligibility interview (including the Mini-International Neuropsychiatric Interview (M.I.N.I.) [41]), the semi-structured interview at baseline, the semi-structured interview at post-treatment (12 weeks), the M.I.N.I. at post-treatment (12 weeks), and the M.I.N.I. at 6-month follow-up were conducted over the telephone. Online Portal assessments at baseline, post-treatment, and 6-month follow-up were completed via the Portal or, if preferred by the participant, over the telephone with a member of the research team. Reminders were provided if participants did not complete online Portal assessments within 2 weeks of gaining access. Participants who dropped out of the study were asked to provide a reason; however, they were reminded that they did not need to report a reason if they preferred not to.

Outcomes

The primary outcome was the proportion of participants in the intervention group and the control group enrolled into the ENGAGE host feasibility trial. Secondary outcomes were the proportion of potential participants invited into the study in each group that:

  • Registered interest in participating in the ENGAGE host feasibility trial

  • Opted out of the ENGAGE host feasibility trial

  • Completed reasons for non-participation questionnaire in the ENGAGE host feasibility trial

  • Consented to participate in the ENGAGE host feasibility trial

  • Completed the eligibility interview for inclusion in the ENGAGE host feasibility trial

  • Completed the semi-structured interview at baseline in the ENGAGE host feasibility trial

  • Completed the online Portal assessment at baseline in the ENGAGE host feasibility trial

  • Were retained at post-treatment (12 weeks) and 6-month follow-up in the ENGAGE host feasibility trial respectively, i.e., a) completed the M.I.N.I. at post-treatment (12 weeks), b) completed the semi-structured interview at post-treatment (12 weeks), c) completed the online Portal assessment at post-treatment (12 weeks), d) completed the M.I.N.I. at 6-month follow-up, e) completed the online Portal assessment at 6-month follow-up

  • Required a telephone reminder at baseline, post-treatment, and 6-month follow-up respectively in the ENGAGE host feasibility trial to complete the online Portal assessment

Protocol changes

The secondary outcome “consented to participate in the ENGAGE host feasibility trial” was added after the publication of the study protocol [24], but prior to statistical analysis of the data presented herein. The outcome “completed the semi-structured interview at baseline in the ENGAGE host feasibility trial” was added when the statistical analysis had commenced. In the SWAT protocol [24], retention outcomes were collapsed to include completion of all post-treatment (12 weeks) and 6-month follow-up assessments respectively. Due to different retention rates for different assessments, outcomes are reported separately. Online Portal assessments consist of clinical outcomes included in the ENGAGE host feasibility trial [39].

Data collection

Study data collected outside of the Portal were entered onto paper-based case report forms and subsequently entered independently by two research assistants into a Microsoft® Access database, with data exported into Microsoft® Excel spreadsheets. Portal data were extracted by an in-house system developer and exported to Microsoft® Excel spreadsheets, with data prepared independently by two research assistants. Microsoft® Spreadsheet Compare was used to compare all data entries to identify discrepancies and missing values, with any discrepancies discussed and resolved in data management meetings.
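The double data entry check described above can be illustrated with a short sketch that compares two independently entered records field by field and flags disagreements. This is a hypothetical Python example with invented field names; the study itself used Microsoft® Access, Excel, and Spreadsheet Compare rather than code like this.

```python
# Illustrative sketch of double data entry verification: compare two
# independently entered records and flag every field where they disagree.
# Field names below are invented for illustration.

def find_discrepancies(entry_a, entry_b):
    """Compare two double-entered records (dicts of field -> value).

    Returns a list of (field, value_a, value_b) tuples for every field
    where the two entries disagree or one entry is missing a value.
    """
    discrepancies = []
    for field in sorted(set(entry_a) | set(entry_b)):
        a, b = entry_a.get(field), entry_b.get(field)
        if a != b:
            discrepancies.append((field, a, b))
    return discrepancies

# Example: one transposed value and one missing field between the entries.
first_entry = {"recruitment_id": "R001", "consented": "yes", "age": "42"}
second_entry = {"recruitment_id": "R001", "consented": "yes", "age": "24",
                "opt_out": "no"}
for field, a, b in find_discrepancies(first_entry, second_entry):
    print(f"{field}: entry A = {a!r}, entry B = {b!r}")
```

Flagged fields would then be resolved against the paper case report forms, mirroring the data management meetings described above.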

Sample size

The SWAT sample size was dependent on the ENGAGE host feasibility trial and therefore no sample size calculation was made. It was anticipated that 600 invitations would be needed to reach the target sample size of 50 in the ENGAGE host feasibility trial [39], which would have provided 90% power to identify a 7.5% difference between groups in recruitment rate at a two-sided alpha of 0.05 [24]. We randomised 509 potential participants into the SWAT; however, no post hoc power analysis was conducted given the limitations of post hoc analyses, especially when reporting negative trial results [42].

Randomisation

Eligible participants were randomised in a 1:1 ratio to the intervention group (personalised study invitation letter) or control group (non-personalised study invitation letter) using simple randomisation without stratification. To ensure allocation concealment, a member of the Portal development team, not involved in participant recruitment, produced a computer-generated randomised sequence outside of the Portal. The randomisation software was developed in C# and written specifically for randomisation into the SWAT; it was designed to read a de-identified text file list of potential participants and output the participants in two randomised groups into a CSV file. The participant allocation list was returned to the research team to implement. Participants were allocated a Recruitment ID number within the study invitation pack dependent on SWAT intervention allocation. Participants entered this Recruitment ID number when providing online consent, or opting out of the study, on the Portal. In addition, an allocation list with the participants’ personal identification numbers was stored on a secure USB drive in a locked filing cabinet. Only research staff members responsible for preparing and sending the invitation packs had access to the allocation list. To ensure adherence to the randomisation sequence, a random sample of 10% of every 50 invitation letters to the ENGAGE host feasibility trial was checked for accuracy.
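The randomisation logic described above can be sketched as follows. The study’s actual software was written in C#; this simplified Python illustration of simple 1:1 randomisation, producing an allocation list for a CSV file, assumes the function names, seeding, and CSV layout rather than taking them from the actual program.

```python
# Simplified Python illustration of simple 1:1 randomisation as described
# above. The study's software was C#; names and CSV layout here are assumed.
import csv
import random

def randomise(participant_ids, seed=None):
    """Allocate each de-identified ID to a group by simple randomisation:
    a fair, independent draw per participant, with no stratification or
    blocking, so group sizes may differ slightly (as 254 vs 255 did)."""
    rng = random.Random(seed)
    return [(pid, rng.choice(["personalised", "non-personalised"]))
            for pid in participant_ids]

def write_allocation(allocations, path):
    """Write the allocation list to a CSV file for the research team."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["recruitment_id", "allocation"])
        writer.writerows(allocations)

# 509 potential participants, as in the SWAT; fixed seed for reproducibility.
allocations = randomise([f"R{i:03d}" for i in range(1, 510)], seed=1)
```

Note that simple randomisation, unlike block randomisation, does not guarantee equal group sizes, which is consistent with the 254/255 split reported here.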

Eligible participants were not informed about the SWAT, and were therefore blind to the SWAT hypothesis and unaware they were participating in an embedded recruitment trial. It was not possible for research team members involved in sending study invitations, or working with recruitment, to be blind to group allocation. However, the researcher conducting the statistical analysis (third author MÖ) was blind to group allocation. Additionally, each outcome variable was allocated a letter code (aaa–sss) and the order of presentation of variables was randomised in the dataset provided for statistical analysis, to further ensure blinding.

E-therapists who guided the EJDeR intervention were blind to group allocation in the SWAT, with the exception of one e-therapist (to five participants) who was also a member of the research team and thus had access to the information about SWAT group allocation.

Statistical analysis

Statistical analyses were conducted on an intention-to-treat basis. A two-sided p value of < 0.05 was chosen to indicate statistical significance. A decision was made to use Stata (Stata/MP 16.1, StataCorp), rather than SPSS as stated in the study protocol [24], as preferred by the researcher conducting all analyses (MÖ). Categorical outcomes were reported with numbers and percentages. Differences in proportions between groups for the primary and secondary outcomes were estimated using logistic regression, with results reported as odds ratios with 95% confidence intervals and p-values. If two parents of the same child were enrolled in the ENGAGE host feasibility trial, this would cause some dependency in the data between the two parents. There were two cases whereby two parents of the same child, invited via the Childhood Cancer Registry, were enrolled into the ENGAGE host feasibility trial. In one case both parents were randomised into the intervention group (personalised study invitation letter). In the other case one parent was randomised to the intervention group and one to the control group (non-personalised study invitation letter).
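For a single binary predictor, the logistic regression estimate for the primary outcome coincides with the unadjusted odds ratio from the 2×2 enrolment table, so the reported result can be reproduced directly from the counts. The sketch below (in Python, whereas the paper’s analysis was run in Stata) uses a Wald confidence interval on the log odds ratio.

```python
# Reproducing the unadjusted odds ratio and Wald 95% CI for the primary
# outcome from the reported enrolment counts. With one binary predictor,
# logistic regression gives the same estimate as this 2x2 calculation.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald CI from a 2x2 table:
    a/b = events/non-events in intervention, c/d = events/non-events in control."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Enrolled vs not enrolled: personalised 30/224, non-personalised 26/229.
or_, lo, hi = odds_ratio_ci(30, 224, 26, 229)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # OR = 1.18, 95% CI 0.68-2.06
```

This matches the OR of 1.18 (95% CI 0.68–2.06) reported in the Results.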

In the original data analysis plan, logistic regression models would have included stratification by parent and child gender (male/female). However, due to ethical considerations, we were unable to use information on gender unless these data were reported to the research team (e.g., during eligibility interviews or when opting out of the study). Subsequently, there was too little data on gender to stratify the analyses by gender.

Public involvement

Procedures for the ENGAGE host feasibility trial were developed in collaboration with a parent research partner group consisting of two fathers and two mothers, aged 45–54, with lived experience of being a parent to a child treated for cancer. For the SWAT, parent research partners provided feedback on general wording of the invitation letters, and how to personalise the letter provided to the intervention group. Parent research partners were asked about preferences regarding personalising e.g., to include the child’s name along with the parent’s, or to only use the parent’s name. The group preferred to only include the parent’s name and advised that including the child’s name may potentially be considered an invasion of privacy [24].

Results

Recruitment

Recruitment via study invitation letters to the ENGAGE host feasibility trial took place between July 3rd and November 30th 2020. The recruitment target was met after 509 study invitations were sent. Post-treatment (12 weeks) data collection took place between September 22nd 2020 and April 8th 2021, and 6-month follow-up data collection between April 18th and October 4th 2021. See Fig. 1 for participant flow.

Fig. 1

Study flow of study within a trial (SWAT) participants in the ENGAGE host feasibility trial. Note. Solid black lines denote participant flow through the study, including study drop outs i.e., those who discontinued the study. Dashed grey lines represent participants that were lost to follow-up during assessments at post-treatment (12 week) and 6-month follow-up respectively, but had not dropped out of the study

Outcomes

Of the 509 potential participants invited, 56 (11.0%) were enrolled into the ENGAGE host feasibility trial: personalised: 30/254 (11.8%) and non-personalised: 26/255 (10.2%). No statistically significant effect of personalisation on enrolment was found (OR 1.18, 95% CI 0.68–2.06). No significant effects of personalisation were found for any of the secondary outcomes, see Table 1.

Table 1 Descriptive summaries and odds ratios for primary and secondary outcomes

Discussion

Summary

The primary objective was to investigate the effect of personalised versus non-personalised study invitation letters on recruitment rates, i.e., rates of enrolment into a host trial examining the feasibility of the internet-administered, guided, self-help programme, EJDeR, for parents of children treated for cancer. Personalisation of study invitations had no effect on enrolment in the host trial or on any of the secondary outcomes. Numbers were larger in the intervention group (personalised study invitation letters: 30/254 [11.8%]) than the control group (non-personalised study invitation letters: 26/255 [10.2%]) for rate of enrolment and the majority of secondary outcomes related to the recruitment and screening process, i.e., consented to participate, and completed the eligibility interview, semi-structured interview, and online Portal assessments at baseline. The numbers opting out of the study were smaller in the intervention group (81/254 [31.9%]) than the control group (83/255 [32.6%]), and for registered interest in participating the numbers were the same in the intervention group (14/254 [5.5%]) and the control group (14/255 [5.5%]). For outcomes related to retention, the opposite trend was observed. At post-treatment (12 weeks), numbers completing were smaller in the intervention group (M.I.N.I. and semi-structured interview: 18/254 [7.1%]; online Portal assessment: 13/254 [5.1%]) than in the control group (M.I.N.I. and semi-structured interview: 19/255 [7.5%]; online Portal assessment: 15/255 [5.9%]). At 6-month follow-up, numbers completing were also smaller in the intervention group (M.I.N.I.: 17/254 [6.7%]; online Portal assessment: 15/254 [5.9%]) than the control group (M.I.N.I.: 19/255 [7.5%]; online Portal assessment: 18/255 [7.1%]). However, given the wide confidence intervals for all primary and secondary outcomes relating to recruitment and retention, findings should be interpreted with caution.

Limitations

First, although the ENGAGE host feasibility trial recruited to target, as a feasibility study, only a small number of potential participants were invited and subsequently recruited and retained. As such, the embedded recruitment trial may be underpowered to detect between-group differences. Future embedded recruitment trials, within large-scale evaluation RCTs, are warranted to further investigate the effect of personalised versus non-personalised study invitation letters on recruitment and retention rates. Given the current lack of similar RCTs, a cumulative meta-analysis is not possible, which further justifies the need to conduct further research [43]. Second, both the personalised and non-personalised study invitation letters contained elements that may be perceived as personalised. For example, names and addresses on envelopes for both groups were written by hand, and all invitation letters were signed by the principal investigator and a parent research partner. This could have lessened the effect of the intervention. Indeed, some evidence suggests handwriting the address on envelopes increases survey response rates [19]. In addition, it was not possible for the research team to know which given name potential participants used, and subsequently middle names were included when personalising study invitation letters. Using both first and middle names could be perceived as less personal, further reducing the impact of the intervention. Third, stratification by gender in the logistic regression model was not possible as we could only include data on gender when reported to the research team (e.g., during eligibility interviews and when opting out of the study). However, our previous longitudinal research with this population has not found any significant differences in relation to gender (parent and child) between parents who completed assessments at various time points and those who did not [44, 45]. Therefore, we consider that not being able to stratify by gender in the logistic regression model is unlikely to have impacted our results. Future studies may wish to seek ethical approval to report certain sociodemographic characteristics, where possible, for all participants approached and randomised into a SWAT, in accordance with the guidelines for reporting embedded recruitment trials [40]. This would facilitate an examination of potential differences between participants and non-participants on certain demographic factors, such as gender, and enable more extensive analysis in the future, as well as providing important information concerning the generalisability of the SWAT results. In addition, we did not plan to report baseline characteristics, presented by SWAT group allocation, of those enrolled into the ENGAGE host feasibility trial, which would have provided further important contextual information. Finally, we did not apply for ethical approval to report how often two parents of the same child were randomised into the SWAT and, on these occasions, whether parents were allocated to different intervention groups. Of those parents enrolled into the ENGAGE host feasibility trial, in only one case were two parents of the same child allocated to different intervention groups. Therefore, the number of times this happened out of all parents randomised into the SWAT is considered likely to be small. It is also considered unlikely that parents would be unblinded to the SWAT hypothesis; however, future similar SWATs should look to implement processes to prevent two potential participants in the same household being allocated to different SWAT intervention groups.

Strengths and interpretation of the findings in the context of the wider literature

Despite the aforementioned limitations, this study has important strengths. Research on recruitment strategies to clinical trials has been identified as much needed to increase the quality of clinical research and thus reduce research waste [7,8,9,10,11,12] and this study adds to the emerging body of evidence on the subject. We investigated the effect of personalised versus non-personalised study invitation letters on multiple outcomes related to both recruitment and retention, which, to the best of our knowledge, has not been done before. The methodology is straightforward and easy to undertake, and this study could be used as a template for future SWATs. In future studies, we recommend the use of electronic case report forms to facilitate data collection, since the use of paper-based case report forms was time consuming, and there is evidence to suggest paper-based case report forms are more prone to data entry errors, such as data omissions [46].

Current literature on the effects of personalisation of study invitation letters on recruitment and retention is limited. The personalisation of study invitation letters has been found to have a positive effect on survey response rates [21,22,23]. However, our results are in line with a recent embedded recruitment trial that found a non-significant positive effect of the personalisation of study invitation letters on the recruitment of general practitioners [47]. Importantly, even small improvements in recruitment rates could be of benefit for clinical trial recruitment, especially considering the personalisation of study invitation letters is a pragmatic, feasible, and low-cost strategy. Another interesting finding in the current study was that, although not statistically significant, the data indicate that fewer participants in the personalised study invitation group were retained at follow-up, e.g., completed assessments at post-treatment (12 weeks) and at 6-month follow-up. Two recent studies have shown that personalisation of text message reminders is not associated with increased return rates of follow-up questionnaires within clinical trials [48, 49], whereas a further recent study found a favourable effect of personalised reminders via text messages [50]. However, to date very few studies have investigated the effect of personalised study invitations on secondary outcomes pertaining to retention.

Conclusions

Personalisation of study invitations had no statistically significant effect on recruitment, although a small positive effect was observed, with an enrolment rate of 11.8% (30/254) in the personalised group and 10.2% (26/255) in the non-personalised group. Given the small sample size and lack of similar embedded recruitment RCTs, the effect of the personalisation of study invitations on recruitment and retention remains uncertain, and there is a need to conduct similar SWATs within large-scale evaluation RCTs with different populations.

Availability of data and materials

The research data supporting the findings of this study are stored in the Zenodo repository with the identifier doi.org/10.5281/zenodo.5796065. Access to the data stored in Zenodo is available upon written request to the corresponding author.

Abbreviations

LICBT:

Low Intensity Cognitive-Behavioural Therapy

M.I.N.I.:

Mini-International Neuropsychiatric Interview

RCT:

Randomised Controlled Trial

SWAT:

Study Within A Trial

References

  1. McDonald AM, Knight RC, Campbell MK, Entwistle VA, Grant AM, Cook JA, et al. What influences recruitment to randomised controlled trials? A review of trials funded by two UK funding agencies. Trials. 2006. https://doi.org/10.1186/1745-6215-7-9.

  2. Sully BGO, Julious SA, Nicholl J. A reinvestigation of recruitment to randomised, controlled, multicenter trials: a review of trials funded by two UK funding agencies. Trials. 2013. https://doi.org/10.1186/1745-6215-14-166.

  3. Walters SJ, Dos Anjos B, Henriques-Cadby I, Bortolami O, Flight L, Hind D, et al. Recruitment and retention of participants in randomised controlled trials: a review of trials funded and published by the United Kingdom Health Technology Assessment Programme. BMJ Open. 2017. https://doi.org/10.1136/bmjopen-2016-015276.

  4. Treweek S, Pitkethly M, Cook J, Fraser C, Mitchell E, Sullivan F, et al. Strategies to improve recruitment to randomised trials. Cochrane Database Syst Rev. 2018. https://doi.org/10.1002/14651858.MR000013.pub6.

  5. Christley RM. Power and error: increased risk of false positive results in underpowered studies. Open Epidemiol J. 2010. https://doi.org/10.2174/1874297101003010016.

  6. Campbell MK, Snowdon C, Francis D, Elbourne D, McDonald AM, Knight R, et al. Recruitment to randomised trials: strategies for trial enrollment and participation study. The STEPS study. Health Technol Assess. 2007. https://doi.org/10.3310/hta11480.

  7. Treweek S, Bevan S, Bower P, Campbell M, Christie J, Clarke M, et al. Trial forge guidance 1: what is a study within a trial (SWAT)? Trials. 2018. https://doi.org/10.1186/s13063-018-2535-5.

  8. Bower P, Brueton V, Gamble C, Treweek S, Smith CT, Young B, et al. Interventions to improve recruitment and retention in clinical trials: a survey and workshop to assess current practice and future priorities. Trials. 2014. https://doi.org/10.1186/1745-6215-15-399.

  9. Rick J, Graffy J, Knapp P, Small N, Collier DJ, Eldridge S, et al. Systematic techniques for assisting recruitment to trials (START): study protocol for embedded, randomized controlled trials. Trials. 2014. https://doi.org/10.1186/1745-6215-15-407.

  10. Healy P, Galvin S, Williamson PR, Treweek S, Whiting C, Maeso B, et al. Identifying trial recruitment uncertainties using a James Lind Alliance Priority setting partnership - the PRioRiTy (Prioritising recruitment in randomised trials) study. Trials. 2018. https://doi.org/10.1186/s13063-018-2544-4.

  11. Treweek S, Altman DG, Bower P, Campbell M, Chalmers I, Cotton S, et al. Making randomised trials more efficient: report of the first meeting to discuss the trial forge platform. Trials. 2015. https://doi.org/10.1186/s13063-015-0776-0.

  12. Kearney A, Harman NL, Rosala-Hallas A, Beecher C, Blazeby JM, Bower P, et al. Development of an online resource for recruitment research in clinical trials to organise and map current literature. Clin Trials. 2018. https://doi.org/10.1177/1740774518796156.

  13. Gillies K, Kearney A, Keenan C, Treweek S, Hudson J, Brueton VC, et al. Strategies to improve retention in randomised trials. Cochrane Database Syst Rev. 2021. https://doi.org/10.1002/14651858.MR000032.pub3.

  14. Madurasinghe VW, Bower P, Eldridge S, Collier D, Graffy J, Treweek S, et al. Can we achieve better recruitment by providing better information? Meta-analysis of 'studies within a trial' (SWATs) of optimised participant information sheets. BMC Med. 2021. https://doi.org/10.1186/s12916-021-02086-2.

  15. Jennings CG, MacDonald TM, Wei L, Brown MJ, McConnachie L, Mackenzie IS. Does offering an incentive payment improve recruitment to clinical trials and increase the proportion of socially deprived and elderly participants? Trials. 2015. https://doi.org/10.1186/s13063-015-0582-8.

  16. Jolly K, Sidhu M, Bower P, Madurasinghe V. Improving recruitment to a study of telehealth management for COPD: a cluster randomised controlled 'study within a trial' (SWAT) of a multimedia information resource. Trials. 2019. https://doi.org/10.1186/s13063-019-3496-z.

  17. Mattock HC, Ryan R, O'Farrelly C, Babalis D, Ramchandani PG. Does a video clip enhance recruitment into a parenting trial? Learnings from a study within a trial. Trials. 2020. https://doi.org/10.1186/s13063-020-04779-0.

  18. Liu Y, Pencheon E, Hunter RM, Moncrieff J, Freemantle N. Recruitment and retention strategies in mental health trials - a systematic review. PLoS One. 2018. https://doi.org/10.1371/journal.pone.0203127.

  19. Edwards PJ, Roberts I, Clarke MJ, Diguiseppi C, Wentz R, Kwan I, et al. Methods to increase response to postal and electronic questionnaires. Cochrane Database Syst Rev. 2009. https://doi.org/10.1002/14651858.MR000008.pub4.

  20. McCaffery J, Mitchell A, Fairhurst C, Cockayne S, Rodgers S, Relton C, et al. Does handwriting the name of a potential trial participant on an invitation letter improve recruitment rates? A randomised controlled study within a trial. F1000Res. 2019. https://doi.org/10.12688/f1000research.18939.1.

  21. Dillman DA, Lesser V, Mason R, Carlson J, Willits F, Robertson R, et al. Personalization of mail surveys for general public and populations with a group identity: results from nine studies. Rural Sociol. 2007. https://doi.org/10.1526/003601107782638693.

  22. Muñoz-Leiva F, Sánchez-Fernández J, Montoro-Ríos F, Ibáñez-Zapata JÁ. Improving the response rate and quality in web-based surveys through the personalization and frequency of reminder mailings. Qual Quant. 2010. https://doi.org/10.1007/s11135-009-9256-5.

  23. Sauermann H, Roach M. Increasing web survey response rates in innovation research: an experimental study of static and dynamic contact design features. Res Policy. 2013. https://doi.org/10.1016/j.respol.2012.05.003.

  24. Woodford J, Norbäck K, Hagström J, Grönqvist H, Parker A, Arundel C, et al. Study within a trial (SWAT) protocol. Investigating the effect of personalised versus non-personalised study invitations on recruitment: an embedded randomised controlled recruitment trial. Contemp Clin Trials Commun. 2020. https://doi.org/10.1016/j.conctc.2020.100572.

  25. Hughes-Morley A, Young B, Waheed W, Small N, Bower P. Factors affecting recruitment into depression trials: systematic review, meta-synthesis and conceptual framework. J Affect Disord. 2015. https://doi.org/10.1016/j.jad.2014.10.005.

  26. Ljungman L, Cernvall M, Grönqvist H, Ljótsson B, Ljungman G, von Essen L. Long-term positive and negative psychological late effects for parents of childhood cancer survivors: a systematic review. PLoS One. 2014. https://doi.org/10.1371/journal.pone.0103340.

  27. Michel G, Brinkman TM, Wakefield CE, Grootenhuis M. Psychological outcomes, health-related quality of life, and neurocognitive functioning in survivors of childhood cancer and their parents. Pediatr Clin N Am. 2020. https://doi.org/10.1016/j.pcl.2020.07.005.

  28. Öhman M, Woodford J, von Essen L. Socioeconomic consequences of parenting a child with cancer for fathers and mothers in Sweden: a population-based difference-in-difference study. Int J Cancer. 2020. https://doi.org/10.1002/ijc.33444.

  29. Kukkola L, Hovén E, Cernvall M, von Essen L, Grönqvist H. Perceptions of support among Swedish parents of children after end of successful cancer treatment: a prospective, longitudinal study. Acta Oncol. 2017. https://doi.org/10.1080/0284186X.2017.1374554.

  30. Hocking MC, Kazak AE, Schneider S, Barkman D, Barakat LP, Deatrick JA. Parent perspectives on family-based psychosocial interventions in pediatric cancer: a mixed-methods approach. Support Care Cancer. 2014. https://doi.org/10.1007/s00520-013-2083-1.

  31. Kearney JA, Salley CG, Muriel AC. Standards of psychosocial care for parents of children with cancer. Pediatr Blood Cancer. 2015. https://doi.org/10.1002/pbc.25761.

  32. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M, et al. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008. https://doi.org/10.1136/bmj.a1655.

  33. Skivington K, Matthews L, Simpson SA, Craig P, Baird J, Blazeby JM, et al. A new framework for developing and evaluating complex interventions: update of Medical Research Council guidance. BMJ. 2021. https://doi.org/10.1136/bmj.n2061.

  34. Woodford J, Farrand P, Hagström J, Hedenmalm L, von Essen L. Internet-administered cognitive behavioral therapy for common mental health difficulties in parents of children treated for cancer: intervention development and description study. JMIR Form Res. 2021. https://doi.org/10.2196/22709.

  35. Ljungman L, Boger M, Ander M, Ljótsson B, Cernvall M, von Essen L, et al. Impressions that last: particularly negative and positive experiences reported by parents five years after the end of a child's successful cancer treatment or death. PLoS One. 2016. https://doi.org/10.1371/journal.pone.0157076.

  36. Ljungman L, Cernvall M, Ghaderi A, Ljungman G, von Essen L, Ljótsson B. An open trial of individualized face-to-face cognitive behavior therapy for psychological distress in parents of children after end of treatment for childhood cancer including a cognitive behavioral conceptualization. PeerJ. 2018. https://doi.org/10.7717/peerj.4570.

  37. Wikman A, Kukkola L, Börjesson H, Cernvall M, Woodford J, Grönqvist H, et al. Development of an internet-administered cognitive behavior therapy program (ENGAGE) for parents of children previously treated for cancer: participatory action research approach. J Med Internet Res. 2018. https://doi.org/10.2196/jmir.9457.

  38. Woodford J, Wikman A, Einhorn K, Cernvall M, Grönqvist H, Romppala A, et al. Attitudes and preferences toward a hypothetical trial of an internet-administered psychological intervention for parents of children treated for cancer: web-based survey. JMIR Ment Health. 2018. https://doi.org/10.2196/10085.

  39. Woodford J, Wikman A, Cernvall M, Ljungman G, Romppala A, Grönqvist H, et al. Study protocol for a feasibility study of an internet-administered, guided, CBT-based, self-help intervention (ENGAGE) for parents of children previously treated for cancer. BMJ Open. 2018. https://doi.org/10.1136/bmjopen-2018-023708.

  40. Madurasinghe VW, Eldridge S, on behalf of the MRC START Group, Forbes G, on behalf of the START Expert Consensus Group. Guidelines for reporting embedded recruitment trials. Trials. 2016. https://doi.org/10.1186/s13063-015-1126-y.

  41. Sheehan DV, Lecrubier Y, Sheehan KH, Amorim P, Janavs J, Weiller E, et al. The Mini-international neuropsychiatric interview (M.I.N.I.): the development and validation of a structured diagnostic psychiatric interview for DSM-IV and ICD-10. J Clin Psychiatry. 1998;59(Suppl 20):22–33.


  42. Levine M, Ensom MH. Post hoc power analysis: an idea whose time has passed? Pharmacotherapy. 2001. https://doi.org/10.1592/phco.21.5.405.34503.

  43. Treweek S, Bevan S, Bower P, Briel M, Campbell M, Christie J, et al. Trial forge guidance 2: how to decide if a further study within a trial (SWAT) is needed. Trials. 2020. https://doi.org/10.1186/s13063-019-3980-5.

  44. Ljungman L, Hovén E, Ljungman G, Cernvall M, von Essen L. Does time heal all wounds? A longitudinal study of the development of posttraumatic stress symptoms in parents of survivors of childhood cancer and bereaved parents. Psychooncology. 2015. https://doi.org/10.1002/pon.3856.

  45. Hovén E, Ljungman L, Boger M, Ljótsson B, Silberleitner N, von Essen L, et al. Posttraumatic stress in parents of children diagnosed with cancer: hyperarousal and avoidance as mediators of the relationship between re-experiencing and dysphoria. PLoS One. 2016. https://doi.org/10.1371/journal.pone.0155585.

  46. Ley B, Rijal KR, Marfurt J, Adhikari NR, Banjara MR, Shrestha UT, et al. Analysis of erroneous data entries in paper based and electronic data collection. BMC Res Notes. 2019. https://doi.org/10.1186/s13104-019-4574-8.

  47. Hennrich P, Arnold C, Wensing M. Effects of personalized invitation letters on research participation among general practitioners: a randomized trial. BMC Med Res Methodol. 2021. https://doi.org/10.1186/s12874-021-01447-y.

  48. Cochrane A, Welch C, Fairhurst C, Cockayne S, Torgerson DJ, OTIS Study Group. An evaluation of a personalised text message reminder compared to a standard text message on postal questionnaire response rates: an embedded randomised controlled trial. F1000Res. 2020. https://doi.org/10.12688/f1000research.22361.1.

  49. Mitchell AS, Cook L, Dean A, Fairhurst C, Northgraves M, Torgerson DJ, et al. An embedded randomised controlled retention trial of personalised text messages compared to non-personalised text messages in an orthopaedic setting [version 2; peer review: 1 approved]. F1000Research. 2021. https://doi.org/10.12688/f1000research.24244.2.

  50. Cureton L, Marian IR, Barber VS, Parker A, Torgerson DJ, Hopewell S. Randomised study within a trial (SWAT) to evaluate personalised versus standard text message prompts for increasing trial participant response to postal questionnaires (PROMPTS). Trials. 2021. https://doi.org/10.1186/s13063-021-05452-w.


Acknowledgements

We wish to thank Ian Horne (Portal team member) for assistance developing Portal-based study procedures and performing all data extractions on the Portal. We thank research assistant Christina Reuther and data coordinator Agnes von Essen for their assistance with data entry, data processing, and data organisation. We also thank the York Trials Unit (Department of Health Sciences, University of York) for their support developing the original study protocol and invitation letters.

Funding

Open access funding provided by Uppsala University. This work is supported by the Swedish Research Council (grant numbers 521-2014-3337 / E0333701 and 2018-02578), the Swedish Cancer Society (grant numbers 15 0673 and 17 0709), and funding via the Swedish Research Council to U-CARE, a Strategic Research Environment (Dnr 2009–1093). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Author information

Contributions

Author contributions are written in accordance with the CRediT statement: ET: investigation, validation, data curation, writing – original draft, visualisation; MÖ: formal analysis, writing – review and editing; JW: methodology, writing – original draft, supervision, project administration; LvE: conceptualisation, methodology, resources, writing – review and editing, supervision, project administration, funding acquisition. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Louise von Essen.

Ethics declarations

Ethics approval and consent to participate

The ENGAGE host feasibility trial was approved by the Regional Ethical Review Board in Uppsala, Sweden (Dnr: 2017/527) and was conducted in accordance with the Declaration of Helsinki, ensuring the welfare and rights of all participants, and Good Clinical Practice (GCP) guidelines. An ethical amendment for the SWAT was obtained from the Swedish Ethical Review Authority on August 07, 2019, ref.: 2019–03083. Eligible participants were not informed about the study within a trial, and were therefore unaware they were participating in an embedded recruitment trial and unable to provide informed consent to it. Online informed consent was provided by all study participants in the ENGAGE host feasibility trial.

Consent for publication

Not applicable.

Competing interests

The authors have no competing interests to declare.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Thiblin, E., Woodford, J., Öhman, M. et al. The effect of personalised versus non-personalised study invitations on recruitment within the ENGAGE feasibility trial: an embedded randomised controlled recruitment trial. BMC Med Res Methodol 22, 65 (2022). https://doi.org/10.1186/s12874-022-01553-5

Keywords

  • Recruitment
  • Retention
  • Study invitation
  • Study within a trial
  • Trial methodology
  • Randomised controlled trial