Table 4 Value created by using process evaluation knowledge

From: What do we want to get out of this? a critical interpretive synthesis of the value of process evaluations, with a practical planning framework

Theme: Process evaluation knowledge supporting implementation of interventions into practice

Sub-theme: Improving implementation during the evaluation

Continuously checking and making adjustments to keep interventions ‘on track’ [93] by monitoring and correcting fidelity, adaptations, reach, and/or dose [24, 44, 48, 77, 93, 94]

Sub-theme: Developing interventions more likely to be implemented successfully

Formative process evaluation during piloting enhances the development of a sustainable and adaptable intervention, and builds robust implementation processes that increase the likelihood of effectiveness in the main trial [46]

Formative process evaluation across the entire evaluation allows implementation to be optimised and strengthened in real time [60, 89, 95]

Sub-theme: Informing about transferability to other contexts post-evaluation

Understanding of the required conditions for interventions to have desired effects, and assessment of intervention transferability to different settings [35, 40, 82, 96]

Enabling judgement about whether mechanisms would have the same effect in different settings [1, 97]

Acceptability of interventions [98]

Responses of different subgroups [27]

Sub-theme: Informing how best to implement the intervention post-evaluation

Necessary conditions for implementation to be effective in systems, such as new policies [95] and allocation of sufficient resources [93]

Necessary training and support for intervention deliverers [91, 99, 100, 101]

How to tailor and adapt interventions in different contexts [40, 53, 62, 81, 83, 99]

Strategies and monitoring systems to support implementation [46, 99, 102, 103, 104]

Informing about relative importance and optimisation of different intervention components [6, 31, 40, 74, 104]

Describing how flexible interventions were delivered in the evaluation to aid replication [12]

Assessment of the extent to which the intervention is deliverable in practice in the intended way [86]

Sub-theme: Enhancing likelihood of intervention being implemented in practice post-evaluation

Engaging stakeholders during process evaluation may contribute to successful implementation by those stakeholders after the evaluation [28, 105]

Understanding processes of integrating interventions in dynamic complex settings [106]

Providing evidence of feasibility and helping to convince clinicians and policymakers to adopt controversial but effective interventions [13]

Highlighting potential implementation difficulties [13]

Providing evidence of how an intervention works in different contexts, which may make it more likely to be adopted in practice [96]

Theme: Process evaluation knowledge informing development of interventions

Sub-theme: Intervention modification

Optimisation through revealing reasons for positive outcomes [53, 84]

Modification to avoid potentially harmful unintended effects [42, 107]

Improvements to acceptability and usability [108, 109]

Removing or modifying intervention components [70, 91, 99, 110]

Informing effective tailoring of interventions to different populations and contexts [62, 84, 99, 111]

Improvements to intervention design [86]

Sub-theme: Developing intervention theory

Developing, testing, and refining intervention theory and causal mechanisms [33, 53, 83, 96, 112]

Sub-theme: Future intervention design

Process evaluations providing insights into reasons for ineffective interventions can provide knowledge to inform development of future interventions [90]

Theme: Process evaluation knowledge improving practice and outcomes

Sub-theme: Improvements during the evaluation

Formative process evaluations facilitated intervention development and therefore improved practice and outcomes [29, 30, 37, 94, 113]

Improving standard care at trial sites by exposing gaps in current provision [12]

Designing a high-quality process evaluation from the outset of the evaluation can help examine programme logic and the potential for additional positive outcomes [114]

Participation in process evaluation may have helped intervention reach goal of empowering youth [32]

Sub-theme: Improvements after the evaluation

Process evaluation knowledge can ultimately improve practice and outcomes in groups targeted by interventions through:

• Facilitating timely implementation of effective interventions into practice [96, 103, 114]

• Providing understanding of how interventions work [115]

• Enhancing understanding of complexity [2]

Knowledge about patient experience may help clinicians and patients decide which intervention to choose in practice if both are found to have similar effects in an RCT [13]

Improving patient-centred care by considering patient views [116]

Revealing and addressing inequalities in participant responses which may be masked by aggregate positive trial results [1]

Theme: Process evaluation knowledge contributing to wider knowledge

Sub-theme: Wider knowledge about interventions

Informing wider theories about similar interventions [57, 117, 118, 119, 120]

Generating questions and hypotheses for future research [9]

Highlighting the need for other interventions to target different subgroups [121]

Sub-theme: Wider knowledge about implementation science

Knowledge about successful implementation strategies and behaviour change techniques [33, 71, 109, 122, 123]

Understanding variation in outcome results according to factors associated with staff delivering interventions may be useful to inform wider research, policy, and practice [55, 81]

Contributing insights into what facilitates implementation in public health programs [114]

Sub-theme: Wider knowledge about contexts

Contributing to the evidence base about which types of interventions are fruitful to pursue or modify, and which should be avoided, within certain fields of practice [26, 47]

Sub-theme: Wider knowledge about research methods

Methodological and theoretical contributions to process evaluation literature [1, 27, 29, 35, 84, 96, 99, 124, 125]

Knowledge about optimal trial designs [90]

Theme: Financial value of process evaluation knowledge

Sub-theme: Reducing costs of interventions

Identifying the active ingredients of interventions to inform removing minimally effective components [6, 40, 57]

Demonstrating the feasibility of implementing an intervention in practice without a research grant [93]

Sub-theme: Justifying cost of evaluations

By explaining outcome results, process evaluations may help justify money spent on trials whose outcomes are not positive [28, 126]

Justifying costs of the intervention to funders [127]

Sub-theme: Informing financial management in wider contexts

Explaining outcome results may help avoid future expensive mistakes in interventions, theory, and research [67, 92]

Understanding the mechanisms of interventions, and how they may affect other areas of health systems, may inform wider health investment [128]

Sub-theme: Avoiding research waste

Better provision of information on the influence of context on trial outcomes may help stop trial findings being ignored by policymakers and practitioners [129]

The role of process evaluation knowledge in increasing the likelihood of interventions being successfully transferred to practice may be used to justify the expense of process evaluations [67]

Sub-theme: Ensuring interventions are implemented correctly during evaluations

Formative monitoring and correction of implementation may avoid the financial waste of researching interventions which are not implemented correctly [64, 118]

Theme: Value of process evaluation knowledge to the outcome evaluation

Sub-theme: Adding knowledge not provided by the outcome evaluation

Unpacking an aggregate positive or negative outcome result which may mask considerable differences in individual benefit of interventions [1, 31, 82]

Reasons for variability in outcomes and implementation [95]

Qualitative process evaluations may discover unexpected outcomes that are difficult to predict or access using experimental methods [33, 63]

Investigating contextual factors not taken into account by outcome evaluation [33, 82]

Explaining why interventions do or do not show effect in an outcome evaluation [58, 117]

Providing knowledge about how interventions work in practice, including aspects of the intervention of which investigators were unaware [130] and which aspects of the intervention are most important [109]

Providing richer knowledge of how change occurred in ways that mattered to participants [33]

Factors contributing to intervention implementation, including negotiations and compromises necessary for successful implementation [34]

Unanticipated benefits of interventions [95]

*Negative qualitative findings potentially demoralising trial team [92]

Sub-theme: Increasing the credibility of outcome evaluation methods

By adding knowledge to address criticisms of limitations of RCTs [81], process evaluations improve the science of RCTs, and help prevent abandonment of RCTs in favour of less rigorous non-experimental or non-randomised research methods [88]

Perceptions that process evaluations address the tendency of experimental evaluators to overlook vital information [1, 38, 54, 55]

Sub-theme: Improving or interpreting the quality of outcome evaluation results

Providing summative information about external validity [126, 131] and internal validity [111]

Avoiding ‘type III errors’, or ‘false-negative’ trial results, where lack of effect is caused by poor implementation [87, 131]

Formative process evaluations may help avoid erroneous trial results through maximising fidelity and therefore internal validity [48, 98, 119]

Providing information to enable selection of most appropriate statistical methods for outcome evaluation [5]

Providing knowledge about changes in implementation over time [59] and learning curve effects [55] to help interpret outcome results

Investigating potentially problematic areas of pragmatic trial design and conduct to support validity of outcome results [12]

Through qualitative participatory process evaluation achieving ‘a more robust, rigorous and reliable source of evidence than the single stories that conventional quantitative impact evaluations generate’ [33]

Sub-theme: Improving outcome evaluation methods

Formative process evaluation enabling change to outcome study design prior to commencement [95]