
Table 5 Programme reporting standards (PRS) for SRMNCAH – version 1.0

From: Programme Reporting Standards (PRS) for improving the reporting of sexual, reproductive, maternal, newborn, child and adolescent health programmes

The PRS is a tool that can be used for reporting on the planning, implementation and evaluation of SRMNCAH programmes. The PRS can be used throughout the programme lifecycle, guiding not only the reporting of processes and outcomes but also the programme design and development.
Instructions for using the PRS
• For each reporting item, provide the source and page number where the information can be located.
• If the information provided for an item is missing or insufficient, state “not reported”.
• While users of the PRS should consider the relevance of all items, some items may not be applicable to the programme or the specific report. If an item is irrelevant or beyond the scope of the programme, indicate “N/A”.
• Larger programmes may need to break their reporting into more specific components and topics.
Section and item name | Item description | Reported (source and page) | Not reported | N/A
Programme Overview Why was the programme started and what did it expect to achieve?  
1. Rationale and objectives a. Programme rationale, i.e. why the programme was initiated (nature and significance of the issue or problem being addressed).  
b. Goals and objectives.  
c. Anticipated short- and long-term effects of the programme at different levels (e.g. individual, household, facility, organization, community, society).  
2. Start and end date a. Planned start and end date of the programme.  
b. Delays and/or unexpected end of the programme along with reasons why.  
3. Setting and context a. Where the programme took place, e.g. country name(s), specific locations, urban/rural environments.  
b. Overview of the context (e.g. political, historical, sociocultural, socioeconomic, ethical, legal, health system) pertinent to the programme.  
4. Stakeholders a. Programme target population (key sociodemographic characteristics, e.g. age, gender, ethnicity, education level).  
b. Implementing organization(s).  
c. Partners and other stakeholders (e.g. local authorities, community leaders).  
d. How the different stakeholders were involved in programme development and/or implementation.  
5. Funding source(s) Name of programme donor/funding source(s).  
6. Theory of change and/or logic model Theory of change, assumptions, and/or logic model framework underlying the programme, with details for how this guided the programme design, implementation and evaluation plans.  
7. Human rights perspectives a. If and how gender, equity, rights and ethical considerations were integrated into the programme.  
b. If and how an accountability framework was adopted to define the programme’s commitments and how it would be accountable for these commitments.  
Programme Components and Implementation What did the programme do and how?  
8. Programme planning How activities were decided upon and why (e.g. based on results of a situational or stakeholder analysis, identification of current gaps and needs in programming, or criteria such as the evidence-base, scalability, sustainability of activities).  
9. Piloting Piloting of the programme activities elsewhere or within the programme, and if so how, when, where, by whom and with what results.  
10. Components/Activities (Please repeat for each component) Detailed description of the core programme components/activities, including:
• What was done.
• How (implementation methods/delivery processes/approaches).
• When (frequency, intensity, duration).
• By whom (characteristics, skills, training and responsibilities of implementing personnel, i.e. staff, providers, volunteers).
• For whom (target population for each activity).
• Support materials used and where these can be accessed.
11. Quality assurance mechanisms a. Mechanisms used to ensure quality in the implementation of activities (e.g. supervision and support of personnel, refresher trainings, product quality checks).  
b. Efforts used to increase and sustain participation of stakeholders (e.g. incentives).  
Monitoring of Implementation How did the programme keep track of what was done?  
12. Monitoring mechanisms How the programme implementation process was monitored, including the collection and analysis of indicators to identify problems/solutions.  
13. Coverage/Reach and Drop-out a. Uptake (utilization) of each programme activity, reported by key sociodemographic characteristics.  
b. Coverage of the programme activities, including differential reach in or outside of the target population.  
c. Non-participation and dropout among the target populations, along with key sociodemographic characteristics and reasons why.  
14. Adaptations a. Whether the programme was delivered as intended, e.g. discrepancies between programme design vs. the actual implementation of components, degree of match between programme content and theory of change.  
b. On-going adaptation of the programme activities to better fit the context, and the fidelity to the activity plan.  
15. Acceptability Acceptability of the programme among stakeholders, e.g. assessment of whether the programme was considered to be reasonable and relevant.  
16. Feasibility Assessment of the feasibility of the programme, e.g., the extent to which it could be carried out in the particular context or by the specific organization.  
17. Factors affecting implementation Description of key barriers and facilitators to programme implementation, including contextual factors (e.g. social, political, economic, health systems).  
Evaluation and Results How was the programme evaluated, and what were the findings?  
18. Evaluation a. Type of evaluation(s) conducted (e.g. process evaluation and/or outcome evaluation, quantitative or qualitative).  
b. Evaluation methods: how, when (timing and phases, e.g. baseline, midline, endline) and by whom the programme was evaluated.ᵃ  
19. Results a. Description of the programme results (key process, output, outcome indicators), differentiating between short/mid/long-term effects.  
b. Whether the programme effects differed across key sociodemographic characteristics and/or geographical areas.  
c. Whether the programme had unexpected effects (beyond what was anticipated in the design) on the target population, health services and/or the communities.  
20. Costs a. Summary of the required resources for implementation (e.g. financial, time, human resources, materials, administration).  
b. If and how a cost analysis or cost-effectiveness analysis was conducted.  
Synthesis What are the key implications?  
21. Lessons learnt Appraised weaknesses and strengths of the programme, what worked well and what can be improved.  
22. Sustainability Reflections on the sustainability of the programme over time, e.g. the expected ability to maintain the programme activities, engagement of stakeholders, outcomes achieved, effects, partnerships.  
23. Scalability Description of the scale-up of all or some programme activities, or any plans for scale-up.  
24. Possibilities for implementation in other settings Reflections on the context-dependence of the programme and whether (and with what degree of effort) it could be implemented in or adapted to other settings.  
Additional information (optional) References and/or links to additional sources of information in relation to the programme.  
Any additional comments related to the items reported above.  
ᵃ Reports of research studies should provide further details in line with guidelines for the reporting of the specific study design. Different guidelines are available in the EQUATOR database (http://www.equator-network.org/).