We Measure Results
Research and Metrics at Pathfinder
Pathfinder believes that high-quality data lead to better programs, stronger accountability to donors, and a clearer understanding of "what works."
Pathfinder programs aim to produce and use meaningful data to assure program quality and effectiveness, and to share insights from our work with the international public health community. Metrics are indispensable to Pathfinder programs: these standards of measurement enable us to determine how much progress projects have made and to quantify what we have learned.
Our Research and Metrics staff focus on building capacity for data-based evaluation for program learning, ensuring that the data Pathfinder programs collect are valid, reliable, and appropriate for their intended use. Pathfinder ensures that evidence for evaluation is available, stands up to scrutiny, and is understood by program staff, implementing partners, government counterparts, and donors. In our definition, "data" are objective measurements obtained by applying rigorous quantitative and qualitative methods.
Pathfinder’s Evaluation Strategy: Building a Chain of Evidence
Pathfinder’s evaluation strategy is simple: follow the logic of the project to develop a “chain of evidence.”
We begin by periodically tracking output indicators to measure progress and completion of project activities. For example, did we train enough health workers to provide services in the facilities we support? Tracking outputs can tell us how efficiently the work plan is being implemented and provide insight into how to manage the project effectively. Next, we measure the direct effects of these activities, to assess whether they 'worked' in the ways intended. For example, did training providers result in improved skills, improved quality of the services they provide, and increased availability of those services? Effectiveness indicators can usually be measured at the point of service and so can also be tracked periodically.
Finally, at the start of a project and again near the project’s end, we measure indicators of the larger outcomes the project aims to change. Measures of these outcomes, such as contraceptive prevalence, are often collected in household surveys, among the people expected to benefit from the project. Outcome measures are important, because it is only by looking at changes in these measures that we learn whether a particular constellation of activities contributes to program objectives.
Together these three sets of measures build a chain of evidence, establishing the plausible contribution of the project to observed changes or improvements in outcomes and, by extension, to achievement of the program goal. Each indicator in the chain is an essential component of project evaluation. We ensure a balance between counting outputs, measuring the direct effects of these outputs, and measuring the overall outcomes for program beneficiaries. Higher level impacts such as changes in mortality or fertility are affected by many factors outside the control of a program and often cannot be detected during the timespan of a single project. We do not usually attempt to measure these indicators of program goals, but instead try to balance what we might ideally measure with what is feasible to measure within the resources of the project.
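The three levels of indicators described above, and the practice of tracking progress against targets, can be illustrated with a small sketch. Everything below is hypothetical: the indicator names, targets, and values are invented for illustration and do not represent any actual Pathfinder system or project data.

```python
# Hypothetical sketch of tracking the three indicator levels in the
# chain of evidence (outputs, direct effects, outcomes) against targets.
# All names and numbers are illustrative, not real project data.

def percent_of_target(actual, target):
    """Return progress toward a target as a percentage, rounded to 1 decimal."""
    if target == 0:
        raise ValueError("target must be non-zero")
    return round(100.0 * actual / target, 1)

# One illustrative indicator per level of the chain of evidence.
indicators = [
    {"level": "output",  "name": "health workers trained",
     "target": 120, "actual": 96},
    {"level": "effect",  "name": "facilities meeting quality standard",
     "target": 40, "actual": 30},
    {"level": "outcome", "name": "modern contraceptive users (survey)",
     "target": 5000, "actual": 4100},
]

for ind in indicators:
    progress = percent_of_target(ind["actual"], ind["target"])
    print(f"{ind['level']:8} {ind['name']}: {progress}% of target")
```

Reviewing all three levels side by side, rather than outputs alone, is what lets a manager see whether completed activities are actually translating into the direct effects and outcomes the project intends.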
Pathfinder has an efficient, data-based system for monitoring project performance: it captures data at the project level, then aggregates and interprets them to assess the progress of each country program as well as Pathfinder's global portfolio. The system is built around Key Indicator Tables, which each Pathfinder project uses to compile and analyze quarterly progress against targets for performance indicators, allowing managers to review program progress and make necessary adjustments based on quantitative data.

Studies to Enhance Program Learning
At Pathfinder, we conduct several types of research to enhance program learning:
- evaluation research (data-based evaluation),
- operations research (assessing different ways to deliver services or solve programmatic problems), and
- studies designed to contribute to the evidence base for reproductive health programming.
Pathfinder’s Research and Metrics team is currently implementing a Board-funded research project, Strengthening the Evidence Base to Improve and Expand Reproductive Health Programs. Pathfinder is committed to enhancing program learning by sharing research results through our Research and Evaluation Working Papers, peer-reviewed journal articles, and conference presentations and posters.

Staff Capabilities
At headquarters, the five-member Research and Metrics Unit’s expertise lies in developing methods, measures, and tools to collect data that are valid, reliable, and useful for monitoring performance and generating evidence to learn from our work. The team is multidisciplinary, with training in sociology, demography, epidemiology, biostatistics, and qualitative research methods.
This tool helps users conduct a concise assessment of a partner organization’s (or potential partner’s) strengths and weaknesses, identifying areas where technical assistance will be needed to implement a project successfully.
Addressing Unmet Need for Contraception among HIV-Positive Women: Endline Survey Results and Comparison with the Baseline
This is a report of a facility-based endline survey that was conducted as part of a program evaluation to assess the Arise (Enhancing HIV Prevention for At-Risk Populations) project in Uganda.
Addressing Unmet Need for Contraception among HIV-Positive Women: A Qualitative Study of the Arise Project in Uganda
This report presents the findings from a qualitative study conducted in January 2014 in Lango and Teso regions of Uganda among Arise Project beneficiaries and service providers.
In 2012, HoPE LVB conducted a baseline study to inform project design and determine baseline values for key outcome indicators.
Evaluating the Coverage and Cost of Community Health Worker Programs in Nampula Province in Mozambique
In 2012, Pathfinder conducted a study in Mozambique to explore whether community health workers who provide an integrated package of services communicate with beneficiaries about family planning, and what actions women take based on these messages.
The objectives of this baseline survey were to assess the knowledge and practice of family planning, examine the use of facility-based services, identify channels through which information about family planning could be shared, and inform decision-making.
This baseline study was conducted to establish pre-intervention values for evaluating the performance of the Arise (Enhancing HIV Prevention for At-Risk Populations) project in Uganda.
This technical brief summarizes the evolution of PRACHAR, describes the intervention model and key evaluation results that informed each phase, and highlights next steps for dissemination and advocacy based on 11 years of project learning.