We Measure Results

Photos: a health worker in Ethiopia (Sala Lewis); family planning monitoring; Peru and Egypt (Pathfinder International)

Research and Metrics at Pathfinder

Pathfinder believes that high quality data lead to better programs, better accountability to donors, and a better understanding of "what works."

Pathfinder programs aim to produce and use meaningful data to assure program quality and effectiveness, and to share insights from our work with the international public health community. Metrics are indispensable to Pathfinder programs. These standards of measurement enable us to determine how much progress projects have made and quantify evidence of what we have learned.

Our Research and Metrics staff focus on building capacity for data-based evaluation for program learning, ensuring that the data Pathfinder programs collect are valid, reliable, and appropriate for their intended use. Pathfinder ensures that evidence for evaluation is available, stands up to scrutiny, and is understood by program staff, implementing partners, government counterparts and donors. In our definition, "data" are objective measurements obtained by applying rigorous quantitative and qualitative methods.

Pathfinder’s Evaluation Strategy: Building a Chain of Evidence

Pathfinder’s evaluation strategy is simple: follow the logic of the project to develop a “chain of evidence.”

We begin by periodically tracking output indicators to measure the progress and completion of project activities. For example, did we train enough health workers to provide services in the facilities we support? Tracking outputs tells us how efficiently the work plan is being implemented and provides insight into how best to manage the project. Next, we measure the direct effects of these activities to assess whether they ‘worked’ in the ways intended. For example, did training providers improve their skills, the quality of the services they provide, and the availability of those services? Effectiveness indicators can usually be measured at the point of service, and so can also be tracked periodically.

Finally, at the start of a project and again near the project’s end, we measure indicators of the larger outcomes the project aims to change. Measures of these outcomes, such as contraceptive prevalence, are often collected in household surveys, among the people expected to benefit from the project. Outcome measures are important, because it is only by looking at changes in these measures that we learn whether a particular constellation of activities contributes to program objectives.

Together these three sets of measures build a chain of evidence, establishing the plausible contribution of the project to observed changes or improvements in outcomes and, by extension, to achievement of the program goal. Each indicator in the chain is an essential component of project evaluation. We ensure a balance between counting outputs, measuring the direct effects of these outputs, and measuring the overall outcomes for program beneficiaries. Higher level impacts such as changes in mortality or fertility are affected by many factors outside the control of a program and often cannot be detected during the timespan of a single project. We do not usually attempt to measure these indicators of program goals, but instead try to balance what we might ideally measure with what is feasible to measure within the resources of the project.

Performance Monitoring

Pathfinder has an efficient, data-based system for monitoring project performance: it captures data at the project level, then aggregates and interprets them to assess the progress of each country program as well as Pathfinder’s global portfolio. The system is built around Key Indicator Tables, which each Pathfinder project uses to compile and analyze quarterly progress against targets for its performance indicators, allowing managers to review program progress and make necessary adjustments based on quantitative data.
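To illustrate the idea (a simplified sketch only, not Pathfinder’s actual system or data), a Key Indicator Table can be thought of as a set of indicators whose quarterly results are compared against targets. The short Python example below, with hypothetical indicator names and figures, shows one way such quarterly results might be compiled and flagged against annual targets:

# Illustrative sketch only: a minimal "key indicator table" compiling quarterly
# results against annual targets. All indicator names, targets, and figures
# below are hypothetical.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    annual_target: int
    quarterly_results: list[int]  # one entry per completed quarter

    @property
    def cumulative(self) -> int:
        # Progress to date is the sum of reported quarterly results.
        return sum(self.quarterly_results)

    @property
    def percent_of_target(self) -> float:
        return 100 * self.cumulative / self.annual_target

def summarize(indicators: list[Indicator]) -> None:
    """Print progress against target for each indicator, flagging shortfalls."""
    for ind in indicators:
        # The 75% threshold is arbitrary, for illustration only.
        flag = "" if ind.percent_of_target >= 75 else "  <-- behind target"
        print(f"{ind.name}: {ind.cumulative}/{ind.annual_target} "
              f"({ind.percent_of_target:.0f}% of target){flag}")

if __name__ == "__main__":
    summarize([
        Indicator("Providers trained", 400, [120, 110, 95]),
        Indicator("Facilities supported", 60, [10, 12, 11]),
    ])

In practice such tables would be maintained by each project and aggregated across country programs and the global portfolio; the sketch shows only the per-indicator comparison against targets.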

Studies to Enhance Program Learning

At Pathfinder, we conduct several types of research to enhance program learning:

  • evaluation research (data-based evaluation)
  • operations research (assessing different ways to deliver services or solve programmatic problems)
  • studies designed to contribute to the evidence base for reproductive health programming

Pathfinder’s Research and Metrics team is currently implementing a Board-funded research project, Strengthening the Evidence Base to Improve and Expand Reproductive Health Programs. Pathfinder is committed to enhancing program learning by sharing research results through our Research and Evaluation Working Papers, peer-reviewed journal articles, conference presentations, and posters.

Staff Capabilities

At headquarters, the five-member Research and Metrics Unit’s expertise lies in developing methods, measures, and tools to collect data that are valid, reliable, and useful for monitoring performance and generating evidence to learn from our work. The team is multidisciplinary, with training in sociology, demography, epidemiology, biostatistics, and qualitative research methods.

Video: Simple Data

Related Publications

March 2014

Health of People and Environment Lake Victoria Basin Baseline Study: Synthesis Report

In 2012, HoPE LVB conducted a baseline study to inform project design and determine baseline values for key outcome indicators.

November 2013

Evaluating the Coverage and Cost of Community Health Worker Programs in Nampula Province in Mozambique

In 2012, Pathfinder conducted a study in Mozambique to explore whether community health workers who provide an integrated package of services communicate with beneficiaries about family planning, and what actions women take based on these messages.

June 2013

Extending Service Delivery – Family Planning Initiative (ESD-FPI): Baseline Report

The objectives of this baseline survey were to assess the knowledge and practice of family planning, examine the use of facility-based services, identify channels through which information about family planning could be shared, and inform decision-making.

March 2013

Addressing Unmet Need for Contraception Among HIV-Positive Women

This baseline study was conducted to assess the performance of the ARISE (Enhancing HIV Prevention for At-Risk Populations) project in Uganda.

January 2013

PRACHAR: Advancing Young People’s Sexual and Reproductive Health and Rights in India

This technical brief summarizes the evolution of PRACHAR, describes the intervention model and key evaluation results that informed each phase, and highlights next steps for dissemination and advocacy based on 11 years of project learning.

July 2012

Strengthening Strategic Health Information Systems in Kenya's North Eastern Province

This technical brief discusses steps taken by the project to meet challenges to the use of strategic health information in Kenya’s North Eastern Province, and provides recommendations for future similar efforts in comparable contexts.

January 2012

Strengthening Your Organization: A Series of Modules and Reference Materials for NGO and CBO Managers and Policy Makers - Monitoring and Evaluation and MIS

January 2012

The Effect of Reproductive Health Communication Interventions on Age at Marriage and First Birth in Rural Bihar, India

This paper describes the results of a survey of participants in an adolescent education program implemented by the PRACHAR project in rural Bihar.
