Learning and Evidence
Pact strives to use continual learning to improve global development programming, supported by action-oriented research, monitoring and evaluation.
Are we making the right choices to achieve the impact communities want? Is our programming effective? Are the voices of the people we serve guiding our efforts?
These are just some of the critical questions that Pact asks every day as we implement evidence-based, data-driven development programming around the world. We answer them through our dedicated learning and evidence practice, which is built into all of our work to maximize our impact and ensure our accountability to those we serve.
OUR APPROACH
Pact’s learning and evidence practice uses diverse methods – from complexity-responsive to advanced quantitative – for applied learning and adaptive management, within individual programs and across our impact areas. We believe it is our ethical obligation to use evidence, to contribute to the larger evidence base and to engage those we serve in decisions about how evidence is collected and used. This is fundamental to our vision of thriving, resilient and engaged communities leading their own development.
Pact’s learning and evidence practice is driven by three principles: quality, accountability and engagement.
Quality: Diverse and Rigorous Methods
Pact works to ensure that our evidence base – for program design, adaptive management, learning and evaluation – is built on fit-for-purpose designs that maximize rigor. Our methods range from rapid assessments and light-touch feedback loops that provide regular information for programming adaptation, to quasi-experimental and complexity-responsive qualitative designs that assess our contributions and test alternative approaches. We select all of our methods based on community needs, ensuring that those we serve benefit throughout.
Examples of our quality methods:
- In Ukraine, we used propensity score matching to assess whether our program participants achieved higher civic engagement than comparable members of the general population, which helped us determine whether the program designed for that context was achieving its intended goals (see the illustrative sketch after this list).
- In a multi-country project in Southeast Asia, we used Process Tracing to assess whether our theory of change carried greater inferential weight than plausible alternative change pathways, which helped validate our strategic approach to a complex goal.
- With Indigenous organizations in a regional project in Latin America, we used Outcome Harvesting to discover and substantiate unanticipated outcomes, which helped determine which program activities yielded the greatest impact and which needed adjustment. As a result, the project was able to better reach targeted populations.
- In Cambodia and Colombia, we used survey research and secondary data to measure complex, multivariate governance outcomes, organized into analytical indices, to support mid-program adaptation and outcomes measurement.
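To make the quantitative end of this range concrete, here is a minimal sketch of propensity score matching of the kind described in the Ukraine example above. The data file, column names (participant, civic_engagement and the covariates) and modeling choices are illustrative assumptions only, not Pact’s actual analysis, which would also involve covariate balance checks and sensitivity analysis.

```python
# Illustrative propensity score matching: compare civic engagement between
# program participants and matched non-participants.
# All file names, column names and model choices are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("survey.csv")  # hypothetical survey extract
covariates = ["age", "education_years", "urban", "income_quintile"]

# 1. Estimate each respondent's propensity to participate from observed covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["participant"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Match each participant to the non-participant with the closest propensity score.
treated = df[df["participant"] == 1]
control = df[df["participant"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_control = control.iloc[idx.ravel()]

# 3. Compare mean outcomes to estimate the average effect among participants (ATT).
att = treated["civic_engagement"].mean() - matched_control["civic_engagement"].mean()
print(f"Estimated effect on civic engagement (ATT): {att:.3f}")
```

In practice, a simple nearest-neighbor match like this is only a starting point; checking that covariates are balanced after matching matters as much as the point estimate itself.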
Underscoring all of Pact’s evidentiary work is our commitment to making sustainable, systematic change. All of Pact’s programs undergo an annual assessment against Pact’s Quality Standards, which examines monitoring, evaluation and learning systems across five domains: (1) M&E Planning, Design and Data Collection; (2) Data Management, Security and Accessibility; (3) Data Verification, Analysis and Usage; (4) M&E Capacity and Ethics; and (5) Learning. The Standards also include subdomains related to participatory and empowering MEL processes. Results are used to help programs strengthen their MEL systems so they can better build on successes and learn from failures. Pact has been assessing and improving internal systems using this methodology for more than 10 years.
Accountability: Evidence-Based Solutions
Our approaches are based on evidence – meaning that we test our own approaches and use approaches that have been tested. This ensures that those we serve are supported by strategies grounded in proven, best practices, and that any innovative practice is guided by ethical safeguards. To this end, we promote a culture of data use and curiosity. We are also committed to data transparency, and we work to promote participant ownership of their data. Pact was among the first USAID implementing partners to publish its data through the International Aid Transparency Initiative (IATI), and has done so since 2013.
Examples of Pact’s commitment to evidence-based solutions:
- Pact’s Organizational Performance Index underwent reliability and validity testing to confirm that it measures organizational performance and can be used consistently across the globe by a variety of actors. It is promoted by USAID globally as the recommended standard tool for measuring organizational performance (an illustrative reliability check appears after this list).
- Pact’s approach to adaptive management supports projects in tailoring their MEL systems to their varied operating contexts and goals. Grounded in program theory and complexity science, it helps projects determine what kinds of evidence are needed to answer which strategic questions, and to build data systems that support strategic decision-making.
- In addition to its commitment to IATI, Pact has publicly shared its global indicator data for the past 11 years. Pact’s 14 global indicators include systems measurements, and we use them to challenge ourselves to achieve greater impact at scale.
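As a purely illustrative aside on what reliability testing of a tool like the Organizational Performance Index can involve, the sketch below computes Cronbach’s alpha, a common measure of the internal consistency of a set of index items. The item names and data file are hypothetical; the source describes Pact’s validation only as a reliability and validity test, so this is not a reproduction of that work.

```python
# Illustrative internal-consistency check (Cronbach's alpha) for index items.
# Item names and the data file are hypothetical, not Pact's OPI dataset.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

scores = pd.read_csv("index_item_scores.csv")  # hypothetical file: one column per index item
alpha = cronbach_alpha(scores[["item_1", "item_2", "item_3", "item_4"]])
print(f"Cronbach's alpha: {alpha:.2f}")  # values above roughly 0.7 are commonly read as acceptable
```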
Engaged Communities: Empowerment through Evaluative Processes
Pact’s strategy is centered on its commitment to engaged communities, ensuring that what we do is locally led as much as possible, because we understand that impact cannot be sustained or scaled without communities at the helm. This commitment is linked to our learning and evidence practice through our work to ensure that those we serve can engage in decisions about, and effectively use, evidence. For example, we support those we serve to participate in evaluative processes so that they can contribute to, understand, use and lead evaluative efforts. This is central to our approach of evaluation as intervention: using applied research and learning, along with participatory and empowering processes undergirded by capacity development, to engage communities in decisions about and use of the data and evidence that drive adaptive management, monitoring, evaluation and learning systems. This also contributes to sustainability, because it gives communities ownership over design and evaluative processes.
Examples of our commitment to empowerment through evaluative processes:
- Pact’s MERL Modules, a suite of training tools covering introduction to MEL, data quality, evaluation and mobile data collection, are used to support local partners in developing their evaluative capacity so that they can build and manage the M&E systems of their own projects. External evaluations have lauded Pact for the quality of, and results achieved through, these partner trainings in MEL.
- Pact uses empowering approaches to engage those we serve in the design, implementation and use of MEL systems. In projects in Somalia and Cambodia, among others, we used Outcome Mapping to engage key change agents in project and evaluation design, supporting them to define the project’s vision and mission, determine key project milestones, select measurements for project processes and results, and measure progress toward goals themselves using adaptive systems.
- Pact is holding itself accountable to its engaged communities commitment. We are launching a new global indicator to measure the extent to which Pact’s projects implement engaged community processes throughout their lifecycles, as well as a strategy-level key performance indicator on the quality of Pact as a partner in these locally led processes.
Learning and Evidence Work In Action
- Blog: A new tool in the Outcome Harvesting toolbox: How Pact is collaborating to improve an essential evaluation method (Sep 26, 2024)
- Blog: PeaceCon 2024: Embracing complexity to advance social cohesion (Sep 09, 2024)
- Blog: Pact presents on locally embedded, sustainable solutions at the 11th AfrEA International Conference (May 16, 2024)
Learning and Evidence Resources
- Guidance: The participatory design and co-creation guidance for the CSM-STAND consortium
- Best practices in programming for orphans and vulnerable children and adolescent girls and young women: A compendium of interventions and lessons learned from the USAID Insika ya Kusasa project
- Holding ourselves accountable for locally led development
- Promise and accountability for locally led development: Lessons and recommendations from Pact’s global engaged communities assessment
- Brief: Talking about co-creation: Pre-award co-design with the CSM-STAND consortium
MEET OUR EXPERTS
- Alysson Akiko Oakley, Vice President, Learning, Evidence & Impact
- Daisy Kisyombe, Manager, Learning, Evidence, and Impact
- David Muturi, Global Administrator, Strategic Information Systems
- Lauren Serpe, Technical Director, Learning, Evidence & Impact
- Molly J Wright, Senior Advisor, Results & Measurement
- Tripti Pande, Learning, Evidence and Impact Advisor