Q&A: Why data-driven, evidence-based programming is critical in global development
What do communities truly want? How can we help them reach their goals? Did our interventions work? How can we improve them? In global development, all of these questions can be answered through the use of data and evidence. If we are committed to transforming lives in ways that are lasting and measurable, then we must also be committed to data and evidence. Here, Alysson Akiko Oakley, our vice president for learning, evidence and impact, discusses what this means at Pact and how it bolsters our focus on engaged communities.
Q: Pact’s programming is evidence-based. What does this mean?
A: Being evidence-based is equally an approach, a process, a method, a mindset and a commitment. Evidence is a collection of information, analyzed in a deliberate manner, that together suggests whether something is valid. It can help answer questions about what should work and what is working, for whom, why and how.
At Pact, being evidence-based means being intentional about using evidence – new, emergent or established – to make decisions about program design and implementation, and to complete the cycle by contributing to the broader evidence base so that others may equally benefit. Central to this process are critical reflection and learning. In Pact’s work, these are grounded in adaptive management and rigorous evaluation processes that provide relevant data for timely decision-making.
Q: Why are data and evidence so important in global development programming?
A: In short, evidence is critical to ensuring that programming is effective. It is how we answer key questions about the needs and wishes of communities, our strategies and results, and whether we are truly listening to and serving communities. To understand and validate the impact of our work and make informed decisions, data and evidence must be our foundation.
Q: How would you describe Pact’s approach to data and evidence?
A: Behind every data point is a person, a community, a life and a valuable story. At Pact, we conceptualize our evidence and learning practice as an ethical obligation to ensure our projects are designed, delivered, assessed and sustained in a manner that leverages the most relevant evidence to maximize the benefit for those we serve.
People are not numbers. We must ensure that the time we share and the relationship we have with the communities we serve are grounded in humility, respect and appreciation, which means delivering the highest quality programming. This is ingrained throughout Pact’s work, and equally so in our evidence and learning practice.
"To understand and validate the impact of our work and make informed decisions, data and evidence must be our foundation."
Q: What does this look like in practice?
A: In practice, this comes in two forms. The first is the use of rigorous designs and methods to ensure that the evidence we are using is accurate and relevant to the challenge at hand. For example, in Colombia we needed to measure progress in citizen security, a complex concept that had to be operationalized in a way that gives a clear sense of how a community is faring and supports strategic programming decisions. To do so, we developed a composite index, measured through community surveys and other methods, that provides a rigorous and granular assessment of citizen security from a holistic perspective. As another example, we are committed to testing our approaches: our Organizational Performance Index (OPI) has undergone reliability and content validity tests.
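To make the mechanics concrete, here is a minimal, purely illustrative sketch of the two ideas above: aggregating survey indicators into a weighted composite index, and checking internal-consistency reliability with Cronbach's alpha. The indicator names, weights and scores are hypothetical, not Pact's actual citizen security index or OPI methodology.

```python
# Illustrative sketch only: indicator names, weights and scores are
# invented for demonstration and do not reflect Pact's actual indices.

def composite_index(scores, weights):
    """Weighted average of indicator scores (each assumed on a 0-100 scale)."""
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in weights) / total_weight

def cronbach_alpha(items):
    """Cronbach's alpha: internal consistency of a set of survey items.

    `items` is a list of columns, one per item, each holding one score
    per respondent. Values near 1.0 indicate the items move together.
    """
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_variances = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_variances / variance(totals))

# Hypothetical community-level indicators for a "citizen security" index
weights = {"perceived_safety": 0.4, "reported_incidents": 0.3,
           "trust_in_institutions": 0.3}
scores = {"perceived_safety": 62.0, "reported_incidents": 48.0,
          "trust_in_institutions": 55.0}
print(round(composite_index(scores, weights), 1))  # prints 55.7
```

A real index would involve far more care in indicator selection, normalization and validation, but the sketch shows why reliability testing matters: if the items do not cohere, a single composite number can mislead.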
The second form is taking an “evaluation as intervention” approach, which connects to Pact’s guiding star of engaged communities. This approach empowers program participants and communities in evaluative activities so that they lead in decisions around metrics of success, how data are interpreted, what counts as appropriate knowledge and relevant evidence, and what should contribute to sectoral learning and good practice. This is an actively transformative development approach that shifts power to those most impacted and requires commitment to and investment in people’s lives and their lived experiences.
Q: What advice would you share with other practitioners?
A: The best methodologies are worthless if there is no appetite for critical reflection and learning. That appetite must be stimulated by good examples of how evidence can support decision-making, including effective communication of evidence through data visualization or storytelling. Too often, there is a mismatch between what we need to know, when we need to know it, how we can understand it and what we have available.
My advice is to start with small evidentiary wins—for example, leveraging existing data sources—with early adopters who serve as champions and show how useful evidence can be to effective programming. Then expand by building a learning system that incentivizes and rewards teams that are evidence-based and utilization-focused, while also providing supportive resources. These could include an evaluation or research team that can match types of methodological rigor with operational realities, leadership that is proactive in encouraging staff and helping to clear bottlenecks, and mindsets and processes that celebrate and promote learning. This does not happen naturally, especially in large organizations, and therefore must be an ongoing, proactive effort. All of this must be undertaken in a manner that never loses sight of the purpose of evidence, which at Pact is to ensure that what we are doing is valid and implemented in a manner that is transformative.