Evaluation at IDRC
Promoting useful, high-quality evaluations
At IDRC, we expect an evaluation’s purpose to determine how it is done. Our approach is grounded in utility: evaluations must have a clear use and respond to the needs of the user(s), whether management, a program, a donor, or a research team.
We equally value the use of rigorous methods. We do not promote or expect any particular evaluation design or focus. Our approach helps users select the most appropriate content, model, methods, theory, and applications for their evaluation needs. The quality of the evaluation is judged on its accuracy, ethics, feasibility, and use.
Our guiding principles
- The decision to evaluate should be strategic, not routine.
- Evaluative thinking adds value to a project or program from the outset.
- Evaluation should be an asset to those being evaluated.
- Evaluation should enlist the participation of relevant users.
- Evaluation processes should develop capacity in evaluative thinking and evaluation use.
- Evaluation should meet quality and ethical standards.
- Learning about the theory, practice, and findings of evaluation should be documented and shared.
For more information on IDRC's overall approach to evaluation, guiding principles, components, and roles within our decentralized system, read Evaluation at IDRC (PDF, 320KB).
We also report annually on evaluation activities and main evaluation findings through our Annual Corporate Evaluation Report.
IDRC's evaluation team
Leading evaluative thinking internally
To be effective at supporting development research, IDRC must ensure that staff are knowledgeable and continue to learn. As a result, staff are key partners in IDRC’s evaluation system. Our evaluation team works with them to encourage evaluative thinking and high-quality program-led evaluations.
Our team is the steward of strategic evaluations and external program reviews at IDRC. Strategic evaluations look at cross-cutting issues about key results and programming modalities. External reviews provide our Board of Governors with an assessment of each program's performance, its research findings, and its outcomes. Together, these corporate evaluations provide rich learning opportunities and evidence of the strengths and weaknesses of IDRC’s programming.
Building the field of evaluation
Our evaluation team works with the international evaluation and development research community by supporting research on evaluation approaches and methods and by building the field of evaluation in the global South. This approach contributes to strengthening the role and relevance of evaluation in development and the ability of developing-country evaluators to address knowledge gaps and development challenges in specific contexts.
Due to the complex nature of research for development, most conventional approaches to evaluation fail to meet the needs of this field. These methods tend to focus solely on what happened, rather than on how, where, and why change occurred. IDRC’s evaluation team partners with others to develop and share evaluation approaches and methodologies that are grounded in complexity thinking (that is, able to address complex realities) and that challenge persistent social inequalities.