Evaluation Methods and Resources

At IDRC, we expect an evaluation’s purpose to determine how it is done. Because our approach is use-focused, our evaluations can serve a range of purposes (formative, summative, developmental), use any type of data (quantitative, qualitative, mixed), follow any design (naturalistic, experimental), and take any focus (processes, outcomes, impacts, cost-benefit, etc.). The evaluation’s quality is judged on its accuracy, ethics, feasibility, and use.

Tested methods

Where suitable evaluation approaches did not already exist, we worked with leading experts to create and test new ones. These methods are rigorous, learning-oriented, and relevant to the complexities of development research. You might find them useful:

Highlights and Guidelines

Learn more about how to improve the quality and consistency of evaluations by reading our series of evaluation highlights and evaluation guidelines.

Publications for Evaluation


Results 1–10 of 45 publications

Writings from South Asia

This is a first-of-its-kind collection of writings by evaluation professionals working in South Asia. It analyses and documents the status of, and challenges for, development evaluation in this region. The collection covers three critical dimensions...

ODI launches RAPID Outcome Mapping Approach online guide

The UK-based Overseas Development Institute has released its online guide to understanding, engaging with, and influencing policy. ROMA: A guide to policy engagement and influence contains a suite of tools that can be used to unpack complex problems...

Outcome Mapping

The Evaluation Unit and a number of IDRC programs worked with Barry Kibel of the Pacific Institute for Research and Evaluation to adapt his Outcome Engineering approach to the development research context. In 2001, a methodology called "Outcome...

Organizational Assessment

The Evaluation Unit, in partnership with Universalia Management Group, has worked in organizational assessment for over 15 years. We have developed an organizational performance assessment framework, including a guide for self-assessment. ...

Gender Evaluation Methodology (GEM)

The Association for Progressive Communications Women's Networking Support Programme (APC WNSP) developed GEM and GEM-related resources to facilitate learning about using information and communication technologies (ICTs) for gender equality. GEM...

Accountability Principles for Research Organizations

Accountability Principles of Research Organisations provides a framework for establishing accountability good practices and principles for policy-oriented research organizations working in developing countries. The core principles of participation...

An Evaluation of the PRODAR Network

Since the early 1970s, IDRC has been supporting research projects in rural areas of developing countries. The objective of many of these projects has been to solve problems related to postharvest activities such as food handling, storage, processing...

A Guide to Facilitated Participatory Planning

This book draws on the work of thinkers and doers throughout the world who have grappled with the challenge of planning complex institutions, especially health systems and development projects. Their problem: Conventional planning methods often do...

In Conversation: Michael Quinn Patton

Many development programs are evaluated to determine how effective and useful they are. But how effective and useful are the evaluations themselves? Internationally renowned evaluator, Michael Quinn Patton, recently came to IDRC to discuss his...


Although we have witnessed a steady growth in the provision of information services in developing countries, a number of fundamental questions remain unanswered. The people of these countries question the relevance and appropriateness of the...
