It’s All About MeE: Using Structured Experiential Learning (“e”) to Crawl the Design Space

Lant Pritchett, Salimah Samji, Jeffrey Hammer
2013

Summary

This paper argues that within-project variations in design can serve as their own counterfactual, reducing the incremental cost of evaluation and increasing the direct usefulness of evaluation to implementing agencies. It suggests combining monitoring (‘M’), structured experiential learning (‘e’), and evaluation (‘E’) so as to facilitate innovation and organisational capability building while also providing accountability and an evidence base for funding agencies.

Monitoring and Evaluation (M&E) has always been an element of the accountability of implementing organisations to their funders. There has been a recent trend towards much greater rigour in evaluation, with efforts to isolate the causal impacts of projects and programmes and more ‘evidence-based’ approaches to accountability and budget allocation. This paper extends the basic idea of rigorous impact evaluation (the use of a valid counterfactual to make judgments about causality) to emphasise that the techniques of impact evaluation can be directly useful to implementing organisations, rather than being seen by them only as an external threat to their funding.

The paper suggests adding structured experiential learning (‘e’) to M&E. This allows implementing agencies to search across alternative project designs, using monitoring data that provides real-time performance information and feeds directly into the decision loops of project design and implementation.

Structured experiential learning builds learning objectives into the cycle of project design, implementation, completion, and evaluation. It helps implementers first articulate the ‘design space’ of available project/programme/policy alternatives and then dynamically ‘crawl the design space’, trying out several design alternatives at once and adapting the project sequentially as results come in. An integrated MeE approach eases the tension between implementing agencies and funders by balancing the space implementers need to innovate through experiential learning against funders’ need for rigorous evidence of effectiveness from impact evaluations.
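
One way to picture ‘crawling the design space’ is as a sequential search: several design variants run in parallel, and at pre-specified review points the monitoring data are used to drop weaker variants and carry on with the stronger ones. The following is a minimal toy sketch of that logic, not anything taken from the paper; the variant names, outcome values, and elimination rule are all invented for illustration.

```python
import random

# Hypothetical design variants within a single project (names are illustrative).
variants = ["cash_transfer", "in_kind_transfer", "voucher"]

# Toy stand-in for monitoring data; in a real project this would be real-time
# performance indicators collected during implementation.
def monitor_outcome(variant: str) -> float:
    assumed_means = {"cash_transfer": 0.55, "in_kind_transfer": 0.40, "voucher": 0.50}
    return assumed_means[variant] + random.gauss(0, 0.05)

reviews = 3              # pre-specified review points
units_per_variant = 30   # e.g. villages or facilities assigned to each variant

for review in range(1, reviews + 1):
    # Run each surviving variant in parallel and average its monitored outcomes.
    scores = {
        v: sum(monitor_outcome(v) for _ in range(units_per_variant)) / units_per_variant
        for v in variants
    }
    # Illustrative elimination rule: drop variants more than 0.05 below the best.
    best = max(scores.values())
    variants = [v for v in variants if scores[v] >= best - 0.05]
    print(f"Review {review}: best={best:.2f}, surviving variants={variants}")
```

Because the variants are fielded within the same project, they serve as each other’s comparison, which is how within-project variation can stand in for a counterfactual, while the pre-specified review points keep the learning structured rather than ad hoc.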

The seven steps of MeE

  • Reverse engineer from goals—framed as solving specific problems—back to project instruments
  • Design a project
  • Admit we do not know exactly which project design will work and design a crawl of the design space to be authorised as a project
  • Identify the key dimensions of the design space
  • Select alternative project designs
  • Strategically crawl your design space: pre-specify how implementation and learning will be synchronised (a hypothetical sketch follows this list)
  • Implement the approved sequential crawl and learn
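
To make the fourth through sixth steps (identifying dimensions, selecting alternative designs, and pre-specifying the crawl) more concrete, the design space can be written down explicitly as a set of key dimensions and the options along each, from which a handful of candidate designs and a pre-specified review schedule are drawn. The sketch below is a hypothetical illustration under assumed dimensions, options, and review dates; none of it comes from the paper.

```python
from itertools import product

# Illustrative key dimensions of a design space and the options along each (step 4).
design_space = {
    "delivery_channel": ["government_staff", "ngo_staff"],
    "payment_size":     ["small", "large"],
    "targeting_rule":   ["proxy_means_test", "community_selection"],
}

# Enumerate all combinations, then select a feasible subset to field (step 5).
all_designs = [dict(zip(design_space, combo)) for combo in product(*design_space.values())]
selected = all_designs[:4]  # in practice, selection would reflect cost and feasibility

# Pre-specify when monitoring data will be reviewed and what decision each review
# triggers, so that learning is synchronised with implementation (step 6).
crawl_schedule = [
    {"month": 6,  "decision": "drop designs below a pre-agreed performance threshold"},
    {"month": 12, "decision": "expand the best-performing surviving design"},
]

for i, design in enumerate(selected, start=1):
    print(f"Design {i}: {design}")
print("Review points:", crawl_schedule)
```

Writing the space down in this way also supports the earlier steps: it forces the admission that the best design is not known in advance, and it gives funders a concrete, pre-specified plan for the crawl that can be authorised as the project.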

Source

Pritchett, L., Samji, S., and Hammer, J. (2013). It's all about MeE: Using structured experiential learning (“e”) to crawl the design space. CGD Working Paper 322. Washington, DC: Center for Global Development.