Evaluating digital citizen engagement: A practical guide

Digital Engagement Evaluation Team
2016

Summary

This guide provides practical steps for assessing the extent to which digital tools have contributed to citizen engagement, and for understanding the impact that the introduction of technology has had on engagement processes. It draws on examples and lessons from case studies in Brazil, Uganda, Cameroon and Kenya. The guide can be used at many stages: to inform project design, as a tool for continual learning and improvement, and for undertaking mid-term or post-hoc evaluations. It was written primarily for practitioners – including task team leaders at the World Bank Group, project or program delivery staff at civil society organizations, and internal or external evaluators or consultants working throughout the project cycle – but is also a helpful resource for anyone seeking to better understand the role of digital technology in citizen engagement.

The guide presents five ‘lenses’ through which digital citizen engagement (DCE) interventions might be viewed while undertaking an evaluation:

  • Objective: What are the goals of the initiative, and how well is the project designed to achieve those goals?
  • Control: Which actors exert the most influence over the initiative’s design and implementation, and what are the implications of this?
  • Participation: Which individuals participate in the initiative, and to what extent is their participation in line with their needs and expectations?
  • Technology: How appropriate was the choice of the technology, and how well was the technology implemented?
  • Effects: What effects did the project have, and to what extent can these effects be attributed to the technology?

The guide is structured around the five stages of an evaluation lifecycle:

  • Scoping: This stage lays the groundwork for the design and implementation of the evaluation by investing time and resources in understanding the project and its context, the operating environment, and recent developments and insights from the DCE evaluation field.
  • Designing: This stage involves agreeing on the focus, goals and objectives, designing the evaluation questions, and deciding on an appropriate approach and method to achieve those goals.
  • Planning & Implementing: This stage moves the design to a more detailed level: deciding which tools to use within the broad method, whether or not to use digital tools to collect new data, and how data collection can be implemented.
  • Analysing: This stage discusses how the DCE data can be analysed and provides pointers for quantitative, qualitative and mixed methods of analysis.
  • Sharing, Reflecting & Learning: This final stage focuses on testing the findings, writing up the results and analysis of a DCE evaluation, considering methods of sharing findings (including opening up evaluations and their data), and reflecting on and learning from the lessons of evaluations.

The guide offers two toolkits:

  • DCE evaluation bank: examples of primary (assessment/analysis) and supplementary (information gathering) evaluation questions – grouped by lens – and some ‘satisfaction’ questions.
  • Using the lenses in scoping and design: a set of considerations and questions that an evaluator might ask during the scoping and design stages, again grouped by lens.

Source

Digital Engagement Evaluation Team. (2016). Evaluating digital citizen engagement: A practical guide. Washington, DC: World Bank.