evaluation design

Evaluation Questions

The foundation of good research, including evaluation, is a well-written question that quantitative and/or qualitative research methods then attempt to answer. In the case of evaluation, questions take on value-oriented frames such as appropriateness, effectiveness, efficiency, impact, or sustainability.

The three resources below and those added to the Twitter discussion capture different qualities related to writing good evaluation questions.

  1. A Practical Guide for Engaging Stakeholders in Developing Evaluation Questions (FSG) This guide offers practical considerations for engaging stakeholders at both the beginning and the conclusion of an evaluation.
  2. Evaluation Questions Library Guide (CDC)  This library guide aggregates resources that provide (1) an introduction to evaluation questions, (2) developing and using evaluation questions, and (3) linking evaluation questions to approaches, designs, or criteria.
  3. Evaluation Questions Checklist for Program Evaluation from The Evaluation Center at Western Michigan University   This checklist describes ways in which evaluation questions should be (1) evaluative, (2) pertinent, (3) reasonable, (4) specific, (5) answerable, and (6) complete.
data analysis quantitative data

Excel Pivot Tables and Charts

In an effort to explore and better understand data sets from several evaluation projects, I was finally moved to learn the PivotTable and PivotChart functionality inside Microsoft Excel.

After reviewing the comments on a number of YouTube videos, I selected and watched the following four. In a combined 60 minutes, they provide a solid overview: a basic understanding of structuring and using pivot tables, creating pivot charts, and assembling multiple pivot charts into a dynamic dashboard that can surface insightful observations from your Excel data. I hope you find them useful, and I look forward to your feedback.

After viewing these and working with your own project data, use the “Discuss on Twitter” button below to share any other resources you’ve found helpful in your path to learning and using pivot tables, pivot charts, and data slicers.

In this first video, you’ll get a great overview of pivot tables by Kevin Stratvert.
In this second video, you’ll engage in the first of a series of 3 videos by Jon Acampora.
In this third video, you’ll engage in the second of a series of 3 videos by Jon Acampora.
In this fourth video, you’ll engage in the third of a series of 3 videos by Jon Acampora.
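For those who work outside Excel, the same pivot-table idea can be sketched in Python with pandas. This is a minimal illustration, not part of the videos above; the data set and column names (`site`, `quarter`, `participants`) are hypothetical stand-ins for your own project data.

```python
import pandas as pd

# Hypothetical evaluation-project data; columns are illustrative only.
df = pd.DataFrame({
    "site":         ["North", "North", "South", "South", "South", "East"],
    "quarter":      ["Q1", "Q2", "Q1", "Q2", "Q2", "Q1"],
    "participants": [24, 31, 18, 22, 27, 15],
})

# Rough equivalent of an Excel PivotTable:
# rows = site, columns = quarter, values = sum of participants.
pivot = pd.pivot_table(
    df,
    index="site",
    columns="quarter",
    values="participants",
    aggfunc="sum",
    fill_value=0,
)
print(pivot)
```

As in Excel, changing `index`, `columns`, or `aggfunc` reshapes the summary without touching the underlying rows, which is what makes pivoting useful for exploration.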
data analysis dissemination

Data vs. Information

In a recent technical assistance online meeting, I was talking with several community groups about data collection and analysis activities related to their program implementation and intended outcomes. I raised the importance of sifting through all the various data in order to extract actionable information useful for program staff, community stakeholders, and the intended beneficiaries of the program. I was reminded of a past blog post from Seth Godin that I shared with them. Quite succinctly, he reminds us that evaluation findings should be disseminated in a form that is both understandable and story-like.

When there’s simply data, it’s all noise. It’s impossible for a human being to absorb data without a narrative.

Once we figure out how to turn your features and ideas and benefits and effort into a story, though, it becomes information. And then we can act on it.

We have a story problem. All of us do. We’re not doing a good job of developing the empathy to turn all the data we’ve assembled into a story that others can understand.
evaluation design programs

Programming and Domains of Evaluation Questions

In Developing Monitoring and Evaluation Frameworks, Markiewicz and Patrick (2016) laid out the relationship of program development and implementation to five domains of evaluation questions. In doing so, they’ve contributed to the discussion in the previous post about what evaluation is for. At the very least, drilling down into the domains of evaluation questions alters the types of program decisions to be considered and answered.

The following summary, adapted from their work, aligns the five domains of evaluation questions to five program components, with a description of each domain.

  1. Planning and design: Assessing the appropriateness of the program’s design
     - Suitability of the program design in context
     - Fit of the program with program theory and/or logic
     - Testing of underlying assumptions
     - Extent to which the program meets the priorities and needs of key stakeholders
  2. Objectives: Assessing program effectiveness in meeting its objectives, its value, and quality
     - Fidelity of implementation
     - Achievement of program objectives
     - Assessment of the quality and value of the program
  3. Implementation: Examining efficiency and fidelity in program implementation
     - Conversion of inputs to outputs and outputs to results
     - Governance and management
  4. Results: Establishing impact, intended and unintended, and the degree to which change is attributable to the program
     - Changes (results) produced by the program, intended and unintended, direct and indirect
  5. Sustainability of results: Identifying ongoing sustainable benefits from the program
     - Continuation of program benefits