Practical Evaluation Using the CDC Evaluation Framework—A Webinar Series for Asthma and Other Public Health Programs

The National Asthma Control Program, in partnership with the Environmental Protection Agency, has created a four-part Webinar series on program evaluation basics. Nationally recognized experts present a general introduction to program evaluation; note challenges in conducting useful evaluations as well as methods for overcoming those challenges; and introduce the six steps of the CDC Framework for Program Evaluation using examples that are relevant to state partners of the National Asthma Control Program.

Presented by Tom Chapel, MA, MBA, CDC Chief Evaluation Officer (Acting)

Webinar 1: Top Roadblocks on the Path to Good Evaluation – And How to Avoid Them
Tom Chapel, a nationally recognized evaluation expert, introduces CDC’s approach to program evaluation. After making the case for a utilization-focused evaluation framework, Tom presents some typical challenges programs encounter when trying to do good program evaluation… More »

Tutorial 1A – Focus On: Walking Through the Steps and Standards
Program improvement is at the heart of CDC’s Framework for Program Evaluation. In this tutorial, Tom Chapel describes each of the six steps of the Framework and the four evaluation standards. More »

Presented by Leslie Fierro, MPH, independent evaluation consultant to the NACP
Carlyn Orians, MA, Battelle Centers for Public Health Research and Evaluation

Webinar 2: Getting Started and Engaging Your Stakeholders
Leslie Fierro and Carlyn Orians describe the initial steps of designing and implementing a program evaluation plan. They discuss… More »

Presented by Tom Chapel, MA, MBA, CDC Chief Evaluation Officer (Acting)

Webinar 3: Describing Your Program and Choosing an Evaluation Focus
Tom Chapel describes the importance of a clear program description in program evaluation and explores the concept and uses of logic models in “describing the program” (Step 2 in the CDC Framework). He then moves to focusing the evaluation design (Step 3),… More »

Tutorial 3A – Focus On: Thinking About Design
In this tutorial, Tom Chapel describes the various evaluation design options, and the strengths and weaknesses of experimental and non-experimental design models. More »

Presented by Dr. Christina Christie, Claremont Graduate University

Webinar 4: Gathering Data, Developing Conclusions, and Putting Your Findings to Use 
Christina Christie covers Steps 4, 5, and 6 in the CDC Framework (gathering evidence, justifying conclusions, and ensuring use). She describes the processes of gathering and using data for program benchmarking, improvement, and accountability. More »

Tutorial 4A – Focus On: Data Collection Choices
In this tutorial, Tom Chapel discusses how to convert evaluation questions into measurable indicators and how those indicators help inform your data collection choices. More »

Tutorial 4B – Focus On: Using Mixed Methods
In some instances, using a single method of inquiry to answer your evaluation questions may result in incomplete or incorrect findings. The “mixed methods” approach, which combines at least one qualitative and one quantitative data collection method, addresses this concern. In this tutorial, Tom Chapel provides the rationale for such an approach and describes some of the choices and challenges evaluators face when using this now well-accepted evaluation methodology. The tutorial includes several simple examples as well as a discussion of the importance of looking to the evaluation standards for guidance when choosing among data collection options. More »

Presented by Tom Chapel, MA, MBA, CDC Chief Evaluation Officer

Webinar 5: Evaluation Purpose Informs Evaluation Design
Tom Chapel demonstrates how defining an evaluation’s purpose, user, and use helps frame and guide an evaluator’s choices throughout the evaluation. Using the CDC Framework for Evaluation, he shows how an evaluation designed to support program replication varies from one intended to guide program improvement, which, in turn, varies considerably from an evaluation designed to support decisions about a program’s funding. The webinar compares and contrasts three scenarios, showing a wide range of the choices faced in any evaluation. More »