
A Guide to Developing a TB Program Evaluation Plan

This is an archived document. The links and content are no longer being updated.

Text for PowerPoint Slides

Slide 0: Developing An Evaluation Plan For TB Control Programs

by Division of Tuberculosis Elimination
National Center for HIV, STD, and TB Prevention
Centers for Disease Control and Prevention

Return to Slide Text Table of Contents

Slide 1 Objective: Introduce presentation

Reference: A Guide to Developing an Evaluation Plan

Speaker:

Program evaluation is an essential component of all TB control programs. Evaluation enables us to improve and enhance our programs and better meet our goals for TB elimination. It provides evidence to make good decisions about a program or an initiative and also helps us be accountable to funders – including the CDC, other agencies and organizations.

This presentation describes the steps in developing an evaluation plan. It is based on the Guide to Developing an Evaluation Plan for TB control programs; used in conjunction with the guide, it will help you write a comprehensive and thoughtful evaluation plan. The sections of this presentation correspond to the sections in the evaluation plan guide and to the template for an evaluation plan that is part of the guide.

I hope that you will find these resources helpful in evaluation planning. If you need or want more information, there are many resources for evaluation available. At the end of the presentation, I will provide a list of a few important resources.


Slide 2 Objective: Explain the importance of writing an evaluation plan

Why Develop an Evaluation Plan?

  • Provides a cohesive approach to conducting evaluation and using the results
  • Guides evaluation activities
  • Explains what, when, how, why, who
  • Documents the evaluation process for all stakeholders
  • Ensures implementation fidelity

Speaker:

Why develop an evaluation plan? A plan serves as a guide to conducting a program evaluation. It breaks evaluation planning into manageable steps, with each new step building on the preceding ones. As you work through the plan, you will notice that the earlier steps are the most time consuming, but later steps rely on them and flow more easily.

Developing an evaluation plan may seem daunting, but it typically does not take a lot of time, and it serves a number of important purposes. The plan:

  • Lays out a cohesive approach to conducting evaluation and using the results. This ensures the successful execution of the evaluation: resources are spent wisely and findings are used for their intended purpose.
  • Guides the evaluation activities by explaining what they are, when they should occur, how the activities will be accomplished and who is responsible for completing them
  • Documents the evaluation process for all interested parties (referred to as stakeholders) so that the evaluation is transparent, and
  • Ensures implementation fidelity of the evaluation because the process is clearly laid out and the evaluation activities are easy to follow and implement.

By creating a detailed plan for the evaluation, you can ensure that valuable resources are not wasted on conducting an evaluation that does not produce findings that can help improve TB prevention and control activities.


Slide 3 Objective: Introduce Guide

Guide to Developing An Evaluation Plan

  • Document referenced throughout presentation
  • Provides a template and instructions to help TB program staff develop an evaluation plan
  • Steps to evaluation are explained in detail
  • Completing sections and tables will result in an evaluation plan

Speaker:

This presentation explains how to develop an evaluation plan. Its components are based on “The Guide to Developing an Evaluation Plan for TB Programs,” which is referenced throughout the presentation. The guide includes an explanation of each section of the evaluation plan, plus an evaluation plan template and instructions on how to complete it. Completing each section of the template and filling in the tables results in a final evaluation plan.

The Guide was developed as a tool for TB control programs to help them develop their plans. The content draws from many evaluation resources and is based on the CDC Program Evaluation Framework that has been well-researched and tested in the field.

The guide and template were developed based on input from many evaluators, the TB Evaluation Working Group and TB program staff. In addition, we have piloted it with a small group of states by asking them to develop their evaluation plan based on it.

In addition to these instructions, other documents will be available to help you including a sample evaluation plan.


Slide 4 Objective: Introduce CDC evaluation framework as the basis for guideline

The CDC Program Evaluation Framework

Steps: Engage stakeholders, Describe the program, Focus the evaluation design, Gather credible evidence, Justify conclusions, and Ensure use and share lessons learned.

Standards: Utility, Feasibility, Propriety, and Accuracy.

Speaker:

The approach to evaluation used in this presentation and the guide is based on the CDC Framework for Program Evaluation in Public Health (MMWR, 1999). The framework illustrated in the slide includes six steps that will enable you to systematically conduct an evaluation of your program that provides useful information. In the center, the standards for good evaluation are listed – utility, feasibility, propriety and accuracy.

Note that the framework is circular, illustrating that the use of evaluation findings is as important as conducting the evaluation. The circle also illustrates the iterative and evolving nature of the evaluation.

We are suggesting this framework and providing guidance on it. However, programs are not required to use this framework as long as you provide the same information in whatever format you choose.


Slide 5 Objective: Explain the advantages of using the CDC Evaluation Framework

The CDC Program Evaluation Framework

  • Systematic method for evaluation
    • Based on research and experience
    • Flexible and adaptable
  • Promotes a participatory approach
  • Focuses on using evaluation findings

Speaker:

To assist programs in obtaining credible information on program effectiveness, CDC developed this framework that describes a way to collect and analyze evaluation data and evaluate programs. The framework has a number of advantages.

It offers a systematic method to ensure that your evaluation will yield results that stakeholders will use to improve the program. Even small TB programs are complex and have many dimensions. The framework provides a disciplined way to ensure that you consider the appropriate factors when identifying your problems – or your successes. It also ensures that your evaluation will provide sufficient information to use the findings to improve your program.   

The Framework is based on sound research and experience in the field. Each of the steps in the framework can be adapted and tailored to your program. The CDC framework is able to accommodate differences in size and purpose of programs and evaluations. Your evaluation may be quite elaborate or it may be simple and focused on one evaluation question.

It may seem like many steps are needed to complete an evaluation plan, but the process needn’t be time consuming. For small-scale evaluations, you can complete any given step in an hour or so. You will also find that your evaluation plan can be adjusted and tailored over time using the framework.

The framework supports a participatory process that engages stakeholders in all stages of the evaluation. Finally, by using the framework you can ensure that the findings are used for the purpose that they were intended.


Slide 6 Objective: Introduce different sections of an evaluation plan

Sections of an Evaluation Plan

  • Introduction
  • Stakeholder Assessment
    • Step 1: Engage Stakeholders
  • Background and Description of the TB Program and Program Logic Model
    • Step 2: Describe the Program
  • Focus of the Evaluation
    • Step 3: Focus the Evaluation Design

Speaker:

This slide and the next show the major sections that should be part of your evaluation plan. These sections map onto the steps of the framework we just discussed. I’m going to briefly introduce the sections now and then explain each of them in greater detail over the remainder of the presentation.

Introduction: Although not part of the framework, an evaluation plan needs an introduction to present an overview of the evaluation.

Stakeholder Assessment is the first step in the framework, where you discuss how you will engage stakeholders. Stakeholders are those people who have a vested interest in the success of your TB program.

Background and Description of the TB Program and the Program Logic Model is the second step in the framework - ‘Describe the Program.’ Here you provide background information about your program, describing what you do in your TB program and how you are working to achieve your goals. The logic model is a graphic representation of your program.

Focus of the Evaluation is where you tailor your evaluation to your stakeholders’ needs and identify specific questions to answer with your evaluation.


Slide 7 Objective: Introduce different sections of an evaluation plan (continued)

Sections of an Evaluation Plan

  • Gathering Credible Evidence: Data Collection
    • Step 4: Gather Credible Evidence
  • Justifying Conclusions: Analysis and Interpretation
    • Step 5: Justify Conclusions
  • Ensuring Use and Sharing Lessons Learned: Reporting and Dissemination
    • Step 6: Ensure Use and Share Lessons Learned

Speaker:

Gathering Credible Evidence: Data Collection corresponds to step 4 of the framework. This is where you describe how you will collect the data for your evaluation.

Justifying Conclusions: Analysis and Interpretation is step 5 of the framework called Justify Conclusions. Here you explain how you will analyze the evaluation data and determine what it means to your program.

Ensuring Use and Sharing Lessons Learned: corresponds to step 6 of the framework. In this section, you describe how you will share your evaluation findings with your stakeholders, and encourage them to use the information to improve the program.

The remaining slides address each section of the evaluation plan in more depth.


Slide 8 Objective: Explain the Introduction section

Introduction

An introduction provides background information, identifies the purpose of the evaluation, and provides a roadmap of the plan.

  • Evaluation Goal
    • What is the purpose of the evaluation?
  • Evaluation Team
    • Who is your evaluation coordinator?
    • Who are the members of your evaluation team?

Reference: Table 1 in the Evaluation Plan Guide

Speaker:

An evaluation plan begins with an introduction that gives background information important to the evaluation, identifies the purpose and the goal of the evaluation, and provides a roadmap of the document to the reader.

The first part of the introduction explains the Evaluation Goal that is an overarching statement that explains why the evaluation is taking place.

Examples of Evaluation goals include:

  1. Determining the effectiveness of the program, or a component of the program.
  2. Investigating portions of the program that are performing optimally so that they can be replicated, or suboptimally so that they can be addressed, and
  3. Redistributing resources in an equitable manner

The plan introduction also identifies the individuals (or types of individuals) who will be involved in the evaluation and their roles and responsibilities. These include:

An Evaluation Coordinator who is the evaluation leader or the person who is responsible for oversight of the evaluation.

The Evaluation Team comprises the individuals who are part of the evaluation, such as program staff, partners, or advisors. The team may include people responsible for some aspect of data collection or analysis, or for dissemination and use of the findings.

In the guide, Table 1 can be completed and used in your evaluation plan to specify your evaluation team and their roles and responsibilities.


Slide 9 Objective: Explain the Stakeholder assessment section

Stakeholder Assessment

Stakeholders are individuals with vested interests in the success of the TB program. Involving stakeholders increases the credibility of the evaluation and ensures that findings are used as intended.

  • Who are the stakeholders in your TB program?
  • What are their interests in the evaluation?
  • What role do they play in the evaluation?
  • How do you plan to engage the stakeholders?

Reference: Table 2 in the Evaluation Plan Guide

Speaker:

Stakeholder assessment is the first step in evaluation. Stakeholders can be divided into 3 major categories:

  1. those involved in program operations such as managers and staff,
  2. those who are served or affected by the program such as patients and the community, and
  3. the primary users of the evaluation findings such as directors, funders, or administrators who can help change and improve the program.

Your program may have many stakeholders, but not all are equally important in the evaluation. The level of stakeholder involvement will vary among program evaluations, but priority stakeholders include those who can increase the credibility of the evaluation, who are involved in implementing the program, who will advocate for changes to the program, or who will fund or authorize improvements to it.

Involving stakeholders is not difficult or time consuming, and it brings multiple perspectives to your program and the evaluation. It ensures that everyone’s interests and values are reflected in the final results.

Worksheet 1 in the guide helps you identify your stakeholders and consider their interests and roles in the evaluation as well as how to engage them. Table 2 in the guide helps you summarize the stakeholder information.


Slide 10 Objective: Explain the purpose and components of the background, description and logic model section

Background and Description of the TB Program

The program description ensures that stakeholders have a shared understanding of the program and identifies any unfounded assumptions and gaps.

  • Need
    • What problem does your program address?
    • What are the causes and consequences of the problem?
    • What is the magnitude of the problem?
    • What changes or trends impact the problem?

Speaker:

Every TB program is different, and even programs that have a lot in common must tailor their activities to meet local needs. In this section of the plan, you will describe your program and what makes it unique among TB programs. There are three primary reasons for the description. First, by systematically describing your program, you ensure that all evaluation stakeholders have a shared understanding of it. Second, a program description will clarify any assumptions that people may have about the program and its operations. Finally, the program description naturally leads into the next section of the evaluation plan – the program logic model – because the logic model is a graphic representation of your program description.

The program description includes several factors that you need to assess and include in your plan. The first is the need for the program.

In the plan, state the special needs of your community that your TB program responds to. The need is often based on local epidemiology and trends in TB. To help you identify the need your program meets, consider the four questions asked here.


Slide 11 Objective: Explain the purpose and components of the background, description and logic model section (continued)

Background and Description

  • Context
    • What are environmental factors that affect your program?
  • Target Population
    • Does your program target the TB concerns of one population?
  • Program Objectives
    • What objectives have been set for your program?
  • Stage of Development
    • Is this a new initiative or is it well established?

Speaker:

Context considers the environment in which your program operates. It includes: how the program is administered, how it fits in with other health and social services in the community, and the structural, political or policy environment that surrounds your program.

The program description continues with identifying a target population – this is the group that your program targets in addressing TB concerns. For example, they may be newly arrived immigrants, people who are HIV-positive, people who abuse substances, children and people living in at-risk housing.

You’ll also need to include your program’s objectives that relate back to your program’s overarching goal. Frequently, your program’s objectives are national or state TB objectives, but your program may have its own.

Example objectives: 85% of patients identified with TB will be placed on DOT within 3 months of diagnosis. Another objective is to increase the utilization of services by 60% by fall 2006.

There are a number of existing objectives that are available for you to use but you can also write your own.

Assessing the stage of development of the program – or of the program component or initiative you’re evaluating – will help you frame your evaluation and write your evaluation questions. Note that components of a program can be in different stages of development, so knowing the stage of each component or initiative will help you decide on your evaluation questions.


Slide 12 Objective: Explain the purpose and components of the background, description and logic model section (continued)

Background and Description

  • Resources
    • What resources are available to conduct the program activities?
  • Activities
    • What are program staff doing to accomplish program objectives?
  • Outputs
    • What are the direct and immediate results of program activities (materials produced, services delivered, etc.)?
  • Outcomes
    • What are the intended effects of the program activities?

Reference: Table 3 in the Evaluation Plan Guide

Speaker:

Describe the resources available to implement your program. Resources can include TB program staff, funding, physical components of the program such as space or computer resources, and resources that are part of the larger agency that your program belongs to such as the state health department. Partners and community organizations are also resources.

Activities are what your program staff are doing to accomplish the program objectives. This might include hiring and training staff, developing policy, providing TB testing, and providing education to patients or the community. Activities may be sequential, with initial activities that must be completed before subsequent ones can begin. For example, providing education to patients depends on the development of education materials.

Outputs are the direct and immediate results of program activities. They assess whether an activity has occurred but are not indications of your program’s effectiveness. They can include strategic plans, treatment protocols, the number of tests conducted, and the number of providers who attend a TB education program.

Outcomes are the intended effects of your program’s activities and outputs. Outcomes are the changes you want to see in patients, providers, or the community. Examples would be reducing stigma about TB in the community or eliminating TB in a target population.

For example, if one of your planned program activities is to hire and train lay health advisors (LHAs), then an output would be the number of LHAs hired and trained. The outcome of this activity might be that culturally competent services are now being provided for a Spanish speaking community.

Table 3 can be completed to organize your program description and help you develop your logic model.


Slide 13 Objective: Describe logic modeling

Program Logic Model

A logic model is a graphic depiction of the program description.

  • Arrows describe the links between resources, activities, outputs and outcomes
  • A logic model
    • Provides a sense of scope of your program
    • Ensures that systematic decisions are made about what is to be measured
    • Helps to identify and organize indicators

Speaker:

Part of the program description is a logic model. You can develop a logic model for your program or one of its components, using the resources, activities, outputs and outcomes identified in the program description.

A logic model serves a number of purposes. It describes the scope of your program and identifies its components. It provides a ‘map’ that ensures systematic decisions about your evaluation, and helps to identify and organize indicators that will guide measurement in your evaluation.

There are no “right” or “wrong” logic models, but the model must show the complete paths linking resources and activities to outcomes. You can also have several logic models for your program – you may have a “big picture” logic model and others that are focused on one component or one initiative. But they all need to show the “causal chain” of how your program is reaching the overarching program goal.


Slide 14 Objective: Describe logic modeling (continued)

Program Logic Model

Resources, Activities, Outputs, and Outcomes

Speaker:

This is a basic picture showing how the elements of the program description fit together to create the logic model. As you can see, it shows a “first this” “then this” sequence. Your logic model needs to show this type of causal sequence to illustrate how your program activities affect the outcomes through the outputs.

CDC and the TB Evaluation Working Group have developed logic models for high priority TB program activities and these are available to help you. It may be useful to review these models prior to developing your own to see if one of them will describe your program with minor modifications. The six TB logic models are an attachment to the guide.

Slide 15 Objective: Provide an example of TB logic model

Speaker:

This is an example of one of the TB logic models developed by CDC. This one is for Contact Investigations.

The additional logic models that have been developed include:

  • Meta-model for TB elimination (encompassing many activities and outcomes that could be logic models on their own).
  • Capacity and infrastructure to eliminate TB
  • Evaluation capacity building
  • Completion of therapy
  • Preventing TB in high risk populations

You may find that one of these will work well for your program with minor adjustments.

This model shows how the pieces of this program initiative fit together, as well as the causal sequence of events. Not all logic models have to be this complex – you can use simple columns or any other picture that accurately illustrates your program.


Slide 16 Objective: Illustrate a table logic model

Speaker:

This slide illustrates a table logic model that you can also use. This one shows the resources, activities, outputs and outcomes in an initiative to eliminate TB in the Salvadoran community.

The table logic model can be very simple to put together by adapting Table 3 of the guide into a logic model. However, table logic models are also challenging because it’s difficult to show the sequence of events. For example, we’ve inserted an arrow in the bottom rows because there are multiple paths from TB screening and testing that are hard to show in this type of logic model.

Once you complete the logic model for your program or a component of your program, include it in the evaluation plan.


Slide 17 Objective: Explain the focusing the evaluation section

Focus of the Evaluation

Since you cannot feasibly evaluate everything, you must focus the evaluation by prioritizing and selecting evaluation questions.

  • Stakeholder Needs
    • Who will use the evaluation findings?
    • How will the findings be used?
    • What do stakeholders need to learn/know from the evaluation?

Speaker:

Focusing the evaluation is the next section of your plan. Typically, people want to know many, many things about their program. But it is not feasible or useful to evaluate everything. Thus, focusing your evaluation and selecting your evaluation questions are important steps. The evaluation questions that you identify after focusing the evaluation will guide the next steps of design and data collection.

To focus your evaluation and select your primary evaluation questions, first consider your stakeholders’ needs. Think back to the first step in the plan, when you identified key stakeholders and what they wanted to know from the evaluation. Then use your program logic model to identify where in the program that information is located, and develop evaluation questions based on it.

Evaluation questions also need to provide information you can use to meet the goal of your evaluation as well as your stakeholders’ needs.


Slide 18 Objective: Explain process and outcome evaluation

Focus of the Evaluation

  • Process Evaluation
    • What resources were required?
    • What program activities were accomplished?
    • Were they implemented as planned?
  • Outcome Evaluation
    • Is the program producing the intended outcomes?
    • Is there progress toward program objectives and goals?

Speaker:

Part of focusing your evaluation is considering whether you will conduct process and outcome evaluation. Process evaluation is useful in nearly all cases but outcome evaluations are not always appropriate. As mentioned earlier, an initiative that is early in implementation may not have the data needed for outcome evaluation.

Process evaluation answers questions that relate to resources, activities and outputs and how these work together to support the desired outcomes. Resources, activities and outputs are the first 3 major components of the logic model.

Outcome evaluation, as it sounds, looks at changes in the desired outcomes. Are the program activities and outputs resulting in the intended outcomes? And what changes are occurring in the outcomes?


Slide 19 Objective: Explain the focus of the evaluation section (continued)

Focus of the Evaluation

  • Evaluation Questions
    • Based on the needs of your stakeholders
    • Address process and outcome
  • Assess Your Questions
    • Feasible to collect
    • Provide accurate results

Speaker:

At this point in planning, you should be able to write 3-5 high priority evaluation questions. Generally questions should reflect both process and outcome to provide you with sufficient information to actively improve or enhance your program.

Example process questions include: Have community partners been properly engaged to prevent TB in high-risk populations? Have community providers been appropriately trained regarding TB? Did our program identify contacts of TB cases?

Example outcome questions include: What percentage of contacts complete treatment for LTBI? Has the incidence of TB decreased in the target population?

One further consideration in selecting evaluation questions is whether the data needed to answer them are feasible to collect and whether the results will be accurate and reliable. Resources are also a consideration. These factors may eliminate an otherwise good evaluation question.

The evaluation questions can be further reduced or refined by design and data collection factors that I will talk about next.


Slide 20 Objective: Explore issues for consideration in evaluation design selection

Focus of the Evaluation

  • Key Issues in Evaluation Design
    • Will you have a comparison or control group?
    • When will you collect data?
    • Will the data be collected retrospectively or prospectively?
    • What type of data do you need?
    • What data do you have already?

Speaker:

The next section of the plan describes the evaluation design and data collection methods that you will use in the evaluation. Although program evaluation is geared to answering specific questions for specific programs, the designs for answering them often resemble research designs. But it is important to remember that the purpose of evaluation is to improve programs, not to publish generalizable findings, and therefore you need only collect data sufficient to answer your evaluation questions.

There are a number of key issues to consider when selecting an evaluation design:

  1. Will you have a control or comparison group?
  2. Will you collect data at multiple points (e.g., before and after an intervention), just once or at regular intervals during the evaluation?
  3. Will the data be collected prospectively or retrospectively? For example, a chart review of past cases is retrospective. A review of all chart entries forward from a particular date or event is prospective.
  4. Do you need in-depth, detailed information using qualitative methods such as interviews or focus groups, or do you need specific, targeted information collected using a quantitative method such as a questionnaire?
  5. What data do you already have that can be used in the evaluation? Existing data collected for another purpose are an important resource to consider. This kind of data includes ARPE data, case management reports, and patient tracking data.


Slide 21 Objective: Explore issues for consideration in evaluation design selection (continued)

Focus of the Evaluation

  • Other Design Considerations
    • Standards for “good” evaluation
    • Timeliness
    • Stage of development
    • Data needed
  • Strengthen Your Design
    • Mix methods whenever possible
    • Use repeated measures
    • Triangulate

Speaker:

In addition to the key issues noted on the previous slide, other considerations are helpful in developing your evaluation design:

Standards for “good” evaluation – You need to select a design that can answer the stakeholders’ questions, but consider the standards of utility, feasibility, propriety, and accuracy.

Timeliness – Will the evaluation be completed in a timely manner – or within the time needed by your stakeholders?

Consider the stage of development - if the program component or initiative you are evaluating is new, you may not be able to accurately measure outcomes so a design that assesses outcomes is unnecessary.

Data needs – And finally, what additional data are needed to answer your evaluation questions?

Strengthening Your Evaluation Design:

Whatever you choose, remember that no design is perfect and each method for collecting data has limitations. To strengthen the design, consider doing the following.

Mixing methods – means using both qualitative and quantitative methods to collect evaluation data such as using focus groups and questionnaires to answer an evaluation question.

Repeating measures – means that you measure the same concept more than once with the same method, allowing you to gain confidence that your findings are reliable.

Triangulation – refers to establishing the accuracy of information by comparing three or more independent points of view or data sources bearing on the same findings; for example, interviews, observations, and document review of treatment plans.

Return to Slide Text Table of Contents

Slide 22 Objective: Explain indicator and data collection section

Gathering Credible Evidence: Data Collection

Identify indicators, standards, and data sources to address evaluation questions.

  • Indicators
    • Visible, measurable signs of program performance
    • Reflect program objectives, logic model and evaluation questions
  • Program Benchmarks and Targets
    • Reasonable expectations of program performance
    • Benchmarks against which to measure performance

Reference: Table 4 in your Evaluation Plan Guide

Speaker:

This section is where you identify and gather data to answer your evaluation questions.

The first step is to identify indicators that are tied to your evaluation questions, program objectives, and your logic model. Indicators are the visible, measurable signs of your program’s performance. A list of standard indicators is available as an appendix to the guide, and the forthcoming TB Evaluation Handbook provides examples of indicators as well as a method for writing your own. If you write your own indicators, it is important to follow the instructions provided.

Indicators should address both process and outcomes. For example:

Process indicators are related to services provided, resources, coalition and partnership activities, and program implementation and fidelity.

Outcome indicators are related to short- and long-term outcomes, such as changes in behavior and changes in trends.

The next step is to identify program benchmarks and targets. These are reasonable expectations of your program’s performance and they will help you define your program’s success.

Benchmarks and targets come from a number of sources. They can be set by your management team, or they may be implicit in your program’s strategic plan, the TB National Guidelines, or treatment protocols. They can also be set by your stakeholders, which you may learn during your stakeholder assessment. Later, you will use them to judge your program’s performance after you collect and analyze your evaluation data.

Return to Slide Text Table of Contents

Slide 23 Objective: Example of Table 4 from the Guide

Evaluation Question:

  • Have Spanish-speaking persons been treated appropriately for LTBI or TB?

Process and Outcome Indicators:

  • Number of Spanish-speaking persons treated by the clinic for TB and LTBI between January and June
  • Number of times clinical treatment standards are met for Spanish-speaking patients
  • Percent of time that signs and forms are available in Spanish and written for persons with low-literacy skills

Program Benchmarks:

  • Increase in the number of Spanish-speaking patients
  • Clinical standards are met 100% of time
  • Patient education signs and forms in Spanish are available 100% of time; literacy level of materials is at a 3rd grade reading level

Speaker:

In the evaluation plan, state your evaluation questions, the indicators that relate to the question, and the standards that are related to the indicators. Table 4 in the guide can be used to link your evaluation questions, indicators and program standards. Using this format, it’s easy to link these three important elements.

This is a simplified example of Table 4 from the Guide; it links indicators and standards to the question “Have Spanish-speaking persons been treated appropriately for LTBI and TB?”

Note that you will probably have more than one indicator for each evaluation question.

Return to Slide Text Table of Contents

Slide 24 Objective: Explain data collection section

Gathering Credible Evidence: Data Collection

  • Data Collection
    • Where are the data?
    • What methods will be used to collect data?
    • How often will the data be collected?
    • Who will collect the data?
  • Tools for Data Collection
    • Collect only the information you need
    • Easy to administer and use

Reference: Table 5 in your Evaluation Plan Guide

Speaker:

Now that you have evaluation questions and indicators, you need to collect the necessary data to measure the indicators and answer your evaluation questions. More than one data collection method or source may be linked to each indicator. Your plan should explain how you will collect the necessary data for each indicator and address each of the following questions.

  • Where can existing data be found?
  • What methods will you use to collect the data? There are many methods you can use to collect new data including observations, questionnaires, interviews or focus groups.
  • How often will the data be collected?
  • Who is responsible for collecting the data?

After identifying data collection methods or sources, you need to develop or identify tools. Tools are the instruments, documents, or strategies that you will use to collect the data. They include questionnaires, abstraction forms, observation forms, and log books.

Return to Slide Text Table of Contents

Slide 25 Objective: Example of Table 5

Gathering Credible Evidence: Data Collection

Linking indicators and data sources and specifying your data collection plan. Example from the Guide – Table 5.

Speaker:

It is important to link indicators to data sources, and then provide information about your data collection plan, including who is responsible for collecting the data, when it will be collected, and what methods you will use to collect it. Table 5 in the guide will help you do this and record it in your evaluation plan.

This is a simplified version of Table 5 from the guide. Just as Table 4 had multiple indicators for each question, there are likely to be multiple sources of data for each indicator.

Return to Slide Text Table of Contents

Slide 26 Objective: Explain data collection section

Gathering Credible Evidence: Data Collection

  • Human Subjects Considerations
  • Evaluation Timeline
    • Ensures that all stakeholders are aware of what activities are occurring at any time
    • Helps to determine if your evaluation resources will be strained by too many activities happening at once
  • Data Management and Storage
    • Ensures confidentiality and data quality

Reference: Table 6 in your Evaluation Plan Guide

Speaker:

Human subjects considerations – At this point in planning, you need to consider whether your evaluation will require review by your program’s Institutional Review Board (IRB). Many program evaluations are exempt from review, but this is an important consideration when developing your plan.

The final two components of this section – a timeline and a data management and storage plan – are optional but useful parts of your evaluation plan. Developing a timeline ensures that all stakeholders are aware of what activities are occurring at any time. The timeline may also help you determine if your evaluation resources will be strained by too many activities happening at any given time.

Whether or not you include a data management and storage plan in your evaluation plan, these are important considerations. Ask yourself: where will you keep the data, and who will have access to it? Proper handling of your data protects informant confidentiality and ensures high-quality data.

Return to Slide Text Table of Contents

Slide 27 Objective: Explain the components of Analysis and interpretation section

Justifying Conclusions: Analysis and Interpretation

Once the data are collected, analysis and interpretation will help you understand what the findings mean for your program.

  • Analysis
    • What analysis techniques will you use for each data collection method?
    • Who is responsible for analysis?
  • Interpretation
    • What conclusions will you draw from your findings?
    • How will you involve stakeholders?

Reference: Table 7 in your Evaluation Plan Guide

Speaker:

When you select your data collection methods, you also select your analysis methods. For data that can be quantified, you can look at frequencies and counts; if you have sufficient data, statistical techniques can be used. For qualitative data, such as responses to interview questions or observations, you can look for patterns or trends with content analysis.

In your evaluation plan, identify the data collection methods, the analysis techniques, and who is responsible for the analysis. You can use Table 7 in the guide to help you organize this.

Once you have analyzed the data, you – and your key stakeholders – will judge your findings against the program benchmarks or targets. When you are doing this, it is important to consider the context in which your program operates. Be sure that your interpretations are sound, reasonable, and objective. Involving stakeholders in interpretation will help you accomplish this because they can bring insights and explanations to the evaluation findings. Stakeholders can also help develop practical recommendations that can be implemented by the TB program.

In the evaluation plan, explain who will be involved in interpreting the findings and describe the procedures and guidelines you will use to help you interpret the evaluation findings.

Return to Slide Text Table of Contents

Slide 28 Objective: Explain Dissemination and use section

Ensuring Use and Sharing Lessons Learned: Reporting and Dissemination

A plan for dissemination and use of the evaluation findings will avoid having evaluation reports “sit on the shelf.”

  • Dissemination
    • What medium will you use to disseminate findings?
    • Who is responsible for dissemination?
  • Use
    • How, where, and when will findings be used?
    • Who will act on the findings?

Reference: Table 8 in your Evaluation Plan Guide

Speaker:

The purpose of program evaluation is to use the findings to address the goal of the evaluation. An evaluation does not achieve its purpose if the people who need to know about the findings never learn of them. A reporting and dissemination strategy will ensure that evaluation findings are distributed to those who will make use of them. Your evaluation plan should describe what medium you will use to disseminate the findings, who is responsible for disseminating them, how the findings will be used, and who will act on them.

In writing this section of your plan, check the reporting and dissemination plan against the stakeholder needs you identified earlier to ensure that your reports will address those needs, and that the reports will reach them. Although not strictly part of your evaluation plan, you may also want to develop a monitoring plan to ensure that findings are used and changes implemented.

Use Table 8 to depict your dissemination plan.

Return to Slide Text Table of Contents

Slide 29 Objective: Provide TIPs for planning evaluation

Tips for Evaluation Planning

  • Start small – focus on one initiative or program component to start with and limit the number of evaluation questions
  • Use what you already know about the program
  • Consider existing sources of data
  • Be realistic in your timeline and assessment of resources
  • Use the template and tables provided in the guide, adapt as needed
  • Seek help with your evaluation

Speaker:

Evaluation planning can seem like an intimidating task, but in reality it is quite manageable. Our objective in developing the guide and this presentation is to help you through the planning process, and the tips on this slide will make planning easier.

Start small – a large, complex evaluation may initially overwhelm you and your resources for evaluation. Focus your evaluation on one component or one evaluation question.

Use what you know about your program. A lot of the information you need is easily accessible to you. But don’t make too many assumptions about your program – part of the evaluation process is identifying unfounded assumptions.

Use existing sources of information for your evaluation. Programs collect a lot of data for other purposes that are useful in evaluation.

Be realistic in your assessments of resources and your timeline to avoid shortfalls in the middle of the evaluation.

Use the evaluation guide, template and tables and adapt them as needed.

Seek help if you need it. The next slide lists some evaluation resources but there are plenty more that you can find with a short search.

Return to Slide Text Table of Contents

Slide 30 Objective: Provide resources for evaluation and Q&A

Evaluation Resources

Some Web-Based Resources

  • Centers for Disease Control and Prevention
  • W.K. Kellogg Foundation
  • University of Wisconsin Extension

Selected Publications

  • Connell JP, Kubisch AC, Schorr LB, Weiss CH. New Approaches to Evaluating Community Initiatives. New York, NY: Aspen Institute, 1995.
  • Patton MQ. Utilization-Focused Evaluation. Thousand Oaks, CA: Sage Publications, 1997.
  • Rossi PH, Freeman HE, Lipsey MW. Evaluation: A Systematic Approach. Newbury Park, CA: Sage Publications, 1999.
  • Taylor-Powell E, Steele S, Douglas M. Planning a Program Evaluation. Madison, WI: University of Wisconsin Cooperative Extension, 1996.

Speaker:

Here are some references to find additional information about evaluation. And remember that you will have support from CDC in putting your evaluation plan together. Staff are available to help you and additional evaluation products are forthcoming.

Also remember that your evaluation plan is a flexible, living document that can change with new information. It doesn’t have to be rewritten every year – just adjust what needs to change.

Return to Slide Text Table of Contents
