Training evaluation is the systematic process of collecting information about your training and using it to improve the training. Evaluation provides feedback that helps you determine whether your training achieved its intended outcomes, and it informs decisions about future trainings.
Evaluation is the final phase in the ADDIE model, but you should think about your evaluation plan early in the training design process. Work with training developers and other stakeholders to identify:
- The evaluation purpose
- The evaluation questions
- The data collection methods
Your training stakeholders might include the intended audience, organizational leaders, or others with an interest in the training.
An evaluation purpose explains why you are conducting an evaluation. To help shape your evaluation purpose, consider who will use the findings, how they will use them, and what they need to know.
You might use training evaluation findings to:
- Develop a new training
- Improve an existing training
- Provide instructor feedback
- Determine if your training met the desired outcomes
- Make decisions about resource allocation
Evaluation Purpose Examples
- You have an online training and find that many learners start but do not complete it. You want to conduct an evaluation to determine how to improve completion rates.
- Your program invests heavily in classroom training. You need to know whether the trainings are effective to justify the resources your program is using.
Create evaluation questions that match your purpose. Evaluation questions are broad, overarching questions that support your evaluation purpose—they are not specific test or survey questions for learners to answer.
Evaluation questions are often focused in one of two categories: process or outcome.
Process evaluation questions focus on the training itself—things like the content, format, and delivery of the training.
Process Evaluation Question Examples
- To what extent does the training meet CDC’s Quality Training Standards?
- To what extent did the training reach the intended audience?
- How can we make the training more engaging?
Outcome evaluation questions focus on changes in the training participants—things like learning and the transfer of learning. For more information, see Training Effectiveness.
Outcome Evaluation Question Examples
- How much did learners’ knowledge increase?
- To what extent were learning objectives met?
- To what extent did learners apply what they learned when they returned to work after the training?
Choose data collection methods that will help you answer your evaluation questions. Common methods include tests or quizzes, surveys or questionnaires, observation, expert or peer review, and interviews and focus groups. Identify how long it will take to access this data and how often you will collect it. Develop a timeline for when to collect, analyze, and interpret data so that you will have the information ready when you need it.
Keep feasibility in mind when you select data collection methods. The resources, time, and effort required by your evaluation plan should match the scope of the training and fit within your available resources.
- Basic Principles of Survey Question Development
- CDC’s Division for Heart Disease and Stroke Prevention Program Evaluation Tip Sheet: Evaluating Training Events [PDF – 845 KB]
- Kirkpatrick’s Model of Learning Evaluation [PDF – 1.31 MB]
- Learning-Transfer Evaluation Model
- Training Evaluation Framework and Tools