Evaluating Your AMIGAS Program

What to know

Find out whether AMIGAS is meeting your goals.


Evaluation can help you know whether AMIGAS is meeting your goals, and your evaluation results can help you make your program better. We encourage you to begin by making sure your organization agrees on what your program is trying to do and how you will reach your goals.

You can use words or pictures (or both) to show what you will do to reach your goals. Use your answers to the questions on the Building Support for AMIGAS page to help create your description. Be as specific as you can, so that everybody has the same understanding of what you are trying to do. Encourage community health workers to describe how they will know if they are successful.

If you are not familiar with evaluation, you may want to consult outside resources. You can consult with evaluators within or outside of your organization. A local university may have staff or students who can help you. Many online resources and publications can help you with your evaluation. We have provided a list of resources below.

Questions to answer

After you have a clear program description, you need to decide what questions you want to answer. Your evaluation can be used to answer questions about the implementation of AMIGAS, such as how many women you recruit and refer for screening. The evaluation can also answer questions about the outcomes of your work, such as how many women get a Pap test, an HPV test, or both tests (co-testing).

Data sources

The data you need will depend on the specific questions you want to answer. To answer your questions, you may need to draw data from more than one source. Some data sources may be available already in your organization, such as records on the number of women seen for Pap tests or HPV tests.

Below we describe some of the data sources you may want to use to evaluate your AMIGAS program.

  • Contact sheets. You can summarize the information from these sheets to learn how many women you recruited, how many women each community health worker reached, and how many women were screened for cervical cancer after participating in AMIGAS. The follow-up notes may provide insights into why some women are not getting screened.
  • Clinic records. To find out how many women have scheduled or received cervical cancer screening, it is best to get the information directly from the clinics that provide these services. You can do this by reviewing clinic records or patient charts. Discuss this with the clinic in advance to be sure you are meeting federal privacy requirements, such as HIPAA, that protect patient health information.
  • Participant feedback. You also may have questions about how women responded to the education and support they received. You might also want to know about the experiences of women who decided to get screened for cervical cancer. We have provided a sample evaluation form. You can modify this evaluation form to work for you.
  • Participant survey. You may want to ask questions about women's knowledge, attitudes, and intention to get screened both before and after they participate in AMIGAS. This would help you see whether the program did a good job of sharing information and motivating women.
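If your contact sheets are entered into a spreadsheet or other simple electronic record, the counts described above can be tallied with a few lines of code. The sketch below is only an illustration: the field names (`chw`, `screened`) and the sample entries are hypothetical, and would depend on how your program records its contact sheets.

```python
from collections import Counter

# Hypothetical rows transcribed from paper contact sheets: one entry per
# woman, noting which community health worker (CHW) reached her and whether
# she was screened after participating in AMIGAS. Names are illustrative.
contacts = [
    {"chw": "Maria", "screened": True},
    {"chw": "Maria", "screened": False},
    {"chw": "Elena", "screened": True},
    {"chw": "Elena", "screened": True},
]

total_recruited = len(contacts)                      # women recruited overall
reached_per_chw = Counter(c["chw"] for c in contacts)  # women reached per CHW
total_screened = sum(1 for c in contacts if c["screened"])

print("Women recruited:", total_recruited)           # 4
for chw, n in reached_per_chw.items():
    print(f"  {chw} reached {n} women")
print("Screened after AMIGAS:", total_screened)      # 3
```

Even a tally this simple answers three of the evaluation questions listed later in this guide: how many women you reached, how many each community health worker reached, and how many were screened.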

You may want to consider creating some simple evaluation forms to collect information. To get more in-depth answers, you may want to collect more data through:

  • Field notes from community health workers and interviews with community health workers and their supervisors. Input from community health workers and supervisors may provide an insider's perspective on why the program is or is not working.
  • Interviews with women from the community. You can also gain useful information from women in the community. You may learn specific reasons why they are not getting screened for cervical cancer.

Sample evaluation questions and data sources

Has AMIGAS met our implementation objectives? If "no," why not?

  1. Are we implementing the community health worker training and education that we planned?
    1. Training records from supervisor
  2. Are we reaching who we want to reach?
    1. Clinic records
    2. Contact sheets
    3. Interviews with community health workers
    4. Surveys or focus groups of women from the community
  3. Are we reaching the number of women we planned?
    1. Contact sheets
    2. Interviews with community health workers
    3. Surveys with women from the community
  4. What barriers do women in our community have to getting screened for cervical cancer?
    1. Interviews with community health workers
    2. Participant surveys
  5. Are clinics scheduling women for cervical cancer screening tests?
    1. Clinic records
    2. Participant surveys (is there a disparity between participant reports and clinic records?)

Are women satisfied with their participation in AMIGAS? If "no," why not?

  1. Did women find the educational materials interesting and useful?
    1. Field notes from community health workers
    2. Participant feedback forms
  2. Were women satisfied with the quality of education they received from the community health workers?
    1. Participant feedback forms
  3. Were women satisfied with the facilities and services to support their training?
    1. Participant feedback forms
  4. Would women recommend the training to their family and friends?
    1. Participant feedback forms

Has AMIGAS been successful in reaching our screening objectives? If "no," why not?

  1. Are women who were contacted by community health workers getting screened?
    1. Contact sheets
    2. Clinic records
    3. Participant surveys
  2. Are women getting their test results in a timely manner?
    1. Contact sheets
    2. Interviews with women after receipt of screening
  3. What is the average length of time between the first contact and getting screened?
    1. Contact sheets
    2. Clinic records
    3. Interviews with women after receipt of screening
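For the last question above, if your contact sheets and clinic records both include dates, the average wait from first contact to screening can be computed directly. This is a minimal sketch, assuming hypothetical (first contact, screening) date pairs transcribed from those records:

```python
from datetime import date

# Hypothetical pairs of (date of first contact, date screened), matched up
# from contact sheets and clinic records. Real data would need consent and
# privacy safeguards; these values are illustrative only.
records = [
    (date(2023, 1, 10), date(2023, 2, 14)),
    (date(2023, 1, 15), date(2023, 1, 29)),
    (date(2023, 2, 1), date(2023, 3, 22)),
]

# Subtracting two dates gives a timedelta; .days is the wait in whole days.
waits = [(screened - contacted).days for contacted, screened in records]
average_wait = sum(waits) / len(waits)

print(f"Average time from first contact to screening: {average_wait:.1f} days")
```

Tracking this average over time can show whether changes to your program (for example, helping women schedule appointments) are shortening the wait.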

Using evaluation results to make your program better

Good evaluation data can help you discover and celebrate your successes. Evaluating AMIGAS as you go along can help you change things that are not working.

Here are a few ways you can use your evaluation results:

  • Your evaluation can help you identify and build on your strengths. For example, you may learn that having a booth at community health fairs is a good way to recruit women and promote AMIGAS.
  • You may learn ways to improve how you train community health workers, recruit women, or contact community clinics and members. For example, participant feedback may show that women do not like one of the games. You can substitute a new, more engaging game to increase participant interest.
  • Your evaluation can provide accountability to funders, the community, and other partners. Funding agencies and partners may want to know how effective programs are so they can justify continuing, discontinuing, or expanding funding support for the program.
  • Positive results may increase community awareness of cervical cancer and cervical cancer screening. They can contribute to the scientific base for community public health programs. The findings may help to generate creative ideas for future health promotion activities.

Evaluation resources

One or two good resources can help you understand the basic principles of program evaluation. Below we list online resources and books that provide a helpful overview of program evaluation.

  • Introduction to Evaluation (University of Kansas)
  • Chen HT. Practical Program Evaluation: Assessing and Improving Planning, Implementation, and Effectiveness. Sage Publications, 2005.
  • Fitzpatrick JL, Sanders JR, and Worthen BR. Program Evaluation: Alternative Approaches and Practical Guidelines. 3rd ed. Allyn & Bacon, 2004.
  • Patton MQ. Utilization-Focused Evaluation. 4th ed. Sage Publications, 2008.
  • Rossi PH, Lipsey MW, and Freeman HE. Evaluation: A Systematic Approach. 7th ed. Sage Publications, 2004.
  • Russ-Eft DR and Preskill H. Evaluation in Organizations: A Systematic Approach to Enhancing Learning, Performance, and Change. Basic Books, 2009.
  • Wholey JS, Hatry HP, and Newcomer KE, eds. Handbook of Practical Program Evaluation. 2nd ed. Jossey-Bass, 2010.