Young Breast Cancer Survivors Program Evaluation

In 2019, CDC funded eight organizations for 5 years to provide structured support services and resources for young breast cancer survivors (YBCS) and metastatic breast cancer (MBC) patients. These services and resources are designed to increase their survival and improve their quality of life. These organizations also provide educational resources for health care providers who serve this population.

CDC is conducting an evaluation of the Young Breast Cancer Survivors Program that articulates outcomes and drives program improvement. Results will be used to continue growing the program and identify evidence-based best practices for increasing YBCS’ access to psychosocial support systems, lifestyle programs, clinical preventive services, and cancer care.

This evaluation assesses the extent to which programs—

  1. Have the capacity to successfully provide services and support for both YBCS and MBC patients.
  2. Foster and sustain partnerships that are critical to the implementation of support systems and lifestyle programs.
  3. Achieve the outcomes outlined in the notice of funding opportunity.

Evaluation Framework

This evaluation follows the six steps of CDC's Framework for Program Evaluation in Public Health: engage stakeholders, describe the program, focus the evaluation design, gather credible evidence, justify conclusions, and ensure use and share lessons learned. It also applies the framework's four standards: utility, feasibility, propriety, and accuracy.
Conduct Stakeholder Engagement
  • Meet with CDC to learn about the cooperative agreement and evaluation priorities.
  • Attend or review notes from the kickoff meeting.
  • Convene a meeting with stakeholders.
Describe the DP19-1906 Cooperative Agreement
  • Conduct an environmental scan (document review and key informant interviews) in year 1.
  • Review the DP19-1906 logic model.
  • Develop a brief describing findings of the environmental scan.
Design a Plan for Evaluation, Data Collection, and Data Analysis
  • Develop an evaluation and dissemination plan including an evaluation planning matrix and a data analysis plan.
  • Develop a data collection protocol and instruments.
  • Develop an institutional review board package and obtain approval.
Implement Evaluation
  • Review program documents in years 1, 2, and 3.
  • Collect data through interviews and a survey in years 2 and 3.
Synthesize Findings Across Data Sources
  • Conduct a qualitative analysis of the document review and interview data in years 2 and 3.
  • Conduct a quantitative analysis of the survey tool in years 2 and 3.
Disseminate Evaluation Findings with Recommendations for Program Improvement
  • Develop internal briefs in years 2 and 3.
  • Develop a public-facing brief and two sets of conference presentation slides by year 3.
  • Develop the final report and a manuscript for publication in year 3.

Evaluation Design and Methodology

This evaluation uses a mixed-methods approach, and the evaluation questions inform the data collected. The three data collection activities are—

Annual Document Reviews

Annual document reviews are conducted to—

  • Assess progress on implementation for each program strategy.
  • Determine plans for implementation of each strategy for the following year.
  • Analyze progress on achieving outcomes for each strategy.
  • Identify facilitators of and barriers to implementation, as well as plans for monitoring, evaluation, and dissemination.

Documents in the review include annual progress reports, proposed evaluation plans, evaluation reports, and proposed workplans.

Key Informant Interviews (Years 2 and 3 of the Program)

The program director or manager will be interviewed to gather contextual information on the implementation of strategies, perceived reach and outcomes, facilitators and barriers to implementation and evaluation, and perceptions of program sustainability.

Strategy Inventory

Information on the programs’ activities and collaborations is collected.

Evaluation Data Collection Timeline

An environmental scan and key informant interviews were conducted in year 1 of the program to guide the overall evaluation design. Below is a projected data collection timeline for the 5-year program.

The projected timeline shows when data are collected and submitted—

  • Document reviews are submitted each September (2020, 2021, and 2022).
  • Key informant interviews are completed in May 2021 and July 2022.
  • Data for the strategy inventory are collected from September to December 2020 and submitted in January 2021; collected from January to June 2021 and submitted in July 2021; collected from July to December 2021 and submitted in January 2022; and collected from January to June 2022 and submitted in July 2022.

How Evaluation Findings Are Used

  • Facilitate program improvement.
  • Identify lessons learned for future cooperative agreements.
  • Identify promising practices that help CDC better support YBCS and MBC patients and their families.
  • Disseminate products that describe program efforts and outcomes.
  • Communicate the effect of policy on YBCS programs.
  • Share data with award recipients for program improvement.

More Information