Young Breast Cancer Survivors Program Evaluation

In 2019, CDC funded eight organizations for 5 years to provide structured support services and resources designed to increase survival and improve quality of life for young breast cancer survivors (YBCS) and metastatic breast cancer (MBC) patients. These organizations also provide educational resources for health care providers who serve this population.

CDC evaluated the Young Breast Cancer Survivors Program to articulate outcomes and drive program improvement. Results will be used to grow the program and identify evidence-based best practices for increasing YBCS’ access to psychosocial support systems, lifestyle programs, clinical preventive services, and cancer care.

This evaluation assessed the extent to which programs—

  1. Have the capacity to successfully provide services and support for both YBCS and MBC patients.
  2. Foster and sustain partnerships that are critical to the implementation of support systems and lifestyle programs.
  3. Achieve the outcomes outlined in the notice of funding opportunity.

Evaluation Framework

The evaluation followed the six steps of the CDC Framework for Program Evaluation (engage stakeholders, describe the program, focus the evaluation design, gather credible evidence, justify conclusions, and ensure use and share lessons learned) and its four standards: utility, feasibility, propriety, and accuracy.
Conduct Stakeholder Engagement
  • Meet with CDC to learn about the cooperative agreement and evaluation priorities.
  • Attend or review notes from the kickoff meeting.
  • Convene a meeting with stakeholders.
Describe the DP19-1906 Cooperative Agreement
  • Conduct an environmental scan (document review and key informant interviews) in year 1.
  • Review the DP19-1906 logic model.
  • Develop a brief describing findings of the environmental scan.
Design a Plan for Evaluation, Data Collection, and Data Analysis
  • Develop an evaluation and dissemination plan including an evaluation planning matrix and a data analysis plan.
  • Develop a data collection protocol and instruments.
  • Develop an institutional review board package and obtain approval.
Implement Evaluation
  • Review program documents in years 1, 2, and 3.
  • Collect data through interviews and a survey in years 2 and 3.
Synthesize Findings Across Data Sources
  • Conduct a qualitative analysis of the document review and interview data in years 2 and 3.
  • Conduct a quantitative analysis of the survey data in years 2 and 3.
Disseminate Evaluation Findings with Recommendations for Program Improvement
  • Develop internal briefs in years 2 and 3.
  • Develop a public-facing brief and two sets of conference presentation slides by year 3.
  • Develop the final report and a manuscript for publication in year 3.

Evaluation Design and Methodology

This evaluation used a mixed-methods approach, with the evaluation questions guiding what data were collected. The three data collection activities included—

Annual Document Reviews

Annual document reviews were conducted to—

  • Assess progress on implementation for each program strategy.
  • Determine plans for implementation of each strategy for the following year.
  • Analyze progress on achieving outcomes for each strategy.
  • Identify facilitators and barriers to implementation and evaluation plans for monitoring, evaluation, and dissemination.

Documents reviewed included annual progress reports, proposed evaluation plans, evaluation reports, and proposed work plans.

Key Informant Interviews (Years 3 and 5 of the Program)

The program director or manager was interviewed to gather contextual information on the implementation of strategies, perceptions of reach and outcomes, facilitators and barriers to implementation and evaluation, and perceptions of program sustainability.

Strategy Inventory

Information on the programs’ activities and collaborations was collected.

Evaluation Data Collection Timeline

An environmental scan and key informant interviews were conducted in year 1 of the program to guide the overall evaluation design.

  • Document reviews were submitted each September (2020, 2021, and 2022).
  • Key informant interviews were completed in May 2021 and July 2022.
  • Data for the strategy inventory were collected in four waves: September–December 2020 (submitted January 2021), January–June 2021 (submitted July 2021), July–December 2021 (submitted January 2022), and January–June 2022 (submitted July 2022).

How Evaluation Findings Are Used

  • Facilitate program improvement.
  • Identify lessons learned for future cooperative agreements.
  • Identify promising practices that allow CDC to better support YBCS and MBC patients and their families.
  • Disseminate products that describe program efforts and outcomes.
  • Communicate the effect of policy on YBCS programs.
  • Share data with award recipients for program improvement.

More Information