Dissemination and Implementation Science for Public Health Professionals: An Overview and Call to Action

Paul A. Estabrooks, PhD1; Ross C. Brownson, PhD2,3; Nicolaas P. Pronk, PhD4,5

Suggested citation for this article: Estabrooks PA, Brownson RC, Pronk NP. Dissemination and Implementation Science for Public Health Professionals: An Overview and Call to Action. Prev Chronic Dis 2018;15:180525. DOI: http://dx.doi.org/10.5888/pcd15.180525.

A Selective Review of the Origins of Dissemination and Implementation Science

Preventing Chronic Disease has a mission to enhance communication among researchers, public health professionals, and policy makers and to integrate research and practice experience with the goal of improved population health. As a result, those involved in dissemination and implementation (DI) science, a growing field of study that examines the process by which scientific evidence is adopted, implemented, and sustained in typical community or clinical settings, have submitted and published their rigorous and relevant work in the journal with a high degree of success. Over the past 2 years, the journal also added a new article type, Implementation Evaluation, to facilitate submission of articles that examine the implementation of evidence-based public health interventions in community and clinical settings. To continue the focus on DI, we wrote this commentary with the following objectives: 1) to provide a brief description of DI science, 2) to demonstrate the shared systems-based focus of DI science and public health practice, and 3) to highlight pathways to move public health–focused DI science forward. We reflect on our own experience and, in doing so, hope to motivate more public health researchers and practitioners to engage in DI research.

DI research emerged — by name — over the past 25 years (1), but its roots can be traced to a much earlier time (2–4). A review of current DI research areas likely would not have seemed out of place in the 1930s through the 1960s. Some examples include the need for clinically relevant and community-relevant research (5), engaging systems and communities as partners in the co-creation of evidence (6), and examining the characteristics of interventions to determine which are more likely to be taken to scale and sustained (7). These topics can be traced back to the origins of action research in the 1940s, the push and pull between pure and applied research in the 1960s, and the diffusion of innovations that spanned both those periods. Indeed, the works of Kurt Lewin (8), Archie Cochrane (9), and Everett Rogers (3,10) provide a strong foundation for DI science.

Kurt Lewin founded the field of action research (4,8). He and other scientists of his day struggled against a paradigm that did not consider practice professionals in the development, implementation, and interpretation of scientific studies. In a critique that sounds like it could have come from the last American Public Health Association annual meeting, Lewin criticized the lack of integration of science and practice as a lost opportunity to understand group dynamics and organizational change processes while also contributing to achieving a community benefit through research. He argued for a pragmatic epistemological approach that combined social theory, experimental or quasi-experimental methods, and practice perspectives that could be used for local decision making and contribute to generalizable knowledge. He developed numerous participatory methods that engaged organizational representatives from the settings where social solutions would be applied, members of the population intended to benefit, and social scientists to collectively conduct diagnostic, participatory, empirical, and experimental action research (8). Action research, whether described as a systems-based approach, participatory dissemination, community-based participatory research, or integrated research–practice partnerships, provides a methodological basis for much of the current DI research. It also underscores the ideal outcomes of public health–focused DI research — a balance of demonstrating local impact while concurrently contributing to generalizable knowledge on how best to move evidence into practice.

Archie Cochrane, the inspiration for the thriving Cochrane Collaboration (11) and the myriad systematic reviews developed to summarize evidence for health care practice and decision making, railed against the focus on pure research over applied research throughout his career (9,12). Indeed, this quote captures his view of the research paradigm of the late 1940s: “I remember being advised by the most distinguished people that the best research should be utterly useless” (9 p432). Cochrane’s approach was grounded in his experience as a prisoner of war in Germany, where he provided care for thousands of soldiers and worried that he may have inadvertently provided therapies that did more harm than good because of the lack of scientific evidence for the medical approaches of the day. As a result, he became an advocate for the use of randomized controlled trials (RCTs) for practical, applied research that could contribute to health care practice in a timely manner. By the early 1970s, Cochrane was advocating for systematic reviews of the literature to compile the findings of research studies and allow for guideline and policy implementation across medical disciplines (2). Cochrane reviews and other systematic review approaches (13) are used broadly in DI and in support of evidence-based public health (EBPH) practice as indicators that a given intervention is appropriate, or inappropriate, for broad-scale adoption, implementation, and sustainability.

Finally, Everett Rogers could be considered the father of DI, with his seminal work Diffusion of Innovations published from the first edition in 1962 through the fifth and final edition in 2003 (3). With his roots in rural sociology, Rogers introduced a theoretical approach that considered the communication of an innovation, over time and through distinct channels, across a social system. He proposed that an innovation could be any idea, practice, or product that is perceived as new to a social system. His S-shaped curve describes the relative rate of adoption: an innovation spreads slowly while innovators and early adopters take it up, rises steeply as the early and late majority adopt it, and slows again as system laggards (a term Rogers, in personal communications, said he wished he had replaced with a less “inherently negative” label) finally adopt it.
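
Although Rogers presented the curve qualitatively, he operationalized the adopter categories statistically, as standard deviation bands around the mean time of adoption. A minimal formal sketch follows, using notation that is ours rather than Rogers’s:

```latex
% Cumulative adoption over time as an S-shaped normal ogive, with
% \mu the mean time of adoption and \sigma its standard deviation:
F(t) = \Phi\!\left(\frac{t - \mu}{\sigma}\right)
% Rogers's adopter categories partition the distribution of adoption times:
%   innovators:      t < \mu - 2\sigma                   (about 2.5%)
%   early adopters:  \mu - 2\sigma \le t < \mu - \sigma  (about 13.5%)
%   early majority:  \mu - \sigma \le t < \mu            (about 34%)
%   late majority:   \mu \le t < \mu + \sigma            (about 34%)
%   laggards:        t \ge \mu + \sigma                  (about 16%)
```

The steepest growth occurs near t = \mu, which corresponds to the steep middle segment of the S-shaped curve described above.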

The characteristics of an innovation that Rogers proposed — compatibility, complexity, observability, relative advantage, and trialability (how easily an innovation can be tested) — are still foundational across the primary theoretical approaches applied in DI research (14–16). In addition to being a foundation of DI theories, Rogers’s work is the basis of many DI process models (ie, models that provide a guiding process for moving an innovation into practice); these models include stages that focus on providing knowledge about the innovation, persuasion based on the innovation characteristics, selection of an appropriate innovation, testing the innovation through implementation, and confirming whether the innovation achieved the desired results to inform a sustainability or discontinuation decision (3,17–19). A simplified description of Rogers’s theory is the application of this innovation–decision process across an S-shaped curve that highlights differences in the rate at which individuals or settings adopt a new innovation (ie, innovators, early adopters, early majority, late majority, and laggards): an innovation enters a social system through the activities of innovators and early adopters and spreads on the basis of its perceived characteristics (ie, compatibility, complexity, cost, observability, relative advantage, and trialability). It is important to note that consideration of innovation characteristics and the applicability of the innovation–decision process occurs across and within each adopter category in a social system.

In case you think these issues are not as relevant today as they were three-quarters of a century ago, the promulgation of evidence, the lack of relevance of that evidence, and the time and capacity needed for public health professionals to adapt and implement new interventions have resulted in a considerable evidence–practice gap (20). Cochrane would be thrilled with the advances in summarizing research for implementation decisions, but we speculate that he would be disheartened to learn that it takes 17 years for 14% of original research to make its way into practice (21). Furthermore, some scientists have posited that the largest return on the approximately $116 billion in public and private funds spent annually on biomedical research in the United States will come from DI research focused on translating currently available research on behavioral contributors to public health: tobacco use, dietary intake, and physical activity (22,23). This underscores the need for scientific advances that speed the translation of public health research into EBPH practice.

Current Dissemination and Implementation Theoretical, Process, and Outcome Models

The more recent emergence of the DI field can be traced to the early 1990s and the energy focused on developing a myriad of DI models to address the evidence–practice gap (17,24–26). To guide public health professionals in framing their DI work, a classification system was developed that arranged models into 3 primary categories: process, explanatory, and outcome models (17). Process models specify the steps, stages, or phases necessary to speed the adoption, implementation, and maintenance of evidence-based interventions in clinical or community settings (19). Explanatory models are theoretical approaches to DI that specify important constructs predicting implementation outcomes and include propositions that can be tested scientifically (3). Outcome models provide a set of potential targets for DI research and allow researchers and public health professionals to plan implementation strategies for specific outcomes and to determine the level of success of a given project or initiative (27).

Most DI researchers use a process model, though few characterize the specific steps taken at each phase of a DI project (6). The recently published Practical Planning for Implementation and Scale-up (PRACTIS) guide is a good example of a process model. PRACTIS was explicitly developed to provide a step-by-step approach for researchers interested in engaging in DI work on physical activity promotion in clinical and community settings (19). The guide directs investigators through 4 overarching steps: 1) identifying and characterizing the implementation setting, 2) identifying and engaging key stakeholders across multiple levels within the implementation setting, 3) characterizing barriers and facilitators to implementation, and 4) problem-solving to address potential barriers. Each step includes numerous activities to complete, with the ultimate goal of co-creating implementation strategies and new evidence to support future implementation initiatives. An appealing feature of PRACTIS and other process models (28–30) is that they provide algorithms and pathways based on if–then questions about potential roadblocks that may be encountered during the implementation process (19).

It is hard not to use Rogers’s Diffusion of Innovations as our example of an explanatory model (3). The theory has been applied broadly and, despite the label of “diffusion,” includes many propositions and hypotheses that can be applied to proactive adoption, implementation, and maintenance research studies; at the time of this writing, Rogers’s work had been cited nearly 97,000 times. Diffusion of Innovations concepts have been adapted and integrated into DI-specific theories to more thoroughly operationalize theoretical constructs and extend them to the uptake, use, and sustainability of evidence-based interventions. For example, Wandersman et al’s interactive systems framework for dissemination and implementation proposes that 3 systems interact to either facilitate or inhibit research–practice translation: a delivery system, a research synthesis and translation system, and a translational support system (16). The framework provides numerous testable hypotheses; for example, systemic readiness to adopt an innovation is a function of the underlying motivation for adoption (based on perceptions of the innovation’s relative advantage and compatibility with system resources), the general capacity of the system to adopt new innovations (eg, transformational leadership, organizational innovativeness), and innovation-specific capacity (based on systemic implementation supports and local expertise relative to the new innovation). Having explanatory theories and applying them is critical to moving, as Lewis and colleagues recently wrote, from simply characterizing DI to advancing understanding of the underlying mechanisms of change (25).

DI outcomes can be generally categorized as implementation, service, or client outcomes (31). One of the earliest proposed and most cited outcome models was the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) planning and evaluation framework published by Glasgow and colleagues in 1999 (27). RE-AIM’s goal was to provide a framework that would balance the focus on internal and external validity to improve the translation of public health interventions to practice. Researchers were encouraged to consider external validity factors associated with the population intended to benefit from the evidence-based intervention when planning and evaluating a project, including reach (penetration into the population and representativeness of those exposed to intervention efforts), effectiveness (changes in health outcomes for those exposed to the intervention), and maintenance (durability of changes in health outcomes for those exposed to the intervention). Researchers were also encouraged to consider contextual factors related to adoption at the staff and setting level (penetration into the population of potential staff and organizational delivery systems and their representativeness), implementation (cost, quality, consistency of delivery), and maintenance of implementation at the staff and organization level (durability of the quality and consistency of delivery). The RE-AIM framework has evolved over 20 years to include consideration of qualitative and quantitative data, consideration of cost across RE-AIM dimensions, and possible combinations of metrics to assess public health impact (32–37).
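To make these dimensions concrete, the following is a minimal sketch of how the proportion-based RE-AIM dimensions are often quantified. All counts and names below are hypothetical, and the reach-weighted effect at the end is one possible combination of metrics rather than an official RE-AIM formula:

```python
# Illustrative RE-AIM calculations with hypothetical counts.
# Reach and adoption are proportions; effectiveness here is a mean change
# in a health outcome among participants.

eligible_individuals = 5000   # population intended to benefit
participants = 750            # individuals exposed to the intervention
eligible_settings = 40        # clinics or community sites approached
adopting_settings = 12        # sites that agreed to deliver the intervention
mean_change = 0.8             # eg, mean improvement in a health outcome

reach = participants / eligible_individuals        # 0.15
adoption = adopting_settings / eligible_settings   # 0.30
reach_weighted_effect = reach * mean_change        # weights effect by reach

print(f"Reach: {reach:.0%}, Adoption: {adoption:.0%}, "
      f"Reach-weighted effect: {reach_weighted_effect:.2f}")
```

Representativeness, implementation quality, cost, and maintenance require additional data (eg, comparing participants with nonparticipants, or repeating measures over time) and are omitted from this sketch.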

The Natural Overlap of Public Health and Dissemination and Implementation Science: Systems-Based Approaches

Ultimately, public health practice is about changing systems through the use of an underlying evidence base, documenting outcomes of systems change, and capturing the underlying reasons (ie, mechanisms) why a systems change occurred to allow for replication within and across public health settings (6). It is through this lens of systems that we consider a major goal of public health practice and DI science: to accelerate the uptake of evidence-based programs, practices, and policies in public health settings. A primary challenge for public health professionals and DI researchers alike is the relevance of evidence developed through research studies to the contextual reality of practice settings (38). Few evidence-based interventions can be implemented according to the same protocol, with the same resources, and with the same level of expertise when translated from a research setting to a practice, system, or policy setting (39,40). Furthermore, top–down rollouts of an evidence-based intervention in settings where talented and effective professionals are working to achieve population-level impacts can inhibit innovation and lead to poor outcomes.

Systems-based, collaborative processes for DI ideally engage practice partners who contribute across the life course of any one project and often across multiple projects (6,41,42). These processes include generation of the research question, development of implementation strategies, adaptation of evidence-based interventions, selection of the research design, implementation of the research, and interpretation of the results. This approach has evolved since the Cochrane era, when evidence in many medical fields was limited and the RCT was promoted as the gold standard. Indeed, an RCT is often not feasible in DI projects and can even inhibit intervention adoption (43,44). One characteristic of DI work is a reliance on matching research designs to specific problems and a focus on pragmatism to answer questions that can benefit the system that is partnering on the research (45). Part of the legacy of Lewin’s work can be traced to complex systems and systems-thinking tools that are foundational areas of learning for public health professionals (46). Systems-thinking tools such as multisector collaboration, iterative learning processes, and transformational leadership allow for much broader adaptability of evidence-based interventions based on their underlying principles or processes (46). Public health professionals are often conveners and organizers of a cross section of community groups interested in improving population health (47). This convening role may include a horizontal systems approach that engages all stakeholder organizations that could be involved in the implementation of an evidence-based approach (eg, employers, faith-based organizations, community centers) as well as a vertical systems approach that acknowledges the need to engage both an administrative decision maker and the staff members who would ultimately be responsible for implementation (48).

A key systems-thinking tool used in DI research and public health practice is an iterative learning process that includes 1) identification of priority areas or needs within a system (similar to community needs assessments completed by public health departments), 2) matching of available evidence to the identified need (community action plans), 3) piloting or implementing strategies, 4) evaluating outcomes, and 5) deciding whether a strategy should be sustained, adapted, or abandoned. Within this iterative learning process, numerous considerations arise, many related to how we define evidence and evidence-based practice. In evidence-based practice, evidence is both used and produced by public health professionals (6). Kohatsu defined this EBPH approach as “the process of integrating science-based interventions with community preferences to improve the health of populations” (49 p419). The concept was later expanded to focus on evidence-based principles that can be used in the context of evidence-informed decision making (50–53), recognizing that public health decisions are based not only on research but also on key contextual variables (eg, political and organizational factors) (54).
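
As a schematic of the 5-step loop just described, the sketch below encodes the sequence and the sustain/adapt/abandon decision. All data, names, and thresholds are hypothetical placeholders of our own invention, not a published algorithm:

```python
# A minimal, self-contained sketch of the 5-step iterative learning process.
# Thresholds and decision rules are illustrative only; real milestones are
# set by the partnering system (see the criteria discussion below).

def decide(outcome_score, milestone=0.8, promising=0.5):
    """Step 5: sustain, adapt, or abandon against preset criteria."""
    if outcome_score >= milestone:
        return "sustain"
    if outcome_score >= promising:
        return "adapt and re-pilot"
    return "abandon"

def iterative_learning_cycle(needs, evidence_registry, outcome_score):
    priority = max(needs, key=needs.get)                # 1) identify priority need
    strategy = evidence_registry.get(priority, "none")  # 2) match evidence to need
    print(f"Piloting '{strategy}' for priority '{priority}'")  # 3) pilot/implement
    print(f"Evaluated outcome score: {outcome_score}")  # 4) evaluate outcomes
    return decide(outcome_score)                        # 5) decision

# Hypothetical inputs: needs scored by a community assessment, and a
# registry mapping needs to evidence-based strategies.
needs = {"physical inactivity": 0.9, "tobacco use": 0.7}
registry = {"physical inactivity": "walking-group program"}
print(iterative_learning_cycle(needs, registry, outcome_score=0.6))
# -> "adapt and re-pilot"
```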

Evaluation of the application of evidence-based principles and processes in real-world systems is key to the iterative learning process. Often, the focus of evaluation is to provide evidence that one or a few factors make a difference in a set of predetermined outcomes while all other factors are held constant. A systems-based approach acknowledges that in public health practice, all factors, even (or especially) those that are not being measured, are dynamic rather than static and influence the context of the evaluation. In other words, evaluations of programs, practices, and policies in the field of health and well-being are complex. Complex systems science offers an approach to evaluation that optimizes the context, does not attempt to control variables that cannot be controlled, and may help the fields of evaluation and pragmatic DI research become more responsive to the needs of practitioners and decision makers (55,56).

Systems-based approaches, by nature, cannot be completed without representation from the system that is intended to change (6). This representation ensures understanding of system goals, resources, and structure and is especially critical to decision making. It also allows translational solutions to align with, and be responsive to, the organizational context. Alignment with organizational practice priorities is paramount, and including both decision makers and implementation staff allows priorities and the practicality of implementation to be considered together. Alignment also means that communications inside and outside the organization should be kept simple and grounded in the day-to-day work so that everyone affected understands one another. Finally, this approach allows systems to set milestones and criteria for determining whether a new EBPH strategy should be continued, adapted, or discontinued (6).

A Call to Action for Public Health Practice and Dissemination and Implementation Science

Throughout this article we have highlighted the sustained relevance of the work of some of the giants in our field, described the importance of explanatory, process, and outcome models, and outlined the common systems-focused basis for DI science and EBPH. Here, and in the Table, we offer recommendations and a call to action that align with the detailed scope of EBPH (57). We suggest that public health professionals 1) make decisions based on the best available, peer-reviewed evidence; 2) use data and information systems systematically; 3) apply planning frameworks that address population health and implementation outcomes; 4) engage community (and, when feasible, research) partners in decision making; 5) conduct sound evaluation across population and implementation outcomes; and 6) share what is learned.

Building and sustaining opportunities for DI science in public health practice requires a combined emphasis on developing individual and team-based skills and capabilities as well as organizational capacity (5,58). Individual skills needed cover a range of core areas including community assessment, adaptation of evidence-based programs and policies, descriptive epidemiology, implementation and action planning, and evaluation. Team-based capabilities include skills to collaborate and the ability to bring together the necessary individual skills within work groups to optimize efficiency. To complement these individual and team-based skills, key organizational capacity includes supervisors’ expectations to use EBPH, access to information resources (eg, academic journals), and a culture and climate supportive of EBPH.

As the field of DI science continues to mature, there is increasing urgency and need for new and expanded approaches to building DI research and practice capacity (59). Because many public health researchers and practitioners lack formal training in one or more core public health disciplines, on-the-job training is urgently needed to improve DI-related skills. In recognition of this need, capacity building (eg, more training and skill development among professionals) has been named a “grand challenge” for public health (60). Capacity for DI research has typically been built through some combination of graduate courses, degree programs, training institutes, workshops, conferences, and online resources. Using a concept mapping process (61), we identified a set of 9 essential concept clusters (Table). To build these competencies, several large-scale DI research training programs (eg, Dissemination and Implementation Research in Health [62], the Implementation Research Institute [63], and Mentored Training for Dissemination and Implementation Research in Cancer [64]) and smaller-scale regional training programs (eg, the Great Plains IDeA CTR Dissemination and Implementation Research Workshop [65] and the University of Colorado Designing for Dissemination Workshop [66]) have been conducted. Similarly, several ongoing practitioner training programs support capacity building (57,67). For example, the Physical Activity and Public Health Course for Practitioners has shown positive benefits in building capacity to design, implement, and evaluate interventions (68).

Given that the field of DI research is relatively young, many gaps exist in the science (69). Two closely related concepts are among the most critical issues for practitioners: scalability and sustainability. Scalability has been described as efforts that follow a systematically timed, planned, and graded series of steps that cumulatively account for the continuously increasing reach or adoption of an intervention until a critical mass is attained and the entire target population is engaged (70), or as deliberate efforts to increase the impact of interventions successfully tested in pilot or experimental projects so as to foster policy and program development on a lasting basis, thereby addressing population health and inequalities (71). Sustainability has been described as the extent to which an evidence-based intervention can deliver its intended benefits over an extended period after external support from the donor agency is terminated (72), or as long-term, ongoing support for a program in relation to an accepted value proposition that balances allocated resources against generated revenues or benefits and includes confirmation of long-term program support through adequate proof of performance (70). A priority area for research is how best to overcome the barriers to scalability and sustainability that limit the benefits of evidence-based practices (73). To date, much of DI practice and research has focused on initial uptake by early adopters of one health intervention at a time. Public health professionals are in a unique position to address challenges of scalability and sustainability with a systems approach, supporting uptake and maintenance of EBPH in complex community settings that serve vulnerable populations.

In summary, a rich body of research knowledge is not moving into the hands of practitioners and policy makers as quickly and efficiently as needed. The DI approaches outlined here can begin to speed up this translation. In doing so, we will more effectively apply EBPH approaches that will use resources more efficiently, account for dollars spent, and increase impact. We encourage public health professionals — in their day-to-day work — to generate evidence that is relevant and describes how best to implement new evidence-based strategies, report on the reasons why those strategies worked, and track the effect of those strategies on implementation and population health.

Acknowledgments

The authors have no conflicts of interest to declare. Dr. Estabrooks is supported in part by the National Institute of General Medical Sciences at the National Institutes of Health (U54 GM115458-01, Great Plains IDeA CTR). Dr. Brownson is supported by the National Cancer Institute at the National Institutes of Health (5R01CA160327), the National Institute of Diabetes and Digestive and Kidney Diseases (1P30DK092950), the Washington University Institute of Clinical and Translational Sciences (UL1 TR000448), and the National Center for Advancing Translational Sciences (KL2 TR000450). Dr. Pronk is supported, in part, by the National Institute for Occupational Safety and Health at the Centers for Disease Control and Prevention (2U19OH008861-12).

Author Information

Corresponding Author: Paul A. Estabrooks, PhD, Department of Health Promotion, College of Public Health, University of Nebraska Medical Center, 984365 Nebraska Medical Center, Omaha, NE 68198. Telephone: 402-559-4325. Email: paul.estabrooks@unmc.edu.

Author Affiliations: 1Department of Health Promotion, College of Public Health, University of Nebraska Medical Center, Omaha, Nebraska. 2Prevention Research Center in St. Louis, Brown School, Washington University in St. Louis, St. Louis, Missouri. 3Department of Surgery (Division of Public Health Sciences) and Alvin J. Siteman Cancer Center, Washington University School of Medicine, Washington University in St. Louis, St. Louis, Missouri. 4HealthPartners Institute, Bloomington, Minnesota. 5Harvard T.H. Chan School of Public Health, Department of Social and Behavioral Sciences, Boston, Massachusetts.

References

  1. Lomas J. Diffusion, dissemination, and implementation: who should do what? Ann N Y Acad Sci 1993;703:226–35, discussion 235–7.
  2. Cochrane AL. Effectiveness and efficiency. Random reflections on health services. London: Nuffield Provincial Hospitals Trust; 1972.
  3. Rogers EM. Diffusion of innovations. 5th edition. New York (NY): Free Press; 2003.
  4. Lewin K. Experiments in social space. Harv Educ Rev 1939;9(9):21–32.
  5. Brownson RC, Fielding JE, Green LW. Building capacity for evidence-based public health: reconciling the pulls of practice and the push of research. Annu Rev Public Health 2018;39(1):27–53.
  6. Estabrooks PA, Glasgow RE. Translating effective clinic-based physical activity interventions into practice. Am J Prev Med 2006;31(4, Suppl):S45–56.
  7. Pronk MC, Blom LT, Jonkers R, Rogers EM, Bakker A, de Blaey KJ. Patient oriented activities in Dutch community pharmacy: diffusion of innovations. Pharm World Sci 2002;24(4):154–61.
  8. Adelman C. Kurt Lewin and the origins of action research. Educ Action Res 1993;1(1):7–24.
  9. Cochrane AL. Archie Cochrane in his own words. Selections arranged from his 1972 introduction to “Effectiveness and Efficiency: Random Reflections on the Health Services” 1972. Control Clin Trials 1989;10(4):428–33.
  10. Rogers EM. Diffusion of preventive innovations. Addict Behav 2002;27(6):989–93.
  11. The Cochrane Collaboration. https://www.cochrane.org. Accessed September 16, 2018.
  12. Chalmers I, Dickersin K, Chalmers TC. Getting to grips with Archie Cochrane’s agenda. BMJ 1992;305(6857):786–8.
  13. Briss PA, Brownson RC, Fielding JE, Zaza S. Developing and using the Guide to Community Preventive Services: lessons learned about evidence-based public health. Annu Rev Public Health 2004;25(1):281–302.
  14. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009;4(1):50.
  15. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q 2004;82(4):581–629.
  16. Wandersman A, Duffy J, Flaspohler P, Noonan R, Lubell K, Stillman L, et al. Bridging the gap between prevention research and practice: the interactive systems framework for dissemination and implementation. Am J Community Psychol 2008;41(3-4):171–81.
  17. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci 2015;10(1):53.
  18. Smith-Ray RL, Almeida FA, Bajaj J, Foland S, Gilson M, Heikkinen S, et al. Translating efficacious behavioral principles for diabetes prevention into practice. Health Promot Pract 2009;10(1):58–66.
  19. Koorts H, Eakin E, Estabrooks P, Timperio A, Salmon J, Bauman A. Implementation and scale up of population physical activity interventions for clinical and community settings: the PRACTIS guide. Int J Behav Nutr Phys Act 2018;15(1):51.
  20. Brownson RC, Eyler AA, Harris JK, Moore JB, Tabak RG. Getting the word out: new approaches for disseminating public health science. J Public Health Manag Pract 2018;24(2):102–11.
  21. Balas EA, Boren SA. Mapping clinical knowledge for healthcare improvement. In: Bemmel J, McCray A, editors. Yearbook of medical informatics 2000: patient-centered systems. Stuttgart: Schattauer; 2000. p. 65–70.
  22. Colditz GA, Emmons KA. The promise and challenges of dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. New York (NY): Oxford University Press; 2018. p. 1–17.
  23. Moses H 3rd, Matheson DH, Cairns-Smith S, George BP, Palisch C, Dorsey ER. The anatomy of medical research: US and international comparisons. JAMA 2015;313(2):174–89.
  24. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med 2012;43(3):337–50.
  25. Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, et al. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health 2018;6:136.
  26. Rabin BA, Lewis CC, Norton WE, Neta G, Chambers D, Tobin JN, et al. Measurement resources for dissemination and implementation research in health. Implement Sci 2016;11(1):42.
  27. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health 1999;89(9):1322–7.
  28. Meyers DC, Durlak JA, Wandersman A. The quality implementation framework: a synthesis of critical steps in the implementation process. Am J Community Psychol 2012;50(3-4):462–80.
  29. Titler MG, Kleiber C, Steelman VJ, Rakel BA, Budreau G, Everett LQ, et al. The Iowa model of evidence-based practice to promote quality care. Crit Care Nurs Clin North Am 2001;13(4):497–509.
  30. Grol R, Wensing M. What drives change? Barriers to and incentives for achieving evidence-based practice. Med J Aust 2004;180(6, Suppl):S57–60.
  31. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health 2011;38(2):65–76.
  32. Kessler RS, Purcell EP, Glasgow RE, Klesges LM, Benkeser RM, Peek CJ. What does it mean to “employ” the RE-AIM model? Eval Health Prof 2013;36(1):44–66.
  33. Estabrooks PA, Allen KC. Updating, employing, and adapting: a commentary on What does it mean to “employ” the RE-AIM model. Eval Health Prof 2013;36(1):67–72.
  34. Harden SM, Smith ML, Ory MG, Smith-Ray RL, Estabrooks PA, Glasgow RE. RE-AIM in clinical, community, and corporate settings: perspectives, strategies, and recommendations to enhance public health impact. Front Public Health 2018;6:71.
  35. Harden SM, Gaglio B, Shoup JA, Kinney KA, Johnson SB, Brito F, et al. Fidelity to and comparative results across behavioral interventions evaluated through the RE-AIM framework: a systematic review. Syst Rev 2015;4(1):155.
  36. Glasgow RE, Estabrooks PA. Pragmatic applications of RE-AIM for health care initiatives in community and clinical settings. Prev Chronic Dis 2018;15:E02.
  37. Dzewaltowski DA, Glasgow RE, Klesges LM, Estabrooks PA, Brock E. RE-AIM: evidence-based standards and a Web resource to improve translation of research into practice. Ann Behav Med 2004;28(2):75–80.
  38. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validation and translation methodology. Eval Health Prof 2006;29(1):126–53.
  39. Chambers DA, Norton WE. The Adaptome: advancing the science of intervention adaptation. Am J Prev Med 2016;51(4, Suppl 2):S124–31.
  40. Pronk NP. Practice and research connected: a synergistic process of translation through knowledge transfer. In: Pronk NP, editor. ACSM’s worksite health handbook. 2nd edition. Champaign (IL): Human Kinetics; 2009. p. 92–100.
  41. Harden SM, Johnson SB, Almeida FA, Estabrooks PA. Improving physical activity program adoption using integrated research-practice partnerships: an effectiveness-implementation trial. Transl Behav Med 2017;7(1):28–38.
  42. Johnson SB, Harden SM, Estabrooks PA. Uptake of evidence-based physical activity programs: comparing perceptions of adopters and nonadopters. Transl Behav Med 2016;6(4):629–37.
  43. Handley MA, Lyles CR, McCulloch C, Cattamanchi A. Selecting and improving quasi-experimental designs in effectiveness and implementation research. Annu Rev Public Health 2018;39(1):5–25.
  44. Kessler R, Glasgow RE. A proposal to speed translation of healthcare research into practice: dramatic change is needed. Am J Prev Med 2011;40(6):637–44.
  45. Glasgow RE. What does it mean to be pragmatic? Pragmatic methods, measures, and models to facilitate research translation. Health Educ Behav 2013;40(3):257–65.
  46. Swanson RC, Cattaneo A, Bradley E, Chunharas S, Atun R, Abbas KM, et al. Rethinking health systems strengthening: key systems thinking tools and strategies for transformational change. Health Policy Plan 2012;27(Suppl 4):iv54–61.
  47. Pronk NP. The role of a trusted convener in building corporate engagement in community health initiatives. ACSM’s Health Fit J 2018;22(1):3.
  48. Maclean LM, Clinton K, Edwards N, Garrard M, Ashley L, Hansen-Ketchum P, et al. Unpacking vertical and horizontal integration: childhood overweight/obesity programs and planning, a Canadian perspective. Implement Sci 2010;5(1):36.
  49. Kohatsu ND, Robinson JG, Torner JC. Evidence-based public health: an evolving concept. Am J Prev Med 2004;27(5):417–21.
  50. Armstrong R, Pettman TL, Waters E. Shifting sands — from descriptions to solutions. Public Health 2014;128(6):525–32.
  51. Yost J, Dobbins M, Traynor R, DeCorby K, Workentine S, Greco L. Tools to support evidence-informed public health decision making. BMC Public Health 2014;14(1):728.
  52. Downey SM, Wages J, Jackson SF, Estabrooks PA. Adoption decisions and implementation of a community-based physical activity program: a mixed methods study. Health Promot Pract 2011;13(2):175–82.
  53. Glasgow RE, Phillips SM, Sanchez MA. Implementation science approaches for integrating eHealth research into practice and policy. Int J Med Inform 2013;83(7):e1–11.
  54. Viehbeck SM, Petticrew M, Cummins S. Old myths, new myths: challenging myths in public health. Am J Public Health 2015;105(4):665–9.
  55. Zimmerman B, Lindberg C, Plsek P. Insights from complexity science for health care leaders. Irving (TX): Edgeware; 2001.
  56. Pawson R, Tilley N. Realistic evaluation. Thousand Oaks (CA): Sage Publications; 1997.
  57. Brownson R, Baker E, Deshpande A, Gillespie K. Evidence-based public health. 3rd edition. New York (NY): Oxford University Press; 2018.
  58. Muir Gray JA. Evidence-based healthcare: how to make decisions about health services and public health. 3rd edition. New York (NY) and Edinburgh (SCT): Churchill Livingstone Elsevier; 2009.
  59. Chambers DA, Proctor EK, Brownson RC, Straus SE. Mapping training needs for dissemination and implementation research: lessons from a synthesis of existing D&I research training programs. Transl Behav Med 2017;7(3):593–601.
  60. Daar AS, Singer PA, Persad DL, Pramming SK, Matthews DR, Beaglehole R, et al. Grand challenges in chronic non-communicable diseases. Nature 2007;450(7169):494–6.
  61. Tabak RG, Padek MM, Kerner JF, Stange KC, Proctor EK, Dobbins MJ, et al. Dissemination and implementation science training needs: insights from practitioners and researchers. Am J Prev Med 2017;52(3, Suppl 3):S322–9.
  62. Meissner HI, Glasgow RE, Vinson CA, Chambers D, Brownson RC, Green LW, et al. The U.S. training institute for dissemination and implementation research in health. Implement Sci 2013;8(1):12.
  63. Proctor EK, Landsverk J, Baumann AA, Mittman BS, Aarons GA, Brownson RC, et al. The implementation research institute: training mental health implementation researchers in the United States. Implement Sci 2013;8(1):105.
  64. Padek M, Mir N, Jacob RR, Chambers DA, Dobbins M, Emmons KM, et al. Training scholars in dissemination and implementation research for cancer prevention and control: a mentored approach. Implement Sci 2018;13(1):18.
  65. Estabrooks PA, Zimmerman L. Great Plains IDeA CTR Dissemination and Implementation Workshop: key factors for developing strong proposals 2018. https://gpctr.unmc.edu/funding-training/training_opportunities/DI_Workshop_2018.html. Accessed September 16, 2018.
  66. Glasgow RE, Rabin BA. Designing for dissemination workshop 2018. http://www.ucdenver.edu/academics/colleges/medicalschool/programs/ACCORDS/sharedresources/DandI/Documents/D4D%20simple%20agenda_v5.pdf. Accessed September 16, 2018.
  67. Jacobs JA, Duggan K, Erwin P, Smith C, Borawski E, Compton J, et al. Capacity building for evidence-based decision making in local health departments: scaling up an effective training approach. Implement Sci 2014;9(1):124.
  68. Evenson KR, Dorn JM, Camplain R, Pate RR, Brown DR. Evaluation of the Physical Activity and Public Health Course for Researchers. J Phys Act Health 2015;12(8):1052–60.
  69. Brownson R, Colditz G, Proctor E. Future issues in dissemination and implementation research. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: translating science to practice. 2nd edition. New York (NY): Oxford University Press; 2018. p. 481–90.
  70. Pronk NP. Designing and evaluating health promotion programs: simple rules for a complex issue. Dis Manag Health Outcomes 2003;11(3):8.
  71. Milat AJ, King L, Bauman AE, Redman S. The concept of scalability: increasing the scale and potential adoption of health promotion interventions into policy and practice. Health Promot Int 2013;28(3):285–98.
  72. Shediac-Rizkallah MC, Bone LR. Planning for the sustainability of community-based health programs: conceptual frameworks and future directions for research, practice and policy. Health Educ Res 1998;13(1):87–108.
  73. Proctor E, Luke D, Calhoun A, McMillen C, Brownson R, McCrary S, et al. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support. Implement Sci 2015;10(1):88.
  74. Leviton LC, Khan LK, Rog D, Dawkins N, Cotton D. Evaluability assessment to improve public health policies, programs, and practices. Annu Rev Public Health 2010;31(1):213–33.

Table. Recommendations for Public Health Professionals to Engage in Dissemination and Implementation Research
Recommendation: Leverage existing system drivers to provide opportunities to advance DI science
Action steps:
  • Study the process of adoption, implementation, and sustainability of new initiatives to integrate evidence-based principles/practice within your organization.
  • When adaptations are made to existing evidence-based approaches, report on the reasons for adaptation and the resulting characteristics of the newly adapted strategy.
  • Keep field notes to track the process of implementation, from the selection of an evidence-based approach to the testing of the impact of the approach, and share your results as public health DI case studies or implementation evaluations.
  • Partner with researchers whose mission is to move science forward in a way that will concurrently fulfill public health system needs (eg, establish academic health departments).

Recommendation: Focus on pragmatic evaluation
Action steps:
  • Use existing measures as a cornerstone of an iterative learning approach to document the success (or not) of new evidence-based strategies.
  • Use principles of evaluability when assessing new interventions that have not been previously evaluated (74).

Recommendation: Identify and use an explanatory, process, and outcome model in your work
Action steps:
  • Report on the effectiveness of your strategy, but move beyond effectiveness to describe whether it worked, why it worked, and how you did it. Using consistent models across projects will allow for comparisons important in practice and will also provide research to move the DI field forward.
  • Use mixed methods and present the best data available to you. Qualitative data on outcomes, mechanisms that led to the outcomes, and processes that were used for implementation can move the field forward.

Recommendation: Develop key competencies related to DI science
Action steps: Seek out opportunities to develop capacity in the 9 key competencies for public health research and practice professionals:
  • Communicate research findings
  • Improve practice partnerships
  • Make research more relevant
  • Strengthen communication skills
  • Develop research methods and measures
  • Consider fit between evidence and practice settings
  • Enhance fit between evidence and practice settings
  • Increase capacity for practical research
  • Understand multilevel context

 
