Preventing Chronic Disease: Public Health Research, Practice and Policy

Volume 2: No. 4, October 2005

SPECIAL TOPIC
Twelve Essentials of Science-based Policy






Bernard C.K. Choi, PhD, MSc

Suggested citation for this article: Choi BCK. Twelve essentials of science-based policy. Prev Chronic Dis [serial online] 2005 Oct [date cited]. Available from: URL: http://www.cdc.gov/pcd/issues/2005/oct/05_0005.htm.

PEER REVIEWED

Abstract

This article presents a systematic framework of 12 essentials, or basic elements, of science-based policy. The 12 essentials are grouped into three categories, or areas, as follows: 1) knowledge generation, which includes credible design, accurate data, sound analysis, and comprehensive synthesis; 2) knowledge exchange, which includes relevant content, appropriate translation, timely dissemination, and modulated release; and 3) knowledge uptake, which includes accessible information, readable message, motivated user, and rewarding outcome.


Introduction

The relationship between science and policy is an important topic in evidence-based public health policy and practice (1). It seems logical to assume that as scientific research generates more high-quality findings, policymakers will make better decisions. However, numerous underlying obstacles exist (2).

A systematic framework can be used to describe the key components that link science to policy. The framework, which consists of three areas that are subdivided into 12 essentials (basic elements), reveals issues and solutions related to science-based decision making. In this article, policy is defined broadly to include not only legislation but also “prudence or wisdom in the management of affairs” and “a definite course or method of action selected from among alternatives in light of given conditions to guide and determine present and future decisions” (3). Therefore, the term policymakers may encompass public health practitioners, public health researchers, and even the general public, because members of the general public make health decisions for themselves and their families.

Science-based policy involves producing high-quality scientific evidence, building bridges between the producers and users of scientific evidence, and incorporating scientific evidence into health policy and practice (4). Accordingly, the three primary areas in science-based policy are knowledge generation, knowledge exchange, and knowledge uptake (Table 1). Within these three areas, the 12 essentials are categorized as follows: knowledge generation — 1) credible design, 2) accurate data, 3) sound analysis, and 4) comprehensive synthesis; knowledge exchange — 5) relevant content, 6) appropriate translation, 7) timely dissemination, and 8) modulated release; and knowledge uptake — 9) accessible information, 10) readable message, 11) motivated user, and 12) rewarding outcome (Table 1).

The names of the three areas described in this framework vary in other articles. For example, knowledge generation (5,6) has also been called knowledge acquisition (7) and knowledge creation (8); knowledge exchange (6,9-11) has been called knowledge dissemination (7,8,12), knowledge transfer (9,11), knowledge brokering (10), knowledge translation (13), and knowledge access (5); and knowledge uptake (6,9) has been referred to as knowledge application (7,8), knowledge utilization (8,12), and knowledge use (5). The meanings of the terms vary slightly. For example, the term dissemination implies a one-way transmission of knowledge, whereas the terms transfer and exchange imply a two-way transfer of information (14). The term brokering seems to be associated with a process, the objective of which is to exchange information (10).


Knowledge Generation

Credible design

Ideally, evidence for policy decisions should be generated from scientific research based on high-quality study designs. In general, the strength of the evidence generated by various study designs follows a hierarchy. Experimental studies such as clinical trials and field trials provide strong evidence; community trials and observational studies such as cohort studies and case-control studies provide moderate evidence; other observational studies such as historical cohort studies, cross-sectional studies, and ecological studies provide weak evidence; and case reports and news reports provide minimal evidence (15-18).

However, the scientific evidence hierarchy is often turned upside down when policy decisions are being made. News reports and case reports often play an important role in policy decisions, because decision makers, including those in the general public, often do not have the time, ability, or expertise to access and synthesize the evidence from high-quality studies. For example, in 1999, the newspaper USA Today published the following health-related headlines: “‘Scars’ May Be Cancer Predictor,” “Persistent Heartburn Is a Cancer Warning Sign,” “Two Drinks a Day Keep Stroke Away,” “Study: High-Fiber Diets Don’t Cut Colon Cancer,” and “No Link Found Between Fat, Breast Cancer” (19). News headlines can be based on inconclusive evidence (e.g., “may be”), scare tactics (e.g., “warning sign”), disregard of details (e.g., the health risks of drinking, such as liver disease), and conflicting messages (e.g., reporting results that are different from numerous other studies).

Even when scientific evidence is produced from adequately designed studies, current knowledge generation can be hindered by a false-positive research cycle (Figure) (20). Consider the following scenario. Evidence relating cellular telephone use and brain tumors is still inconclusive, despite the multiple studies that have been done and the widespread attention given to the topic (21). Assume the null hypothesis is true — that cellular telephone use does not cause brain tumors. In addition, assume that as a result of chance or bias (and not a high-quality study), some researchers report finding an association between cellular telephone use and brain tumors (a false-positive study). Publication of the false-positive study creates concern, and the problem becomes a hot topic (hot topic bias), which results in more studies — perhaps even 100 — that are designed to investigate the potential problem (22). At the conventional significance level, or α level, of .05, five of the 100 studies are expected to have positive results (23); in other words, five of the studies in this example would be expected to find that cellular telephone use causes brain tumors. The researchers who obtain positive results are more likely to document their results and submit papers to a journal (positive results bias), and journal editors are more likely to publish studies with positive results (editor’s bias) (20). In other words, the five positive studies (which are actually false-positive studies) are more likely to be published, and few of the other 95 studies that did not find an association between cellular telephone use and brain tumors will be published (publication bias). The five additional false-positive studies make the topic even more urgent in the research community, and the false-positive research cycle begins again as more studies are designed to assess the problem. Through this biased process, researchers can often “prove” something out of nothing.

Figure. The false-positive research cycle. Two factors are not associated, but a study reports that they are; the false-positive study creates concern; because of hot topic bias, 100 studies are designed to address “the problem”; if α = .05, five of the studies will have false-positive results; because of positive results bias, the five false-positive studies are published; and because of editor’s bias, the cycle begins again.
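The arithmetic behind this cycle can be illustrated with a short simulation. The sketch below is not from the article; it assumes the null hypothesis is true for every study, a false-positive probability equal to α = .05, and hypothetical publication probabilities that strongly favor positive results.

```python
import random

# Illustrative simulation of the false-positive research cycle, under the
# assumptions stated above (true null everywhere, alpha = .05, publication
# strongly favoring positive results).
random.seed(1)

ALPHA = 0.05               # conventional significance level
P_PUBLISH_POSITIVE = 0.90  # hypothetical chance a positive study is published
P_PUBLISH_NEGATIVE = 0.05  # hypothetical chance a negative study is published
N_STUDIES = 100            # studies triggered by the "hot topic"

published_positive = 0
published_negative = 0
for _ in range(N_STUDIES):
    positive = random.random() < ALPHA  # false positive by chance alone
    p_publish = P_PUBLISH_POSITIVE if positive else P_PUBLISH_NEGATIVE
    if random.random() < p_publish:
        if positive:
            published_positive += 1
        else:
            published_negative += 1

total = published_positive + published_negative
share = published_positive / total if total else 0.0
print(f"Published studies: {total}")
print(f"Published false-positive studies: {published_positive}")
print(f"Share of published literature that is false positive: {share:.0%}")
```

Even though no true association exists under these assumptions, most of what reaches print reports one, which fuels the next round of the cycle.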

Accurate data

Bias is defined as the “deviation of results or inferences from the truth, or processes leading to such deviation” (24). The best way to identify bias is by comparing results with the truth, or a gold standard. For example, researchers conducted a study to determine the baseline accuracy of dentists’ readings of dental radiographs (bitewings) (25). The study’s methodology involved constructing 15 models of the posterior part of a natural dentition. The models had extracted teeth mounted in a medium with a radiodensity similar to that of human bone. Bitewing radiographs were taken of the simulated dentitions. The teeth used in the model mounts were removed from the models, serially sectioned, and examined with a microscope for caries. The results of the microscopic examination were established as the gold standard. Dentists were asked to read independently the 15 sets of bitewing radiographs and make treatment decisions about the teeth. The agreement between the dentists’ readings and the gold standard established by the microscope results was poor (mean κ = 0.35) (25).

Even laboratory tests cannot guarantee the accuracy of a study’s data. For example, many physicians use four different types of laboratory tests to diagnose leukemia (routine morphology testing, electron microscopy, cell surface marker identification, and cancer cytogenetics), and the four test results often seem contradictory. An interrater agreement study was conducted, with each of the four laboratories being classified as a rater. The interrater reliability result confirmed that the results were contradictory. Results from the four diagnostic laboratories correlated poorly for cell type identification in leukemia (pairwise κ = 0.17–0.40) (26).
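Both of these examples summarize agreement with the kappa (κ) statistic, which corrects raw agreement for the agreement expected by chance. The following is a minimal sketch of Cohen’s kappa for two raters; the readings are hypothetical and are not the data from either study.

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)

    # Observed agreement: proportion of items the raters classify identically.
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Agreement expected by chance, from each rater's marginal frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)

    return (observed - expected) / (1 - expected)

# Hypothetical treatment decisions ("restore" vs "none") for 8 teeth:
# the microscopy gold standard compared with one dentist's radiograph reading.
gold    = ["restore", "none", "none", "restore", "none", "none", "restore", "none"]
dentist = ["restore", "restore", "none", "none", "none", "restore", "restore", "none"]

print(f"kappa = {cohen_kappa(gold, dentist):.2f}")  # prints kappa = 0.25
```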

Health data often come from health surveys using questionnaires, and the accuracy of the data is likely affected by questionnaire biases. For example, questions or answers may be phrased in a way that misleads respondents and causes them to make an incorrect choice (framing bias) (27). An example of framing bias follows:

Which operation would you prefer?
[ ] An operation that has a 5% mortality
[ ] An operation that 90% of the patients will survive

People may choose the second option when they read “90%” and “survive,” even though a 90% survival rate (which is a 10% mortality rate) is actually worse than a 5% mortality rate.

According to a comprehensive assessment of 109 instances of bias that were found in scientific research (literature review, 4; study design, 31; study execution, 3; data collection, 46; analysis, 15; interpretation, 7; publication, 3), most of the instances of bias were found in the data collection phase of research (46 of 109, or 42%, of the total instances) (22).

Sound analysis

Failure to control for confounding effects is a common problem in data analysis. A confounder is a factor “that can cause or prevent the outcome of interest, is not an intermediate variable, and is associated with the factor under investigation” (24). For example, if researchers were studying the association between drinking alcohol and lung disease, they would need to treat smoking as a potential confounder because 1) smoking is known to cause lung disease, and 2) smoking and drinking alcohol are often associated behaviors. Techniques to control for confounding include stratification and mathematical modeling (28,29).
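A minimal sketch of stratification follows, using the alcohol, lung disease, and smoking example with hypothetical counts (not data from any study). The crude analysis suggests an association, but the stratum-specific and Mantel-Haenszel odds ratios show that smoking accounts for it.

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table: a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    return (a * d) / (b * c)

def mantel_haenszel_or(strata):
    """Mantel-Haenszel summary odds ratio across strata of (a, b, c, d) tables."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Hypothetical counts: exposure = alcohol, outcome = lung disease,
# confounder (stratifying variable) = smoking.
smokers    = (80, 120, 20, 30)    # drinkers vs nondrinkers among smokers
nonsmokers = (10, 90, 30, 270)    # drinkers vs nondrinkers among nonsmokers
crude = tuple(x + y for x, y in zip(smokers, nonsmokers))  # ignoring smoking

print(f"Crude OR (confounded): {odds_ratio(*crude):.2f}")   # about 2.6
print(f"OR among smokers:      {odds_ratio(*smokers):.2f}")    # 1.00
print(f"OR among nonsmokers:   {odds_ratio(*nonsmokers):.2f}") # 1.00
print(f"Mantel-Haenszel OR:    {mantel_haenszel_or([smokers, nonsmokers]):.2f}")  # 1.00
```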

Failing to conduct a sound data analysis could completely change the results of a study. In a mass screening for colorectal cancer, Zheng et al evaluated the accuracy of occult blood testing, using rectoscopy as the gold standard for comparison (30). Clinical and epidemiological data from 60,496 individuals were collected. It was found that of the 477 individuals who had colorectal cancer diagnosed by rectoscopy (the gold standard), 437 were identified as having colorectal cancer by the occult blood test. This corresponded to a test sensitivity of 92% (437/477), which indicated that the occult blood test was a good screening test for colorectal cancer. The results were submitted to a scientific journal, and comments from two reviewers were received. One reviewer was pleased with the study and recommended publication. The other reviewer pointed out a gross error in the calculations and mentioned “work-up bias.” According to the original paper on work-up bias, it is not an easy issue to address (31). An appropriate mathematical procedure was subsequently developed to address the work-up bias (32). Using the new procedure, the occult blood test sensitivity was recalculated to be 28%, indicating that it was not a good screening test for colorectal cancer (30). Therefore, the proper analysis completely reversed the study’s conclusion.
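The mechanism of work-up (verification) bias can be shown with a small numeric sketch. The counts below are hypothetical, and the sketch does not reproduce the correction procedure of reference 32; it only shows how computing sensitivity among verified patients alone inflates the estimate when test-negative individuals rarely receive the gold-standard work-up.

```python
# Hypothetical screening results for 10,000 people (not data from the study):
#                   diseased    not diseased
#   test positive        300             700
#   test negative        700           8,300
tp, fp = 300, 700     # test-positive: diseased, not diseased
fn, tn = 700, 8300    # test-negative: diseased, not diseased

# True sensitivity uses everyone, whether or not they were worked up.
true_sensitivity = tp / (tp + fn)                      # 0.30

# Work-up bias: all test positives get the gold-standard examination,
# but only 5% of test negatives do, so most false negatives are never counted.
verified_fraction_of_negatives = 0.05
verified_fn = fn * verified_fraction_of_negatives      # 35 of 700

naive_sensitivity = tp / (tp + verified_fn)            # about 0.90

print(f"True sensitivity:                      {true_sensitivity:.0%}")
print(f"Naive sensitivity (verified patients): {naive_sensitivity:.0%}")
```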

Comprehensive synthesis

New scientific papers appear constantly: approximately 30,000 biomedical journals are currently in publication, and 17,000 new biomedical books are published every year. On average, physicians would have to read 19 articles each day to stay knowledgeable about new developments in their field (33,34). Comprehensive syntheses of current information are needed to address potential problems such as lack of time and lack of expertise (35). Comprehensive syntheses include narrative reviews, systematic reviews, meta-analyses, meta-databases, inventories of best practices, and public health observatories. They make life easier for consumers of scientific material, such as scientists, physicians, and policymakers, collectively referred to as knowledge users in this article.

A narrative review is a summary of the literature that exists on a particular topic; informal and subjective methods are used to collect and interpret information (33,36). A systematic review is a summary written after a comprehensive search for relevant studies, which are then evaluated and synthesized according to a predetermined and explicit method (33,37,38). A meta-analysis (an analysis of several analyses) takes a systematic review one step further by mathematically aggregating available data from independent studies to yield a more statistically powerful estimate (33,36,39). A meta-database (a database of several databases) includes information about the location, source, content, and other details of the relevant databases (40). An inventory of best practices (or better practices) is created using an approach based largely on less rigorous study designs of practices and programs. The inventory often focuses on particular organizational behaviors for which conclusive quantitative evaluations are difficult to design and execute (41). A public health observatory is more detached from actual health phenomena and events, provides objective descriptions and analyses, and provides forecasting of patterns, interrelationships, processes, and public health outcomes (42,43).
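As an illustration of the quantitative step a meta-analysis adds, the sketch below pools hypothetical study estimates with fixed-effect inverse-variance weighting; the study values are placeholders, not results from any real review.

```python
import math

# Hypothetical study results: (log odds ratio, standard error) for four studies.
studies = [
    (0.40, 0.20),
    (0.10, 0.15),
    (0.25, 0.30),
    (0.35, 0.10),
]

# Fixed-effect pooling: weight each study by the inverse of its variance.
weights = [1 / se ** 2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"Pooled odds ratio: {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(low):.2f} to {math.exp(high):.2f})")
```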

Comprehensive syntheses can be a major undertaking. For example, a lifestyle modification guide was created to prevent and control hypertension. It was a 50-page supplementary issue of a scientific journal based on a review of 37 years (1960 to 1996) of scientific literature on weight, alcohol, exercise, sodium, calcium, magnesium, potassium, and stress and their effects on the body (44).

Some comprehensive syntheses require a review of not only contemporary literature but also historical literature. For example, one analysis was composed of 12 lessons for public health surveillance in the twenty-first century. The lessons were created after conducting a broad review of the historical documents on major epidemics during the past 5000 years (since 3180 BC) and included the plague, smallpox, dancing mania, cholera, the Spanish flu, and lung cancer (45).


Knowledge Exchange

Relevant content

Not all information should be disseminated at once, nor should all of it be provided to everyone; only relevant information needs to be disseminated. For example, depending on the audience, one of two information dissemination approaches can be used: the encyclopedia approach or the fire-alarm approach (46). The encyclopedia approach involves conveying all available information in the form of reports, atlases, Web sites, and other methods. This type of information is needed by knowledge users such as scientists and certain policymakers who need extremely detailed information.

For most policymakers and the general public, the fire-alarm approach may be more appropriate. This approach involves only conveying information when selected indicators are not in the normal range and indicate a potential problem. For example, it has been proposed that new composite indicators for public health, similar to economic indicators such as the Dow Jones average or the consumer price index, be developed to document the relevant health information needed for public health decisions (20,47). Many stockholders trade successfully by buying or selling their stock holdings based on the performance of economic composite indicators. In a similar way, indicators such as a national health index, national heart health index, and national diet index could be helpful to health policymakers.
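A composite indicator of this kind could, in principle, be computed as a weighted average of standardized component indicators, much like a price index. The sketch below is purely illustrative; the indicator names, baseline values, and weights are hypothetical placeholders rather than any proposed national index.

```python
# Hypothetical component indicators: (current value, baseline value, weight,
# whether higher values are better). Names, numbers, and weights are placeholders.
indicators = {
    "daily smoking prevalence (%)":          (18.0, 25.0, 0.3, False),
    "adults meeting activity guideline (%)": (55.0, 50.0, 0.3, True),
    "mean systolic blood pressure (mm Hg)":  (124.0, 128.0, 0.2, False),
    "fruit and vegetable servings per day":  (4.2, 4.0, 0.2, True),
}

def health_index(indicators, base=100.0):
    """Weighted index: 100 means 'at baseline'; above 100 means improvement."""
    score = 0.0
    for value, baseline, weight, higher_is_better in indicators.values():
        ratio = value / baseline if higher_is_better else baseline / value
        score += weight * ratio
    return base * score

print(f"National health index: {health_index(indicators):.1f}")  # about 116
```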

Appropriate translation

As scientists make new discoveries, more sophisticated methods and theories are developed. At some point, the average policymaker and even some scientists cannot understand the information. The key is to strike a balance between providing all available information and providing what is needed by knowledge users.  “Complex models with simple model-user interface” can be used to achieve this goal (48). Following is an example of how such a model-user interface was created for public health practitioners.

During the first months of the severe acute respiratory syndrome (SARS) epidemic in 2003, a mathematical model was developed to predict the spread of SARS (49). The mathematical model can be thought of as a machine, with the engine of the machine comprising a series of four mathematical equations:

C_{ti} = R0^t
C = Σ C_{ti}
D_{ti+d} = C_{ti} × F
D = Σ D_{ti+d}

where C_{ti} indicates the predicted number of incident cases on day ti, with t being time expressed as the number of incubation periods; C, the predicted total number of cases; D_{ti+d}, the predicted number of deaths on day ti + d; D, the predicted total number of deaths; R0, the basic reproductive number (i.e., the expected number of new infectious cases per infectious case); F, the case-fatality rate (i.e., the proportion of cases who die within the symptomatic period); i, the incubation period (i.e., the time from infection to symptoms); and d, the duration of disease (i.e., the time from symptoms to recovery or death).

These equations are complex but do not have to be understood to be used, just as a person who drives a car does not have to understand how the engine works. The model-user interface is simple. The required information for using the previous SARS model to predict the number of SARS cases and deaths consists of only R0, F, i, and d, and the result is a set of several line graphs showing the predicted and observed numbers of SARS cases and deaths. The deviation of the observed numbers from the predicted numbers indicates the success of infection control measures (49).
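A sketch of such a simple model-user interface follows: the user supplies only R0, F, i, and d, and the function returns predicted cumulative cases and deaths by day, following the four equations above. The parameter values in the example run are placeholders, not the values used in the original SARS study (49).

```python
def predict_sars(r0, fatality, incubation_days, disease_days, horizon_days):
    """Predicted cumulative SARS cases and deaths by day.

    Follows the four equations above: incident cases after t incubation
    periods are C_{ti} = R0^t, and the deaths they produce occur d days
    later in proportion F; cumulative counts are the running sums.
    """
    incident_cases = {}    # day -> predicted incident cases
    incident_deaths = {}   # day -> predicted incident deaths
    t = 0
    while t * incubation_days <= horizon_days:
        day = t * incubation_days
        cases = r0 ** t
        incident_cases[day] = cases
        incident_deaths[day + disease_days] = cases * fatality
        t += 1

    results = []
    total_cases = total_deaths = 0.0
    for day in range(horizon_days + 1):
        total_cases += incident_cases.get(day, 0.0)
        total_deaths += incident_deaths.get(day, 0.0)
        results.append((day, total_cases, total_deaths))
    return results

# Example run with placeholder inputs: R0 = 1.5, F = 0.10, i = 5 days, d = 14 days.
for day, cases, deaths in predict_sars(1.5, 0.10, 5, 14, 30):
    if day % 5 == 0:
        print(f"day {day:2d}: cumulative cases {cases:6.1f}, cumulative deaths {deaths:5.1f}")
```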

For the general public, an effective yet simple and basic way to convey, or translate, complex information is by using health proverbs (50). Sayings such as “an apple a day keeps the doctor away” (51) have helped convey important health messages through the years. They were created by our ancestors, and we have the responsibility to create new science-based health proverbs for future generations.

Public health practitioners can learn about knowledge translation techniques from weather forecasters (52), who use symbols (such as a sun partly covered by clouds) and maps to explain the weather. Symbols could be used to denote public health events, and the public could receive short- and long-term public health forecasts and public health alerts, complete with color-coded maps to illustrate public health problems in space and time.

Timely dissemination

Timely dissemination of information requires an ongoing information distribution mechanism. For example, 365 health indicators relevant to the general public could be developed, with one per day being discussed on the evening news (20). After the news and the weather forecasts, the reporter could discuss one of the indicators, such as air pollution during the previous 5 years and its predicted relationship to asthma in the next 3 years. The public would not be expected to watch the news without fail, but if the information dissemination occurred daily, the public’s awareness and knowledge would increase with time (53).

In Canada, approximately 167,456 deaths result from chronic diseases each year. A chronic disease clock was developed by the Public Health Agency of Canada to disseminate information in real time on its Web site (54). The chronic disease clock is a digital clock with two categories: chronic-disease–related deaths so far this year and chronic-disease–related deaths so far today (counted from 12:00 midnight). People can actually watch the number of deaths attributable to chronic disease increase every few minutes because one death occurs every 3 minutes in Canada. The clock keeps running 24 hours per day, 365 days per year.
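The clock’s counts can be derived from nothing more than the annual total and the current time. The sketch below illustrates that arithmetic; it is not a description of how the Public Health Agency of Canada’s clock is actually implemented.

```python
from datetime import datetime

ANNUAL_DEATHS = 167_456  # chronic-disease-related deaths per year in Canada (article figure)

def chronic_disease_clock(now=None):
    """Running death counts for the current year and the current day."""
    now = now or datetime.now()
    start_of_year = datetime(now.year, 1, 1)
    start_of_day = datetime(now.year, now.month, now.day)

    deaths_per_second = ANNUAL_DEATHS / (365 * 24 * 3600)
    deaths_this_year = deaths_per_second * (now - start_of_year).total_seconds()
    deaths_today = deaths_per_second * (now - start_of_day).total_seconds()
    return int(deaths_this_year), int(deaths_today)

year_count, day_count = chronic_disease_clock()
minutes_per_death = (365 * 24 * 60) / ANNUAL_DEATHS  # about 3.1 minutes
print(f"Chronic-disease-related deaths so far this year: {year_count}")
print(f"Chronic-disease-related deaths so far today:     {day_count}")
print(f"About one death every {minutes_per_death:.1f} minutes")
```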

Modulated release

The general public is overwhelmed by health information. As a result, many people do nothing to improve their health because they do not know where to begin. The various types of available information need to be prioritized and disseminated in stages.

For example, the World Health Organization’s The World Health Report contains an immense amount of information (55). Chapter 4 of The World Health Report 2002 is about major health risks. In industrialized countries, the leading risk factors for chronic diseases are tobacco use, high blood pressure, excessive alcohol consumption, high cholesterol, overweight, low fruit and vegetable intake, and physical inactivity. The four major chronic diseases in terms of resulting disability are cardiovascular disease, cancer, chronic respiratory diseases, and neuropsychiatric disorders (55). The information in the chapter can be prioritized for modulated release (56) in three steps. To promote health, the public is told to play it SAFE (with the acronym SAFE representing smoking, alcohol, food, and exercise) — refrain from smoking, drink alcohol in moderation, eat a balanced diet, and increase physical activity. If they do not play it SAFE, they have to call a COP to assess the situation (with COP representing cholesterol, obesity, and pressure) — go for annual medical examinations to assess blood cholesterol levels, weight, and blood pressure. If they do not play it SAFE and call a COP, they have to expect HARM (with HARM representing heart disease, abnormal growth, respiratory disease, and mental disorders) — in the form of chronic conditions such as heart disease, cancer, lung disease, and mental disorders. They would then have to seek treatment. SAFE–COP–HARM concisely summarizes the important information (56).


Knowledge Uptake

Accessible information

Scientific findings must be published in accessible formats. For example, information posted on a Web site may be considered accessible; however, some people do not have access to the Internet. Even people who do have Internet access may have difficulty retrieving a specific piece of information. For example, a Google search of the Internet using the key words health information resulted in 13,200,000 Web sites (53).

Various unique information dissemination tools have been invented. For example, executives at Xerox’s Palo Alto Research Center (Palo Alto, Calif) can monitor the company’s overall share price by watching an office fountain. The water flow is controlled through an Ethernet connection to a computer that has the latest stock data. Flow strengthens when the price increases (57).

New ways to actively market information and make it accessible to various populations are needed. A group of experts at an occupational health workshop for Latin Americans suggested unique ideas such as writing folk songs for the radio on the health effects of pesticides and organizing concerts with themes related to healthy living (58). The Brazilian Ministry of Health distributes a free package of two decks of playing cards, and one health message is written on each card, for a total of 104 health messages. Messages include tips such as “Take a walk with your dog for 30 minutes to burn up to 200 calories” and “Increase your fruit and vegetable consumption to five times a day.” Other ways to make information accessible include incorporating messages into theatrical performances and story-telling sessions (53).

Readable message

To be understood by different audiences, a message must be conveyed in relevant terms. For example, Canadian policymakers readily understand the economic and health impact of smoking on society. For them, a relevant message would be that eliminating tobacco use for 1 year in Canada would save $16.5 billion and prevent 47,000 deaths per year (59). This message may not be relevant to members of the general public who are not interested in policy and economics but are passionate about sports. Instead of telling them about how much society will save if they quit smoking, you could tell them how many important sports events, such as Stanley Cup hockey playoffs, World Cup international soccer games, or Super Bowl football playoffs, they would miss in their lifetime if they continued smoking (59).

For younger audiences, a relevant message such as “smoking makes you ugly” could be an innovative way to convey smoking-related information (60). Teenage smokers who do not care about the long-term health effects of tobacco smoking may be able to relate to the more immediate effects on appearance, such as smoking-induced facial wrinkles (61,62) and baldness (63).

Motivated user

It is important to raise awareness of how scientific evidence can be used to make health policy decisions. The key is to create an atmosphere in which knowledge users are interested in and seeking out scientific knowledge rather than being inundated with unwanted information. Knowledge users can be motivated in many ways, and education plays an important role. Presenting facts is not enough. For example, after returning home from a doctor’s office, a colleague’s teenage son told her that his doctor told him he was obese. The boy then said that he really did not need to worry about the problem because obesity was so common. The boy had the facts but was not motivated to do anything about them.

Educating people by teaching them about the severity and consequences of a health problem helps motivate them to act. For example, obese people need to know that they have a higher risk of developing chronic diseases such as diabetes and heart disease. Knowing the number of people who became blind or had limbs amputated because of diabetes would be a better way to drive home the ramifications of diabetes than simply stating the number of people who had diabetes.

Rewarding outcome

Policymakers and the general public must be convinced that using science for making health decisions will be beneficial and have a noticeable impact on their health — in other words, that it will have a rewarding outcome. For example, mathematical prediction models could help policymakers evaluate how various policies will affect a particular situation. To help show the general public how scientific evidence can be used to make health decisions and improve their health, computer software could be used to calculate the probability of disease risks or overall health outcomes based on input related to personal lifestyle choices, demographics, diet, and smoking (20). For example, a 20-year-old man in excellent health may find out that he is expected to live 75 years. The computer program could be used to show him that if he were to start smoking, he would only be expected to live 67 years (64). The 8-year difference may be rewarding enough for him to decide not to start smoking.
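Such a calculator could be as simple as a baseline life expectancy adjusted by lifestyle inputs. The sketch below is purely illustrative; the baseline and adjustment values are hypothetical placeholders chosen to mirror the 75-year versus 67-year example, not validated estimates.

```python
BASELINE_LIFE_EXPECTANCY = 75  # years, for the hypothetical 20-year-old man above

# Hypothetical adjustments in years; placeholders, not validated estimates.
ADJUSTMENTS = {
    "smoker": -8,
    "physically_active": +2,
    "balanced_diet": +1,
    "heavy_drinker": -3,
}

def estimated_life_expectancy(**factors):
    """Baseline life expectancy plus the adjustment for each factor set to True."""
    years = BASELINE_LIFE_EXPECTANCY
    for factor, present in factors.items():
        if present:
            years += ADJUSTMENTS.get(factor, 0)
    return years

print("If he never smokes:  ", estimated_life_expectancy(smoker=False), "years")  # 75
print("If he starts smoking:", estimated_life_expectancy(smoker=True), "years")   # 67
```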


Conclusion

The science-based policy framework of knowledge generation, knowledge exchange, and knowledge uptake has similarities to Boyer’s research (65). Boyer studied the concept of scholarship and distinguished four kinds of scholarly pursuits: discovery, integration, application, and teaching (65). Many parallels exist between Boyer’s work and the framework described in this article: Boyer’s discovery category parallels the framework’s knowledge generation area, his integration category parallels the knowledge exchange area, and his application category parallels the knowledge uptake area. Overall, education is important in all three areas of the framework.

Corresponding to the 12 essentials are 12 recommendations for the future (Table 2). It is hoped that these recommendations will stimulate additional research and provide evidence for the necessity of a strong evidence base in public health policy.


Acknowledgments

Views presented in this article are those of the author and cannot be attributed to the Public Health Agency of Canada, the University of Ottawa, or the University of Toronto. The article is based on an invited presentation at the Evidence-based Decision Making seminar on January 27, 2004, jointly sponsored by Health Canada and the University of Ottawa, Ottawa, Ontario; an invited presentation at Johns Hopkins University on September 17, 2004, Baltimore, Md; a seminar at the University of Toronto on February 24, 2005, Toronto, Ontario; and a seminar at the University of British Columbia on May 13, 2005, Vancouver, British Columbia.


Author Information

Corresponding Author: Bernard C.K. Choi, PhD, MSc, Centre for Chronic Disease Prevention and Control, Public Health Agency of Canada, PL 6701A, 120 Colonnade Rd, Ottawa, Ontario, Canada K1A 1B4. Telephone: 613-957-1074. E-mail: Bernard_Choi@phac-aspc.gc.ca.

Author Affiliations: Dr Choi is also affiliated with the Department of Epidemiology and Community Medicine, University of Ottawa, Ottawa, Ontario, and the Department of Public Health Sciences, University of Toronto, Toronto, Ontario.


References

  1. Tang KC, Ehsani JP, McQueen DV. Evidence based health promotion: recollections, reflections, and reconsiderations. J Epidemiol Community Health 2003;57:841-3.
  2. Scheel IB, Hagen KB, Oxman AD. The unbearable lightness of healthcare policy making: a description of a process aimed at giving it some weight. J Epidemiol Community Health 2003;57:483-7.
  3. Merriam-Webster Dictionary On-Line [database online] Springfield (MA): Merriam-Webster, Inc [cited 2004 Oct 20]. Available from: URL: http://www.m-w.com/cgi-bin/dictionary*.
  4. Haynes B, Haines A. Barriers and bridges to evidence based clinical practice. BMJ 1998;317:273-6.
  5. CAB International. Knowledge for development. Oxfordshire (UK): CAB International; 2004 [cited 2005 Mar 3]. Available from: URL: http://www.cabibioscience.org/Html/K4DYellow.htm*.
  6. Centre for Global eHealth Innovation. The Global eHealth Innovation Network.  Ontario, Canada: Centre for Global eHealth Innovation; 2005 [cited 2005 Mar 3]. Available from: URL: http://www.ehealthinnovation.org/splash/ehealthflash*.
  7. Knowledge acquisition, dissemination and application [cited 2005 Mar 3]. Available from: URL: http://www.ilabsgroup.com/rcenter/Knowledge%20Acquisition%20Dissemination%20and%20Application.doc*.
  8. The Knowledge Management Forum. What is knowledge management [Internet] [cited 2005 Mar 3]? Available from: URL: http://www.km-forum.org/what_is.htm*.
  9. Lavis JN. Enhancing the uptake of research knowledge in health policy. Presentation slides; 2004 May 6 [cited 2005 Mar 3]. Available from: URL: http://www.researchtopolicy.ca/presentations/slides_39_kte_lse_2004-05-06.pdf*.
  10. Canadian Health Services Research Foundation. Knowledge brokering.   Ottawa, Canada: Canadian Health Services Research Foundation; 2004 [cited 2005 Mar 3]. Available from: URL: http://www.chsrf.ca/keys/use_knowledge_e.php*.
  11. Institute for Work and Health. Knowledge transfer and exchange. Toronto, Canada: Institute for Work and Health; 2004 [cited 2005 Mar 3]. Available from: URL: http://www.iwh.on.ca/kte/kte.php*.
  12. National Center for the Dissemination of Disability Research. NIDRR’s long range plan — knowledge dissemination and utilization.  Austin (TX): National Center for the Dissemination of Disability Research; 2001 [cited 2005 Mar 3]. Available from: URL: http://www.ncddr.org/relativeact/kdu/lrp_ov.html*.
  13. Canadian Institutes of Health Research. Knowledge translation overview. Ottawa, Canada: Canadian Institutes of Health Research; 2004 [cited 2005 Mar 3]. Available from: URL: http://www.cihr-irsc.gc.ca/e/7518.html*.
  14. Health Canada. Getting the right knowledge to the right people at the right time: an invitational roundtable on knowledge transfer. Ottawa, Canada: Health Canada; 2004 [cited 2004 Aug 6]. Available from: URL: http://www.hc-sc.gc.ca/hpfb-dgpsa/nhpd-dpsn/get_right_knowledge_roundtable_e.pdf*.
  15. Choi BCK, Pak AWP, Leake JL. Epidemiologic design options for health risk studies on dental amalgam. Can J Community Dentistry 1996;11:7-12.
  16. Beaglehole R, Bonita R, Kjellstrom T. Basic epidemiology. Geneva, Switzerland: World Health Organization; 1993.
  17. Corbin SB. Concepts of modern risk assessment and management. J Am College Dentists 1994;61:17-25.
  18. Centre for Evidence Based Medicine. Levels of evidence and grades of recommendation, 2001 May. Oxford, England: Centre for Evidence Based Medicine; 2001 [cited 2004 Oct 22]. Available from: URL: http://www.cebm.net/levels_of_evidence.asp*.
  19. Harvard University. Health insight – take charge of health information.  Boston (MA): Harvard University; 2004 [cited 2004 Oct 20]. Available from: URL: http://www.health-insight.harvard.edu/*.
  20. Choi BCK. Perspectives on epidemiologic surveillance in the 21st century. Chron Dis Canada 1998;19:145-51.
  21. Microwave News. Mobile phone — brain tumor risk in the limelight again: focus on Swedish and U.S. epi studies. New York: Microwave News; 1999 May/Jun [cited 2004 Oct 20]. Available from: URL: http://www.microwavenews.com/m-j99vws.pdf*.
  22. Choi BCK, Pak AWP. Bias, overview. In: Armitage P, Colton T, editors. Encyclopaedia of biostatistics. 1st ed. Chichester, Sussex (UK): John Wiley & Sons Ltd; 1998. p. 331-8.
  23. Western Kentucky University. Course BIOL 483. Lecture on “Hypothesis Testing, Type I and Type II Errors.” Bowling Green (KY): Western Kentucky University [cited 2004 Oct 20]. Available from: URL:  http://bioweb.wku.edu/courses/Biol483/483lects3.htm*.
  24. Last JM. A dictionary of epidemiology. New York (NY): Oxford University Press; 2001.
  25. Choi BCK, Jokovic A, Kay EJ, Main PA, Leake JL. Reducing variability in treatment decision making: effectiveness of educating clinicians about uncertainty. Medical Educ 1998;32(1):105-11.
  26. Choi BCK, de Harven E, Bailey DJ, Dube ID, Pantalony D. Cell type identification in leukemia: the level of agreement among four independent diagnostic methods. Cancer Detect Prev 1994;18(5):383-91.
  27. Choi BCK, Pak AWP. A catalog of biases in questionnaires. Prev Chronic Dis [serial online] 2005 Jan.
  28. Choi BCK, de Guia N, Walsh P. Look before you leap: stratify before you standardize. Am J Epidemiol 1999;149(12):1087-96.
  29. Rothman K, Greenland S. Modern epidemiology. Hagerstown (MD): Lippincott-Raven; 1998.
  30. Zheng G, Choi BCK, Yu X, Zou R, Shao Y, Ma X. Mass screening for rectal neoplasm in Jiashan County, China. J Clin Epidemiol 1991;44:1379-85.
  31. Ransohoff DF, Feinstein AR. Problems of spectrum and bias in evaluating the efficacy of diagnostic tests. N Engl J Med 1978;299(17):926-30.
  32. Choi BCK. Sensitivity and specificity of a single diagnostic test in the presence of work-up bias. J Clin Epidemiol 1992;45:581-6.
  33. Klassen TP, Jadad AR, Moher D. Guides for reading and interpreting systematic reviews. Arch Pediatr Adolesc Med 1998;152:700-4.
  34. Davidoff F, Haynes B, Sackett D, Smith R. Evidence-based medicine. BMJ 1995;310:1085-8.
  35. Choi BCK, McQueen DV, Rootman I. Bridging the gap between scientists and decision makers. J Epidemiol Community Health 2003;57:918.
  36. Egger M, Smith GD. Meta-analysis. Potentials and promise. BMJ 1997;315:1371-4.
  37. Delaney BC, Hyde CJ, McManus RJ, Wilson S, Fitzmaurice DA, Jowett S, et al.  Systematic review of near patient test evaluations in primary care. BMJ 1999;319:824-7.
  38. Morrison DS, Petticrew M, Thomson H. What are the most effective ways of improving population health through transport interventions? Evidence from systematic reviews. J Epidemiol Community Health 2003;57:327-33.
  39. Egger M, Smith GD, Phillips AN. Meta-analysis: principles and procedures. BMJ 1997;315:1533-7.
  40. Froese R, Pauly D, editors. FishBase. World Wide Web electronic publication. Glossary searched term: Metadatabase [cited 2004 Oct 22]. Available from: URL: http://www.fishbase.org/Glossary/Glossary.cfm?TermEnglish=metadatabase*.
  41. F/P/T Committee of Officials for the Ministers Responsible for Seniors. An inventory of Canadian programs for the prevention of falls among seniors living in the community. Ottawa, Canada: Health Canada; 2001 [cited 2004 Oct 22]. Available from: URL: http://www.hc-sc.gc.ca/seniors-aines/pubs/inventory/pdf/Inventory_e.pdf*.
  42. Hemmings J, Wilkinson J. What is a public health observatory? J Epidemiol Community Health 2003;57:324-6.
  43. Liverpool Public Health Observatory. Liverpool, England: The University of Liverpool; 2004 [cited 2004 Oct 22]. Available from: URL: http://www.liv.ac.uk/PublicHealth/obs/LPHO.htm*.
  44. Campbell NRC, Burgess E, Choi BCK, Taylor G, Wilson E, Cléroux J, et al. Lifestyle modifications to prevent and control hypertension. Can Med Assoc J 1999;160(9 suppl):S1-6.
  45. Choi BCK, Pak AWP. Lessons for surveillance in the 21st century: a historical perspective from the past 5 millennia. Soz Praventivmed 2001;46:361-8.
  46. Choi BCK, Orlova A, Marsh M, Issa N, Morrison H. Two information dissemination approaches for public health decision-makers: encyclopaedia and fire alarm. J Epidemiol Community Health 2004;58:634.
  47. Choi BCK, Pak AWP, Ottoson JM. Understanding the basic concepts of public health surveillance. J Epidemiol Community Health 2002;56:402.
  48. Choi BCK. Future challenges for diagnostic research: striking a balance between simplicity and complexity. J Epidemiol Community Health 2002;56:334-5.
  49. Choi BCK, Pak AWP. A simple approximate mathematical model to predict the number of severe acute respiratory syndrome cases and deaths. J Epidemiol Community Health 2003;57:831-5.
  50. Choi BCK, Pak AWP, Choi JCL, Choi ECL. Health proverbs. J Epidemiol Community Health 2004;58:1010.
  51. Simpson J. The concise oxford dictionary of proverbs. New York (NY): Oxford University Press; 1992.
  52. Choi BCK. Public health practitioners can learn from the weather forecasters. J Epidemiol Community Health 2004;58:450.
  53. Choi BCK. Innovative ideas needed for timely and effective information dissemination. J Epidemiol Community Health 2005;59:259.
  54. Centre for Chronic Disease Prevention and Control, Public Health Agency of Canada. Chronic disease clock. Ottawa, Ontario: Centre for Chronic Disease Prevention and Control [cited 2004 Nov 16]. Available from: URL: http://www.phac-aspc.gc.ca/ccdpc-cpcmc/index_e.html*.
  55. World Health Organization. The World Health Report 2002.  Geneva, Switzerland: World Health Organization; 2002 [cited 2004 Oct 22]. Available from: URL: http://www.who.int/whr/2002/en/*.
  56. Choi BCK. Modulated release of health risk information to the general public with the use of mnemonics. J Epidemiol Community Health 2004;58:809.
  57. National Post. Sowing PARC’s next revolution: Xerox squandered the 1970s innovations of its Palo Alto Research Center, but hopes to make amends. Ontario, Canada: CanWest Interactive Inc; 1999 Apr 21. p. C13. Available from: URL: http://financialpost.informart.ca/ar/ar_form.php*. 
  58. Choi BCK, Eijkemans GJM, Tennassee LM. Prioritization of occupational sentinel health events for workplace health and hazard surveillance: the Pan American Health Organization experience. J Occup Environ Med  2001;43:147-57.
  59. Choi BCK. Understanding the basic principles of knowledge translation. J Epidemiol Community Health 2005;59:93.
  60. Canny AM, Goodrich TW. Smoking makes you ugly – an innovative approach to smoking cessation. AORN J 2001;74:722-5.
  61. Koh JS, Kang H, Choi SW, Kim HO. Cigarette smoking associated with premature facial wrinkling: image analysis of facial skin replicas. Int J Dermatol 2002;41:21-7.
  62. Goldstein N. Tobacco isolated as a cause of skin aging still another reason to quit smoking! Hawaii Med J 2002;61:272.
  63. Trueb RM. Association between smoking and hair loss: another opportunity for health education against smoking? Dermatology 2003;206:189-91.
  64. Insurance Toronto. Life expectancy calculator [Internet]. Ontario, Canada: Insurance Toronto [cited 2004 Oct 20]. Available from: URL: http://www.insurancetoronto.com/calculators/html*.
  65. Boyer EL. Scholarship reconsidered: priorities of the professorate. Princeton (NJ): The Carnegie Foundation for the Advancement of Teaching; 1990.


*URLs for nonfederal organizations are provided solely as a service to our users. URLs do not constitute an endorsement of any organization by CDC or the federal government, and none should be inferred. CDC is not responsible for the content of Web pages found at these URLs.

 



Tables

Table 1. Three Areas and Twelve Essentials of Science-based Policy

Knowledge Generation
  1. Credible design
  2. Accurate data
  3. Sound analysis
  4. Comprehensive synthesis

Knowledge Exchange
  1. Relevant content
  2. Appropriate translation
  3. Timely dissemination
  4. Modulated release

Knowledge Uptake
  1. Accessible information
  2. Readable message
  3. Motivated user
  4. Rewarding outcome
Table 2. Twelve Recommendations for the Future of Science-based Policy

Knowledge generation
  Credible design: Use high-quality study designs and apply a systematic approach in research to prevent the false-positive research cycle.
  Accurate data: Apply existing methods and develop new methods for reducing bias and increasing the accuracy of data obtained from scientific research.
  Sound analysis: Apply sound analysis methods to produce high-quality results from scientific research.
  Comprehensive synthesis: Use existing tools and develop new tools for summarizing scientific findings.

Knowledge exchange
  Relevant content: Apply existing methods and develop new methods to extract relevant content from existing information.
  Appropriate translation: Develop new techniques for information translation, and simplify the science–user interface.
  Timely dissemination: Develop innovative ways to disseminate information in a timely way.
  Modulated release: Create new methods for organizing the release of prioritized information.

Knowledge uptake
  Accessible information: Invent new ways to market health information and make it more accessible.
  Readable message: Produce information in a readable, understandable format that is relevant to the audience.
  Motivated user: Educate and motivate policymakers so that they actively seek out scientific evidence to make decisions.
  Rewarding outcome: Develop new ways to effectively show how using science to make decisions is beneficial.


 



 



The opinions expressed by authors contributing to this journal do not necessarily reflect the opinions of the U.S. Department of Health and Human Services, the Public Health Service, the Centers for Disease Control and Prevention, or the authors’ affiliated institutions. Use of trade names is for identification only and does not imply endorsement by any of the groups named above.

