Eradication: Lessons From the Past
Donald A. Henderson*
The declaration in 1980 that smallpox had been eradicated reawakened interest in disease eradication as a public health strategy. The smallpox programme's success derived, in part, from lessons learned from the preceding costly failure of the malaria eradication campaign. In turn, the smallpox programme offered important lessons with respect to other prospective disease control programmes, and these have been effectively applied in the two current global eradication initiatives, those against poliomyelitis and dracunculiasis. Taking this theme a step further, there are those who would now focus on the development of an inventory of diseases which might, one by one, be targeted either for eradication or elimination. This approach, while interesting, fails to recognize many of the important lessons learned and their broad implications for contemporary disease control programmes worldwide.
On 8 May 1980, the Thirty-third World Health Assembly declared that smallpox had been eradicated globally (1). For the first time in history, mankind had vanquished a disease. It must be borne in mind, however, that this was not the first attempt at global disease eradication but the fifth. Within a month, the Fogarty International Center convened a two-day meeting to explore the question of what diseases should be eradicated next (2). This was the first of a series of conferences of which the present one is the latest. At that first meeting, the list of diseases and conditions nominated ranged from urban rabies to periodontal disease to leprosy. Some spoke of eradication, others of elimination, and yet others of the elimination of a disease as a public health problem -- however that might be defined. A tumultuous discussion eventually culminated in the decision that measles, poliomyelitis and yaws were clearly suitable for at least regional eradication but that there were many other possible candidates.
A sceptical note was sounded at the symposium by the two introductory speakers -- Fenner & Henderson (3,4). They reflected on the broader applicability of disease eradication from their vantage point of nearly 15 years of participation in the just concluded smallpox eradication campaign. Their basic conclusion, in brief, was that there was at that time no other suitable candidate for eradication. As they pointed out, smallpox had a number of highly favourable characteristics which facilitated eradication, including a very heat-stable vaccine which protected with a single dose. No other disease came close to matching these advantages. Despite this, eradication was achieved by only the narrowest of margins. Its progress in many parts of the world and at different times wavered between success and disaster, often decided only by quixotic circumstance or extraordinary performances by field staff. Nor was support for the programme generous, whatever the favourable cost-benefit ratios may have been. A number of endemic countries were themselves persuaded only with difficulty to participate in the programme; the industrialized countries were reluctant contributors; and UNICEF, so helpful to the prior malaria programme, decided that it wanted nothing to do with another eradication programme and stated that it would make no contributions (1). Several countries did make donations of vaccine, and the West African programme, directed by the US Communicable Disease Center, was a critical addition. However, cash donations to WHO during the first 7 years of the smallpox programme, 1967-73, amounted to exactly US$ 79 500 (5). That is not per year but the total for that entire period.
Moreover, in 1980, support for any new eradication effort seemed especially unlikely since the smallpox eradication programme was then being critically maligned by traditional international health planners. To them, the smallpox campaign epitomized the worst of what they characterized as anachronistic, authoritarian, "top-down" programmes which they saw as anathema to the new "health for all" primary health care initiative (6).
Given these considerations, it seemed in 1980 to be little more than an interesting academic exercise to debate what next to eradicate. Having offered this view, Henderson was not again invited to the subsequent workshops, task forces, conferences and special committees on eradication which were later convened. Thus, in reflecting on the lessons to be learned from the yaws, malaria and smallpox campaigns, as I was requested to do, I come to the subject afresh and have had the opportunity to reconsider the question of the next steps in eradication, based on a further 17 years of perspective.
As a reminder, the yaws and malaria campaigns began more or less at the same time, about 1955 (7, 8), and were effectively terminated some 15 years later, in 1970 or soon thereafter. The launch of each was triggered by the introduction of a new technology -- an injectable single-dose long-acting penicillin, for the treatment of yaws, and the availability of large quantities of the inexpensive insecticide DDT, for use in the malaria programme. Surprisingly, neither campaign could draw, prior to its launch, on the experience of large-scale pilot programmes in critical areas which would have served to demonstrate the feasibility of eradication, given the tools and resources available. Had such pilot programmes been conducted, it is doubtful that either campaign would have been initiated. The existence of such prior experience would seem to be an axiomatic requirement before deciding on any eradication initiative. Yet even the Dahlem Conference's otherwise commendable review of lessons provided by past eradication programmes effectively overlooks this fundamental precept (9).
Of the two programmes, malaria was by far the more important: during its 15 years of existence, it accounted for more than one-third of WHO's total expenditures, and its 500-person WHO staff dwarfed all other programmes. The USA alone contributed nearly a thousand million dollars to the effort (10). The yaws campaign, in contrast, was much more modest, little publicized, and little known.
The strategy of the yaws programme called for the screening of patients for clinical disease and their treatment with penicillin. In all, some 160 million persons were examined and 50 million were treated in 46 countries. Besides having failed to validate its strategy in pilot studies, the programme had two glaring deficiencies. First, for the first 10 years of its history there was no surveillance, and so it was unclear what was actually happening. When sample serological surveys were eventually conducted, it was discovered immediately that subclinical infections were far more prevalent than had been recognized, making eradication quite impossible. Second, there was no programme of research, and thus no operational studies which might have demonstrated far earlier the futility of the exercise.
Unlike the little known yaws programme, the malaria campaign, during its existence, dominated the international health agenda (11-13). This programme was active in many countries in Latin America and South Asia as well as Ethiopia, and consumed a substantial proportion of national health expenditures as well as major inputs from WHO and USAID. The programme failed, but lessons derived from malaria eradication were central in shaping the smallpox eradication strategy. Three operating principles were of particular importance. First was the relationship of the programme itself to the health services. It was a tenet of the malaria eradication directorate that the programme could not be successful unless it had full support from the highest level of government. This translated into a demand that the director of the programme in each country report directly to the head of government and that the malaria service function as an independent, autonomous entity with its own personnel and its own pay scales. Involvement of the community at large or of persons at the community level was not part of the overall strategy.
Second, all malaria programmes were obliged to adhere rigidly to a highly detailed, standard manual of operations. It mandated, for example, identical job descriptions in every country and even prescribed specific charts to be displayed on each office wall at each administrative level. The programme was conceived and executed as a military operation to be conducted in an identical manner whatever the battlefield. Third, the premise of the programme was that the needed technology was available and that success depended solely on meticulous attention to administrative detail in implementing the effort. Accordingly, research was considered unnecessary and was effectively suspended from the launch of the programme.
The smallpox eradication campaign had to function differently. Segregating it as an autonomous entity reporting to the head of state was neither politically acceptable nor financially feasible. With a programme budget of only US$ 2.4 million per year, there was no hope of underwriting more than a small proportion of personnel and programme costs. The programme necessarily had to function within existing health service structures and to take advantage of available resources. This, in fact, proved advantageous since, contrary to commonly held belief, underutilized health personnel were abundant in most countries. With motivation and direction, most performed well. It was also discovered that those in the community, such as teachers, religious leaders and village elders, could and did make invaluable contributions. Rigid manuals of operations made little sense given the diverse nature of national health structures, and so broad goals, with provision for flexibility in achieving them, became the accepted mode.
Finally, research initiatives were encouraged at every level. This occurred despite the opposition of senior WHO leadership, who insisted that the tools were in hand, that the epidemiology was sufficiently well understood, and that better management was all that was necessary to eradicate smallpox. Research initiatives included the development of new vaccination devices to replace traditional lancets; field studies, which revealed the epidemiology of the disease to be different from that described in the textbooks and, in consequence, the need for modification of basic operations; the discovery that the duration of vaccine efficacy was far longer than normally stated, making revaccination much less important; operational research, which facilitated more efficient vaccine delivery and case detection; and studies which demonstrated conclusively that there was no animal reservoir. The principle was to ask, again and again, how the programme could be made to operate more efficiently and more effectively. And, indeed, without the fruits of these research efforts, it is highly unlikely that eradication would have succeeded. Even as the last cases were being discovered, a joint Dutch-Indonesian study of a new tissue-culture vaccine was just being completed (14,15). We hoped we would not require it, but we were prepared, should it be needed.
From the beginning of the programme, surveillance for smallpox cases was a basic strategy of the campaign. As expected, it proved to be the ultimate quality control measure, the guide to improved operations, and the yardstick of progress. These principles for conduct of an eradication programme remain valid today and, as applied in guinea-worm eradication (16) and in poliomyelitis eradication in the Americas (17) and western Asia, have proved eminently successful.
One might imagine that the subject of which diseases might next be eradicated would have been a primary topic of conversation among the large and talented group of epidemiologists who, through the late 1970s, were engaged in eradicating smallpox. In fact, I can't recall the question ever having been seriously raised or discussed. Actually, the question didn't seem especially relevant. This is not to say that we regarded the eradication of smallpox as an end in itself. Far from it.
At the time the smallpox eradication programme began, only two vaccines -- BCG and smallpox -- were at all widely used throughout the developing world. Few countries had organized national vaccination programmes, and those that did seldom extended much beyond the larger towns and cities; substandard and/or poorly preserved vaccines were in common use; information about disease incidence was woefully inadequate; and effective supervision was generally poor to nil.
Conceptually, as we envisaged it, an effective campaign required the development of a management structure extending from the capital city to the furthest villages; it required that mechanisms be established to assure that fully potent and stable vaccine was used; and that plans be implemented within the existing health service structure to assure its distribution throughout the country to reach at least 80% of the inhabitants. It demanded that a national surveillance system be established, which was at that time an unknown entity in most countries; and it required that planning be done and goals established to reach a finite end-point within a given period. Most national health ministries had never before attempted an effort of this type. It seemed to us that a successful programme would provide valuable training and experience for health service staff and, most important, would create a skeleton framework permitting other activities to be added. Additional vaccines were obviously a logical further step.
In some countries, simultaneous vaccination with two antigens began soon after the start of the programme. In the 20 countries of western and central Africa assisted by CDC, all administered smallpox and measles vaccines; in a number of countries of eastern Africa, BCG and smallpox vaccines began to be administered at the same time; and in some countries at special risk, yellow fever vaccine was also added. Few developing countries, however, provided DPT, measles or poliovirus vaccine.
With expansion of the immunization programme in mind, WHO organized, in 1970, an international meeting to review the status of vaccination internationally and to recommend model programmes (18). Recommended for general use in the developing countries were smallpox, BCG, DPT, measles and typhoid vaccines. Yellow fever and poliovirus vaccines were recommended for use but only under special circumstances. At that time, poliovirus vaccine was not generally recommended because of uncertainty as to how serious a problem poliomyelitis really was for most developing countries and because of doubts as to how efficacious poliovirus vaccine would prove to be in tropical areas. In 1974, this expanded programme of immunization was approved by the World Health Assembly; in 1977, programme leadership was strengthened and the programme began to grow (19). By then, typhoid vaccine had been dropped from the recommended list and poliovirus vaccine was added.
The step from eradicating smallpox in 31 endemic countries to implementing effective immunization programmes against six diseases in more than 100 countries represented an enormous increase in programme complexity. Nevertheless, remarkable progress has been made in expanding and intensifying immunization activities throughout the world.
In 1990, this culminated in the World Summit for Children and the nominal achievement of the goal of vaccinating 80% of the world's children against six major diseases.
One component of that programme which lagged significantly was surveillance. Not all the EPI diseases lend themselves readily to national surveillance but this did appear feasible, at least for neonatal tetanus, poliomyelitis and measles. However, persuading governments and health workers, whether national or international, that surveillance is as vital for disease control as for eradication proved to be a formidable task. In fact, until 1985, little progress was made.
At that time, Ciro de Quadros, Director of PAHO's EPI Program, visualized an approach to spur the development of national surveillance programmes in Latin America. The goal was the eradication of poliomyelitis from the Western Hemisphere. With poliomyelitis eradication having been determined to be technically feasible and, in the Americas, practicable as well, the countries of PAHO endorsed the eradication goal and, in so doing, committed themselves to the development of a hemisphere-wide surveillance effort (17). Sites reporting suspect cases each week increased from some 500 to more than 20,000. Reporting for acute flaccid paralysis was soon extended to include neonatal tetanus, measles and cholera.
During the course of poliomyelitis eradication in the Americas, new paradigms for community involvement in public health emerged as well as approaches for bringing together public and private sector agencies; national immunization days were demonstrated to be a practicable, often more efficient means for vaccine delivery; new approaches were evolved for the planning and integration of international assistance; a hemisphere-wide laboratory network was created; and new mechanisms for vaccine purchase, utilizing PAHO and UNICEF administrative channels, were established. Poliomyelitis eradication was the visible target of the programme but the agenda was far broader than this and the accomplishments likewise.
With this further background of experience, what now might I offer as lessons to the future? In contemplating this question, it is important to bear in mind that there are two diseases and only two diseases which the World Health Assembly has committed itself to eradicate -- guinea-worm disease and poliomyelitis. Guinea-worm eradication, with Don Hopkins as its brilliant and persuasive advocate and strategist, has been conducted with all due attention to surveillance, to community participation, to political commitment, and to research in shaping an evolving agenda. Despite this, it lags behind scheduled targets and it is clear that its successful conclusion will require a high degree of commitment and political skill. The outcome is not a foregone conclusion but I believe it can and will succeed.
Poliomyelitis programmes have scarcely begun in those areas of Africa and south Asia which all but thwarted global smallpox eradication. Thus, the most difficult and problematical areas and years are still ahead, with programme implementation notably hampered by reliance on a heat-labile vaccine whose efficacy leaves much to be desired and on clumsy diagnostic tools. Fortunately, however, research has begun to appear on the programme's agenda. While we all hope that the programme will be successful, there is much yet to be learned and to be applied before success can be assured.
However, an international commitment has been made and high priority must be given to meeting these goals. A failure, especially in achieving poliomyelitis eradication, could as certainly call into question the credibility of the public health profession as did the collapse of the disastrous malaria eradication effort.
As we contemplate the future, is it necessary or even desirable to restrict ourselves to the narrow question of what disease should next be eradicated or eliminated? Through implementation of the smallpox, poliomyelitis and guinea-worm programmes, innovative breakthroughs have been made in organizing large-scale nationwide campaigns; in devising new methods for approaching and mobilizing communities; in developing effective national surveillance networks and in using the data in evolving better strategies; in fostering effective and relevant research programmes to facilitate disease control; and in mobilizing support at international, national and local levels.
I see these approaches as key steps in revolutionizing and revitalizing public health. Implicit in these new approaches is the setting of measurable goals and a willingness to look at all alternative methods for achieving them without assuming, as we so often have, that every intervention, every vaccine, every drug must somehow be directed or dispensed by some sort of primary health centre. These new initiatives and new approaches are of special relevance as we endeavour to deal with tuberculosis, leprosy, and deficiencies of micronutrients such as iodine and vitamin A. Likewise, use of albendazole, ivermectin and praziquantel on a strategically targeted, community-wide basis could have a profound effect on many types of symptomatic parasitic disease (20). None of these are conditions to be eradicated in our lifetimes, but they are diseases in which far more substantial progress could be made than we are now making while relying primarily on one-on-one traditional curative treatment. As time progresses, it may become apparent that certain of these diseases warrant an eradication effort, or would warrant one if better tools could be made available.
In looking to the future, however, I believe it is critical that we not be blinded to a range of new public health programme paradigms by staring too fixedly at the beacon of a few eradication dreams.
* University Distinguished Service Professor, Johns Hopkins University, Baltimore, MD, USA.