
Source: The Chronicle of Higher Education, June 27, 2003

THE CHRONICLE REVIEW

The Personal Predicament of Public Health

By JANE S. SMITH

Jane S. Smith is a writer and adjunct professor of history and of preventive medicine at Northwestern University. Her books include Patenting the Sun: Polio and the Salk Vaccine (William Morrow, 1990).

These are the glory days of public-health awareness. Officials speak of "public-health infrastructure" and "first-line responders" with an unprecedented confidence that their listeners will know what they mean, and care. Concerned citizens -- and we are all concerned citizens now -- follow the front-page diagrams charting the spread of SARS, with the same worried diligence once applied to maps showing the possible trajectories of enemy missiles.

But how much has the public perception of public-health programs really changed? Diseases that once might have taken years or even decades to travel from their location of origin now can appear within days. Modern techniques of genetic sequencing led to an astonishingly fast identification of the SARS virus, and electronic communication made it possible to act swiftly and globally against the newly discovered infectious agent as never before. Still, does the accelerated pace at which severe acute respiratory syndrome was discovered, spread, and combated indicate a real change in responses to public-health crises, or merely a shorter timeline?

Watching the story of SARS as it emerged day by day, I thought about a talk I gave several years ago to the department of preventive medicine at my university's medical school. Describing what I called "The Public View of Public Health," I pointed out that most people outside the profession rarely think about public health at all, barely understand what it is, and, when forced to apply the mandates of public-health agencies to their own lives, frequently view those mandates as coercive, biased, dangerous, and otherwise not at all in their own best interests.

Needless to say, that was not the view my audience held of its work. People who devote their careers to bringing the benefits of sanitation, medication, vaccination, prenatal screening, TB testing, nutrition, exercise, pure foods and drugs, occupational safety, and a host of other laudable initiatives to all segments of society, including the least advantaged, tend to assume that they are the good guys. Most members of the general population agree, in the abstract. They often change their minds, however, when a specific health policy forces them to alter private behavior, undergo personal inconvenience, sustain a financial loss, or endure a known risk (however small) to gain an uncertain benefit (however large).

In my talk, I cited quite a few recent examples of resistance to public-health mandates, from military personnel suing to avoid compulsory smallpox inoculation to AIDS patients refusing to identify sexual partners who could then be notified of exposure. I described some of the historic sources of reluctance to comply with even the most elementary principles of disease prevention. The first problem, ironically, was a product of earlier successes, I said. A population coming to adulthood after the near-eradication of epidemic diseases like diphtheria and whooping cough has a different standard of risk aversion than earlier generations did; people are more worried about the small chance of vaccine-associated injury than about once-dreaded diseases they have never seen.

A second problem was the legacy of past injustices. Minority populations familiar with historic abuses of research often forgo services that would benefit their health. The infamous Tuskegee Syphilis Study, for example, which over a 40-year period denied African-American men treatment for, or even information about, their disease, created a lasting mistrust of government programs. Yet another source of resistance to public-health programs was the culture of personal gratification. In an era that exalts self-fulfillment, many people claim the right to risk significant unnecessary injury by, for example, not using seat belts or bicycle helmets, even though society is often left bearing a large part of the social and financial costs of their behavior.

Beyond the influences of history and culture, however, I saw a larger problem: the significant gulf that exists between public-health professionals and the public they serve in understanding just what public health is. To practitioners, the best programs are continuous, inclusive, and preventive. To much of the public at large, the only really noticeable public-health programs are extraordinary reactions to a specific crisis, whether it be epidemic disease, toxic cloud, tainted food, or contaminated water.

The principal benefit of a strong public-health infrastructure, the continuing practice of preventive medicine, is invisible, since its triumphs consist of bad things that do not happen. In the boom-or-bust cycles of popular attention, it is difficult to get people excited about the day they did not develop diabetes or high blood pressure, did not drink contaminated water, did not have an underweight baby, or did not die from an accident on the job. As a result, I concluded, the emerging challenge to contemporary public-health practice would be convincing people to care about the growing list of mundane, good-for-you programs that could, indeed, save their lives, but in ways they would never notice.

Shortly after my talk, I began taking part in a program of seminars for medical students who were simultaneously studying for master's degrees in public health. My area was history, and we concentrated on a series of issues that illustrated stages in the development of public-health awareness in the Western world. Our topics ranged from biblical and medieval regulations for the treatment of lepers to 19th-century sanitary reforms, from cooperative efforts like the successful 20th-century mobilizations to eradicate polio to the highly contentious contemporary movement to treat handgun violence as a threat to public health. We explored the successes and failures of the past, the perennial interaction of politics and medicine, and the way economic and technological changes create both new crises and new solutions. We noted the recurrent history of economic favoritism or social bias that had been cloaked in seemingly scientific health campaigns in the past, and we discussed the importance of including public values and community voices in the planning for any campaigns of the future.

"Remember," I said, probably far more often than necessary, "when people see the public-health agent entering their home, closing their business, recommending a vaccination that makes the baby 'just a little sick,' or urging their parents to forgo traditional cures and take an unfamiliar medicine, their first impulse is not always to embrace that agent as their new best friend."

Then came the series of extraordinary public-health alarms of the past two years. Terrorism. Bioterrorism. Anthrax. Smallpox. Now SARS. It would be absurd, in 2003, to insist that a central challenge of public-health administration is to persuade people of the need for programs to protect society's physical well-being. Now, the issue seems to be how fast we can meet the demand for public-health services in response to concerns as varied as post-traumatic stress disorder and maximum security for municipal water supplies, how to finance increased epidemic surveillance without abandoning existing health programs, and how, somehow, to maintain a state of high vigilance while simultaneously remembering that diet and exercise are just as important to personal health as duct tape and a pocket stash of Cipro.

Even as public-health officials everywhere bewailed their limited budgets and increased responsibilities, they were pleased that the importance of their work was finally being recognized. In December 2001, The Wall Street Journal reported a "national shift in priorities -- toward greater community safety and away from personal freedom and the economic good above all else." In January 2003, Robert F. Meenan, dean of the Boston University School of Public Health, recalled the anthrax attacks as "a marketing campaign we could never have bought," and told The Boston Globe that "it changed people's perspective about public health."

But have the challenges of the past two years really brought about a new era of good feeling about public health, a time when public groups and private individuals work together to improve the health and safety of the community?

Not quite. Responses to real and potential threats reveal as clearly as ever that public perceptions differ in significant and predictable ways from official assessments of risk. Even after authorities deemed the air quality at ground zero "acceptable" after 9/11, New Yorkers living nearby bought filters and gas masks, or moved away. As government agencies grappled with the terrors of letter-borne anthrax attacks, employees of the Postal Service wondered why congressional offices were evacuated before mail-processing plants were. And, as the SARS epidemic unfolds, there is an unfortunate familiarity to the tensions between public and personal responses, a pattern of mutual mistrust and resistance that is surprisingly consistent across times, cultures, and political systems.

What is that pattern? On the official side, initial cases are ignored or misdiagnosed. By the time medical personnel begin recognizing the presence of an unusual outbreak, those in power try to suppress the information, more fearful of economic downturn, mass panic, or loss of authority than of epidemic disease. Local business and political leaders insist that there is no problem, or that the problem is being overblown, and make a great show of their own willingness to enter some area that has been feared unsafe. Then the doctors and nurses start dying, the public demands information, and the authorities begin to act by cracking down on the poor, the powerless, and the convenient. Stray dogs are killed. Students are quarantined and foreigners detained.

Meanwhile, individuals seek their own remedies. Children continue their games but now wear protective clothing or face masks. Nontraditional cures become hot sellers. Some people panic, while others bewail the diversion of attention and resources from more widespread but less sensational threats to community health. Many who never fall ill still find their plans canceled and their lives rearranged by the emergency. Those who really are a danger to others don't know they are sick, or don't think they could possibly infect anyone else, until it is too late. Specific public-health problems vary greatly, of course. Few people today would support elaborate and expensive confinement of lepers or seek royal amulets to protect them from scrofula, just as no one in the rapidly receding past millennium worried about SARS. But if they had, they probably would have reacted in a way we would recognize.

Recent events in Hong Kong, in Beijing, and in Toronto -- three very different urban centers -- show the familiar historic syndrome of denial, anger, flight, blame, false assurances, unequal enforcement, and eventual action. On April 24, the World Health Organization urged travelers to avoid Toronto, outraging Canadians by suggesting that it was risky to visit this modern city, with its state-of-the-art medical capacities and its heavy economic reliance on conventions and tourism. Restrictions on travel and trade, however prudent, always put an unwelcome financial burden on the local economy, and the protests from Toronto sounded very much like those that arose against quarantines from the merchant quarter of every city of Europe during the plague years that started in the 14th century. Photographs of Chinese villagers in Guchang, 10 miles north of Beijing, constructing roadblocks and putting up signs barring "people and cars not from this village" recall photographs of similar signs and roadblocks designed to keep New York City children away from nearby suburbs during the polio epidemic of 1916.

In another city outside Beijing, residents rioted at officials' attempts to convert a school building into a quarantine hospital for suspected victims of SARS, reacting precisely as Americans had during the cholera epidemics of 1832, when, according to the historian of medicine Charles Rosenberg, "neighbors resorted to everything from humble petitions to arson in their efforts to have [the hospitals] removed." In Taiwan, where the libertarian culture of democracy was ruefully blamed for local disregard of quarantine orders, a former deputy ambassador to the United States explained to The New York Times, "Everyone in Taiwan thinks he's special and smart -- why should he observe the rules?"

But a sense of personal exemption from both regulations and dangers is hardly unique to modern democratic societies. In the fall of 1799, when the Spanish colony in Mexico City was ravaged by smallpox while people avoided the new inoculation hospital, the doctor in charge explained that "the innate repugnance of those who were naturally healthy to voluntarily contract a sickness by artificial means, as well as their hopeful expectations that they might avoid being among those who would be infected, all of this served to persuade the people they need not be inoculated."

By the time a new class of students reconvenes next year to consider the history of public health, SARS will still be on the watch list of public-health officials but will probably be off the front pages of the local news. New outbreaks will occur, perhaps as often as outbreaks of influenza, but what had been a terrifying new epidemic will become a chronic, serious disease. Increased surveillance and advanced biomedical knowledge, combined with the classic remedies of time and luck, eventually will bring this latest emerging epidemic under control, though only after it has exacted an enormous toll, both in lives lost and in activities and economies disrupted. But as the crisis wanes, as we hope and trust it will, the perennial and universal tensions of any broadly based health initiative should be remembered.

Public-health practice is the product of a constant, shifting tug of war between individual liberty and general welfare, between voluntary compliance and coercion, and between solutions that are personal and those that are, in a favorite term of the public-health profession, population-based. Sometimes the balance of practice tips toward individual self-determination, as in the widespread closing in the late 20th century of contagious-disease hospitals. Sometimes it tips the other way, toward quarantines, isolation, compulsory vaccination, and the laborious but effective strategy of visiting patients every day to ensure that they take the medications that keep them from being a public risk. There are occasions, thankfully rare, when a sudden catastrophic event like the attacks of September 11 persuades entire communities or nations to give up a good number of their civil liberties in the pursuit of mass protection. But never completely, and never for long.

With their studies in biostatistics and epidemiology, then, people preparing for careers in public health need to learn that their discipline is as much about politics as medicine, a fact of life that is never clearer than in the imperfect compromises that are often required. Quarantines can never be enforced perfectly (imagine the difficulty of shooting to kill when the violator is a kindergartner trying to run home), but sometimes they are the best option to prevent the spread of an easily communicated disease.

Voluntary compliance is always wanted, but sometimes legal compulsion is needed, too, as in the recent forced isolation of a suspected SARS patient in New York City, who refused to delay his travel plans until made to do so. Public-health law is rich in cases seeking to determine, one case at a time, the shifting line where individual freedom must yield to community protection, and the other, even blurrier line that marks the unfair, unnecessary, or excessive use of legal powers. But no matter what the issue, whether smoking bans in restaurants or travel restrictions in epidemic areas, many people never really believe that the restrictions apply to them, personally.

Before a society can change behavior or adopt more-healthful practices, members of the general population have to be convinced of four things that are all quite difficult to accept: that a problem exists, that the problem applies to them, that a solution exists, and that the solution applies to them. Individuals can, and often do, make great personal sacrifices for a larger social good, but they must be persuaded that such sacrifices are both necessary and fair. And if they can't be persuaded in significant numbers, there is a good possibility that the program is flawed.

Even those within the medical profession act differently when they become the object of externally imposed public-health initiatives. Remember the recent national mandate to vaccinate a half-million health-care workers against smallpox in order to prepare for a possible bioterror attack? The vaccinations were supposed to be completed by the end of February 2003, but a General Accounting Office report counted only 33,444 as of April 18. Hundreds of hospitals have declined to put the program into effect, contending that the risks, and the costs, outweigh the benefits. As institutions and as individuals, "first-line responders" have crossed the divide from providers to consumers, moved by the skepticism that is often intrinsic to the public view of public health.

Thirty years ago, it was popular to say that the age of epidemics was over. The emergence of HIV, the unexpected appearance in humans of a lethal strain of West Nile virus, the arrival of SARS, and the resurgence of diseases, like tuberculosis, once thought to be under control, all show that such a belief is far from true. Meanwhile, a new age of bioterror seems to be just starting, in fear if not in fact.

Aided by the speed of Internet communications, the World Health Organization and the Centers for Disease Control and Prevention have made great advances in the sharing of information that is a fundamental part of stemming panic and creating public acceptance of public-health programs. But the extent of their successes won't be known until another outbreak strikes a new region. Spurred by fear of bioterrorism, hospitals and health agencies have created emergency plans, which were, in many cases, first tested in response to SARS.

But when a new health crisis appears, they will still face the same dilemmas that confronted ancient Athens in the time of the plague, Mexico City during smallpox epidemics of the 18th century, and New York in the polio epidemic of 1916. Public-health officers and other government officials might as well make these conflicts part of their planning, because they will not go away. Modern reports of "surprising" delays and "unexpected" resistance in putting public-health programs into effect suggest a profound, continuing, rather wistful disregard of both the basics of human nature and the dynamics of community life.

Students contemplating careers in public health explore ways to prevent hepatitis in the mentally ill, improve nutrition among immigrants, and increase access to diagnostic tests in low-income, underinsured populations. They learn to distinguish fearful but unlikely events, such as sarin attacks in the subway, from major preventable causes of current mortality, like tuberculosis, heart disease, diarrheal diseases, AIDS, traffic accidents, gunfire, and tobacco. They are trained to keep the statistical records that are an essential part of charting the health of a vast and disparate population, and to do it all in an ethical manner.

Among their other skills, they must also learn to recognize the inevitable tension between personal preference and social needs, and they must find ways to address the stubborn fact that the public perspective oscillates between indifference and panic, with little time spent in the middle ground, where the health of society is usually shaped.

In 2003, in every part of the world, directly or indirectly, ordinary people have been affected by the dangers of SARS and by the changing array of restrictions and recommendations inspired by the epidemic. Many have complied. Some have not. All have been nervous and unsure of the wisdom of their leaders, with doubts increasing in direct proportion to a person's closeness to unfolding events. Once again, when confronted by strictures imposed by public-health agencies on their own lives, a significant number of people have viewed those strictures as coercive, biased, dangerous, and otherwise not at all in their own personal best interests. We should not be surprised.