Los Angeles Times, November 4, 2001.
System Primed to Fail
WASHINGTON -- When the anthrax scare
began a few weeks ago, the U.S. public health system was as ill-prepared for
bioterror as our armed forces were for war when the Japanese struck Pearl
Harbor. Within weeks of the attacks on the World Trade Center and the Pentagon,
our airmen and Special Forces delivered a blow to the leadership of Osama bin
Laden's terror network and its Taliban supporters. But here at home, the faltering
responses and conflicting messages of health authorities have fanned fears and
may cost lives.
Why? With hindsight, it's easy to spot mistakes. Why, for example, did health
officials not realize that powder as fine as chalk dust might leak from an
envelope? Why were postal workers not tested and treated as quickly as
congressional staffers? Why were statements about the size and hazards of the
spores so inconsistent and confusing?
These criticisms, though, obscure the
larger story -- of institutions programmed to fail. For at least a half century,
our national commitment to an effective public health system has been on the
wane. In differing but parallel ways, political liberals and conservatives have
become skeptical, even hostile, toward government's role in the health sphere.
Liberals have come to see personal choice as paramount in medical matters -- and
government constraints on individuals' health-related behavior as intrusive. In
the 1960s and 1970s, activists and scholars targeted doctors' paternalism toward
patients and remade the law of health-care provision to protect patient
autonomy. Public tracking of community-wide disease troubled civil libertarians,
who feared invasions of personal privacy and stigmatization of disadvantaged groups.
Conservatives, meanwhile, have opposed most public financing and provision of
medical services. They have cast health care as a matter of consumer choice and
pushed public policy toward deference to the medical marketplace. Conservatives
have taken a similar view of disease prevention, treating it as a personal
matter, not a public responsibility.
The unsurprising result has been an absence of political support for strong
public health programs and institutions. Instead, we have the public health
system we've "wanted" -- ill-funded, fragmented, highly respectful of personal
choice and unprepared for a nationally coordinated response to crisis.
Public Health in the Past
It wasn't always this way.
Public-health authorities in the 18th, 19th and early-20th centuries acted
decisively, on a grand scale, against population-wide health threats, including
frightening epidemics. Before the Civil War, health officers helped to plan
towns and cities with an eye toward controlling infectious disease by securing
clean water and food. Public-health authorities drained swamps to contain
mosquito-borne illnesses, and they organized the safe disposal of animal and human waste.
Americans saw these activities as vital to their security, no less so than
military force or police and fire protection. Taxpayers supported the needed
spending. Lawmakers empowered local health authorities to move robustly when
contagion threatened. Destruction of buildings, killing of infected animals and
even restraints on the movement of infected people were provided for by law and
widely accepted by citizens.
Because the hazards of contagion crossed class and racial lines, public health
measures that aided the worst-off won support from the well-off.
Mosquito-infested swamps, sick farm animals and airborne infections threatened
everyone, though the poor often lived in areas at highest risk.
The Industrial Revolution of the late-19th and early-20th centuries brought new
health dangers, from the building of factories in densely populated areas and
the crowding of poor people into slums. Filth and squalor spread disease, and
government responded. Physicians and sanitary engineers made regulatory
decisions concerning location of factories, control of poisonous substances and
other city planning matters. In proportion to other public expenditures, public
health budgets were much larger than they are today.
The U.S. commitment to public health -- and its regulatory powers -- as vital to
the pursuit of the common good persisted through two world wars. Campaigning for
the presidency in 1932, Gov. Franklin D. Roosevelt reaffirmed this commitment,
proclaiming, "Nothing can be more important to a state than its public health;
the state's paramount concern should be the health of its people."
But after World War II, American public health fell victim to its own success.
Thanks to city-planning and sanitation campaigns of the early-20th century and
the antibiotic revolution of the 1940s, fear of infectious disease waned. The
conquest of polio through vaccination in the 1950s delivered the coup de grace
for public health's middle-class constituency.
Although sexually transmitted diseases, tuberculosis and other infectious
illnesses by no means disappeared and continued to disproportionately afflict
the nation's poor, many in the middle and upper classes believed mankind's
age-old struggle against contagion had ended in triumph. In 1969, the U.S.
surgeon general told Congress as much, concluding that the nation could "close
the books on infectious diseases."
No longer frightened by contagion, middle-class Americans increasingly saw
health as a private matter, looking to high-tech medicine for the next great
advances. Federal spending on medical research surged as state and local
public-health spending ebbed.
As the perceived need for robust
public-health measures diminished, concern about violation of personal autonomy
in the health sphere soared. Revelations of Nazi medical atrocities and reports
that American clinical researchers exposed unwitting subjects to radiation and
other life-threatening hazards inspired a profound shift in medical ethics, toward
patient autonomy as the central principle. The civil rights revolution of the
1960s and 1970s quickened this transformation.
Then came the AIDS epidemic in the 1980s and 1990s. AIDS activists battled
successfully for public-policy responses that intruded minimally on personal
autonomy and privacy. The AIDS paradigm for coping with a public health crisis
treated government as more of a threat than a solution. This civil-libertarian
response to AIDS was of a piece with the individualism and the cult of the
entrepreneur that have flourished in American culture for the past 20 years.
So, what remains in the public health sphere is a profoundly flawed system,
chronically starved of funds, without political support and founded on
antiquated laws. These laws actually thwart decisive public-health action. They
prohibit data-sharing among public health, law enforcement and emergency-management
agencies, and they do not provide adequate powers for controlling
property and persons in the event of bioterrorism.
In an era of intercontinental travel, the U.S. is vulnerable to epidemics of
potentially massive proportions. Think about the resurgence of multidrug-resistant
tuberculosis, AIDS and West Nile virus. Or think about the prospect of
natural or intentional spread of smallpox or Ebola, both highly contagious and
untreatable. These naturally occurring and terrorist-created threats could
produce mass civilian casualties, straining the public health system far beyond
the current anthrax threat.
There is an urgent need for new federal and state laws to mobilize the needed
resources and to permit, indeed require, information-sharing and other
cooperation among public health, law enforcement and emergency-management
agencies. Our medical technology -- powerful antibiotics, vaccines and the
science base necessary to develop myriad new biological security measures -- is
sufficient to cope with the threats we face. The challenge ahead is a matter of
organization and resources -- and a willingness to weigh the virtues of personal
autonomy against the larger backdrop of the common good.