Source: Los Angeles Times, May 4, 2003
Researchers who aid terror regimes should face the possibility of war crimes trials
By M. Gregg Bloche
M. Gregg Bloche is a professor of law at Georgetown University, an adjunct professor at Johns Hopkins University's Bloomberg School of Public Health, and a member of the Committee on Scientific Freedom and Responsibility of the American Association for the Advancement of Science.
WASHINGTON — Six hundred years ago, invading Tatars intent on controlling Silk Road trade attacked the Black Sea port of Kaffa in unconventional fashion: They catapulted dead human bodies, victims of bubonic plague, over the town's walls. Residents of Kaffa came down with the disease. Several townspeople fled by sea, and the Mediterranean cities that accepted them suffered devastating plague outbreaks. Some speculate that the Black Death, which killed nearly one-third of Europe's population, was the product of these outbreaks — and, perhaps, a product of the Tatars' biological attack.
At least since Roman times, invading armies have launched dead animals over city walls or dumped them into water supplies to spread disease. Medieval warlords lofted anthrax-infected beasts into the town of Les Baux-de-Provence, in France's Rhone Valley wine country. In 1763, the British general Sir Jeffrey Amherst approved of the idea of using smallpox-infected blankets to "reduce" the numbers of American Indians. Long before naturalists came up with the idea that germs cause disease, people figured out how to spread illnesses intentionally, with devastating effect.
History indicates that researchers are seldom reluctant to use what they know to make warfare more lethal. Yet one of Saddam Hussein's leading scientists claimed a week ago that he came up with a way to transform liquid anthrax into a more potent and durable powder but didn't try it. "I kept the method secret," Nissar Hindawi told the New York Times. "History would have cursed me." Hindawi, who admits lying to U.N. inspectors to cover up his bioweapons work, says his decision prevented Iraq from producing powdered anthrax, the form used to spread terror through the U.S. mail a year and a half ago. But skeptical U.N. experts say that other Iraqi scientists knew how to do so and that Iraq imported ovens to dry liquefied anthrax spores.
We may soon know more. The arrests of several of Hussein's scientists and the manhunt for more could resolve long-standing controversies over Iraq's rogue-weapons capabilities. These arrests also raise the question of researchers' culpability for the destructive forces science can create.
No scientist has ever been convicted of a war crime because his or her research made possible the production of a terrible weapon. Concerns about scientific freedom have stood in the way. Since the Nazi war crimes trials, it has been unlawful to knowingly collude in the production of weapons for forbidden use. German industrialists with technical training were tried and convicted for manufacturing Zyklon B, the gas that killed millions in Adolf Hitler's death camps. But the research that created Zyklon B did not lead to criminal convictions.
Nazi doctors who experimented with lethal agents and techniques faced judgment at Nuremberg for their treatment of the people they used as guinea pigs. Yet, neither transnational law nor the ethics that govern science speaks to the question of researchers' accountability for others' illicit uses of their work. The ongoing roundup of Hussein's scientists presents an opportunity to create worldwide norms of legal and ethical responsibility. It is urgent that we do so. Science in rogue states poses unprecedented dangers, and scientists, by saying "no," are in the best position to avert them.
For the research community, such accountability is an awkward matter. Scientists celebrate the freedom to pursue truth without regard for social consequences. In the words of Robert Oppenheimer, the physicist who led America's crash program to build an atomic bomb: "If you are a scientist, you believe that it is good to find out how the world works; that it is good to find out what the realities are; that it is good to turn over to mankind at large the greatest possible power to control the world and to deal with it according to its lights and values." Oppenheimer justified his efforts on these grounds, figuring that political judgments about the bomb's use were for others to make — according to their "lights and values." When, in the 1950s, Albert Einstein and other scientists called for the abolition of atomic weapons, they stopped short of urging a research ethic of restraint when science has potentially devastating applications.
We can no longer afford a scientific ethic that disregards the social consequences of research. Some of the consequences are just too scary. The Manhattan Project mobilized people and resources on a massive scale in a constitutional democracy. Even amid wartime secrecy, accountability and restraint were built in. Scientists could be confident that civilian leaders were attuned to democratic values (and electoral pressures) and would use destructive technologies accordingly.
Not only do rogue regimes utterly lack constitutional mechanisms of restraint; scientific means of mass slaughter can now be devised with modest resources behind the proverbial garage door, out of reach of political accountability. They can be stored and moved stealthily, without international detection, as Hussein's regime has shown. A few microbiologists who follow clandestine orders — and defer to the "lights and values" of thugs who rule failed states — take chances with millions of lives.
Bush administration officials concerned about rogue uses of science have focused on the fear that cutting-edge knowledge from U.S. labs could find its way to caves near Kandahar or bunkers beneath Baghdad. Accordingly, they have tried to dam the flow of scientific information by making it harder for foreign graduate students to get visas, asking scientific journal editors not to publish "sensitive" findings and barring foreign researchers from scientific meetings.
These intrusions on academic freedom make little sense. They target the intellectual exchange that has powered U.S. science to preeminence. Most of the graduate students in physical-science disciplines that drive our high-tech economy come to America from abroad. Many stay. They take jobs in corporate and university labs, and they contribute to our prosperity.
The real threats from rogue science are in technologies far from the cutting edge. Widely available materials and methods, the chemical or biological equivalent of crashing jetliners into buildings, enable individuals with basic scientific training to concoct recipes for mass murder. The notion that tunnels and bunkers in failed states shelter world-class science is the stuff of James Bond fantasy. Restrictions on international scientific exchange won't affect what happens in these tunnels and bunkers.
What will? Since we can't count on rogue despots' "lights and values," or on the ability of U.S. intelligence to monitor every hideaway, we should look for ways to persuade potential rogue-science perpetrators to act responsibly. We can do this without constraining scientific freedom.
We should start by holding scientists criminally responsible for rogue recipes — and for work done with intent to produce them. International law criminalizes conduct that abets violation of the rules of war if the abettors act purposefully or knowingly. Extending this accountability to research that yields banned weapons would send a strong message to the scientific community.
Criminal convictions of scientists on these grounds would be difficult to obtain, even when evidence proves that research resulted in banned weapons. Defendants can claim that their purpose was to prepare countermeasures against illegal weapons, or that they didn't know their findings and methods would be misused.
But even if no scientists were convicted, much would be gained by accountability in principle. Researchers would be on notice that freedom of inquiry doesn't cover work meant to create illicit destructive power. In time, this duty to desist could take root as an ethical norm. There is precedent for such progress. In 1947, the Nuremberg tribunal that convicted Nazi doctors for their gruesome experiments held that research on people could not proceed without the subjects' informed consent. Until then, medical researchers didn't typically treat consent as an ethical requirement. By the 1970s, informed consent had become ethically routine.
For scientists, saying "no" to rogue purposes needs to become a matter of moral urgency. Criminal accountability is a starting point, but the duty to desist must develop into a shared professional commitment. No transnational legal regime can substitute for the spreading belief that, in Hindawi's words, "History would have cursed me."