
The Ethics of Scientific Advice: Lessons from “Chernobyl”

Photograph of the Firemen's Monument at Chernobyl

The recently released HBO miniseries Chernobyl highlights several important moral issues that are worth discussing. For example, what should we think about nuclear power in the age of climate change? What can disasters tell us about government accountability and the dangers of keeping unwelcome news from the public? This article will focus on the ethical issues concerning scientists' potential to influence government policy. How should scientists advise governments, and who holds them accountable for their advice?

In the second episode, the Soviet Union begins dumping thousands of tons of sand and boron onto the burning nuclear plant at the suggestion of physicist Valery Legasov. After consulting fellow scientist Ulana Khomyuk (a fictional character who represents the many other scientists involved), Legasov tells Soviet leader Gorbachev that in order to prevent a potential disaster, drainage pools will need to be emptied from within the plant in an almost certain suicide mission. “We’re asking for your permission to kill three men,” Legasov reports to the Soviet government. It’s hard to imagine a more direct example of a scientist advising a decision with moral implications.

Policy makers often lack the expertise to make informed decisions, and this provides an opportunity for scientists to influence policy. But should scientists weigh ethical or policy considerations when offering advice?

On one side of this debate are those who argue that scientists' primary responsibility is to ensure the integrity of science. This means that scientists should maintain objectivity and should not allow their personal moral or religious convictions to influence their conclusions. It also means that the public should see science as an objective and non-political affair. In essence, science must be value-free.

This value-free side of the debate is reflected in the mini-series’ first episode. It ends with physicist Legasov getting a phone call from Soviet minister Boris Shcherbina telling him that he will be on the commission investigating the accident. When Legasov begins to suggest an evacuation, Shcherbina tells him, “You’re on this committee to answer direct questions about the function of an RBMK reactor…nothing else. Certainly not policy.”

Those who argue for value-free science often argue that scientists have no business trying to influence policy. In democratic nations this is seen as particularly important since policy makers are accountable to voters while scientists are not. If scientists are using ethical judgments to suggest courses of action, then what mechanism will ensure that those value judgments reflect the public’s values?

In order to maintain the value-free status of science, philosophers such as Ronald N. Giere argue that there is an important distinction between judging the truth of scientific hypotheses and judging the practical uses of science. A scientist can evaluate the evidence for a theory or hypothesis, but they shouldn’t evaluate whether one should rely on that theory or hypothesis to make a policy decision. For example, a scientist might tell the government how much radiation is being released and how far it will spread, but they should not advise something like an evacuation. Once the government is informed of the relevant details, the decision of how to respond should be left entirely to elected officials.

Opponents of this view, however, argue that scientists do have a moral responsibility when offering advice to policy makers, and that it is desirable for scientists to shoulder this responsibility. Philosopher Heather Douglas argues that given that scientists can be wrong, and given that acting on incorrect information can lead to morally important consequences, scientists do have a moral duty concerning the advice they offer to policy makers. Scientists are the only ones who can fully appreciate the potential implications of their work.

In the mini-series we see several examples where only the scientists fully appreciate the risks and dangers of radiation, and they are the strongest advocates for evacuation. In reality, Legasov and a number of other scientists offered advice on how to proceed with cleaning up the disaster. According to Adam Higginbotham’s Midnight in Chernobyl: The Untold Story of the World’s Greatest Nuclear Disaster, the politicians were ignorant of nuclear physics, and the scientists and technicians were too paralyzed by indecision to commit to a solution.

In the real-life disaster, the scientists involved were frequently unsure about what was actually happening. They had to estimate how fast various parts of the core might burn and whether different radioactive elements would be released into the air. Reactor specialist Konstantin Fedulenko was worried that the boron drops were having limited effect and that each drop was hurling radioactive particles into the atmosphere. Legasov disagreed and told him that it was too late to change course. Fedulenko believed it was best to let the graphite fire burn itself out, but Legasov retorted, “People won’t understand if we do nothing…We have to be seen to be doing something.” This suggests that the scientists were not simply offering technical advice but were making judgments based on additional value and policy considerations. 

Again, according to Douglas, given the possibility for error and the potential moral consequences at play, scientists should consider these consequences to determine how much evidence is enough to say that a hypothesis is true or to advise a particular course of action. 

In the mini-series, the government initially relies on monitors showing a low level of radiation to conclude that the situation is not bad enough to warrant an evacuation. However, it is pointed out that the radiation monitors being used likely had a limited maximum range, and so the radiation could be much higher than the monitors would indicate. Given that they may be wrong about the actual amount of radiation and the threat to public health, a morally responsible scientist might conclude that an evacuation should be recommended to policy makers.

While some claim that scientists shouldn’t include these considerations, others argue that they should. Certainly, the issue isn’t limited to nuclear disasters either. Cases ranging from climate change to food safety, chemical and drug trials, economic policies, and even the development of weapons, all present a wide array of potential moral consequences that might be considered when offering scientific advice. 

It’s difficult to say that a scientist shouldn’t make morally relevant consequences plain to policy makers. It often appears beneficial, and it sometimes seems unavoidable. But this liberty requires scientists to exercise judgment in determining what is and is not a morally relevant consequence. Further, if scientists rely on value judgments when advising government policy, how are they to be held accountable by the public? Given these benefits and concerns, whether we want scientists to make such judgments, and to what extent their advice should reflect those judgments, presents an important ethical dilemma for the public at large. Resolving this dilemma will at least require that we become more aware of how experts provide policy advice.