What follows is the continuation, in serial form, of a central chapter from my book A Primer in the Art of Deception: The Cult of Nuclearists, Uranium Weapons and Fraudulent Science.
SCAM NUMBER THIRTY-SIX: Violate people’s innate process of evaluating and assuming risk in their daily lives by imposing highly risky technology upon them and then falsely understating the risks that accompany it.
In 1953, President Eisenhower initiated the Atoms for Peace Program. A major unstated goal of this initiative was to assuage the terror that had settled into the hearts of a large segment of the population as a result of the development of nuclear weapons. These monstrosities did tremendous violence to people’s sense of personal security and trust in the uninterrupted continuity of life. The possibility of instantaneous demise by forces beyond one’s control was constantly on people’s minds. Images of brutal victimization were deeply disturbing to the psychological equilibrium of many. Responding to this unease, the government crafted a propaganda campaign to transform the menacing atom into the beneficent atom. This well-orchestrated crusade was intended to pave the way for the public’s embrace of nuclear power. However, despite the best efforts of government PR, the majority of the population remained wary of the new technology. In people’s minds, nuclear power was incestuously intertwined with nuclear weapons, and the possibility of radiation-induced disease, regardless of how remote, was terrifying. This mindset, an obstacle to the plans of the empowered, became an area of academic interest and was studied by experts in the field of risk analysis.
According to the website of Argonne National Laboratory, risk analysis can be defined as follows:
“The systematic study of uncertainties and potential harm that may be encountered in such areas as the environment, business, engineering, and public policy. Risk denotes a potential negative impact to an asset or some characteristic of value that may arise from some process or future event. Risk analysis seeks to (1) identify the probability of loss, or risk, faced by an institution or business unit; (2) understand how and when risks arise; and (3) estimate the impact of adverse outcomes. Once evaluated, risks can be managed by implementing actions to mitigate or control them.”
Basically, risk analysis is a study of systems. After defining the successful operation of a system, efforts are made to identify the factors that might disrupt the operation of the system, the consequences of this disruption and ways to minimize the likelihood that such disruptions might occur. An adjunct to this study, one relevant to the nascent nuclear industry, was the study of the likelihood of events that would lead to the release of radiation into the environment and what risks such releases might pose to the health of the surrounding population. When the infinitesimally small projections of risk predicted by the nuclear industry failed to quell opposition to nuclear power, social psychologists began investigating risk assessment: how do human beings evaluate risks in their daily lives, and how do they prioritize which risks they are more or less willing to expose themselves to in exchange for the benefits derived?
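Stripped to its arithmetic, the quantitative core of such an analysis is nothing more than multiplying the likelihood of each disruptive event by the magnitude of its consequences and ranking the results. The sketch below, written in Python with invented event names and figures, is offered only to make that skeleton visible; it does not reproduce any actual industry assessment.

```python
# A minimal sketch of the likelihood-times-consequence arithmetic at the
# core of quantitative risk analysis.  Every event name and figure below
# is a hypothetical placeholder, not data from any real assessment.

hazards = {
    # event: (annual probability of occurrence, estimated cost of consequences in dollars)
    "coolant pump failure": (1e-2, 5e6),
    "operator error":       (5e-3, 2e7),
    "containment breach":   (1e-6, 5e10),
}

def expected_annual_loss(probability, consequence):
    """The classic risk metric: likelihood multiplied by consequence."""
    return probability * consequence

ranked = sorted(
    ((event, expected_annual_loss(p, c)) for event, (p, c) in hazards.items()),
    key=lambda item: item[1],
    reverse=True,
)

for event, loss in ranked:
    print(f"{event:22s} expected annual loss: ${loss:,.0f}")
```

Everything contentious in risk analysis, namely whose probabilities are used, whose consequences are counted, and who bears them, happens before any numbers ever reach a calculation like this one.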
This field of inquiry delivered novel insights into human behavior. It revealed that all modern human beings share similar criteria for evaluating potential hazards. These patterns of thought act as filters that color people’s perception as to what risks are acceptable or unacceptable. The accompanying table summarizes these perception factors.
Risk Perception
Acceptable | Unacceptable
Voluntary | Involuntary
Individual Control | Others Control
Clear Benefits | Unclear Benefits
Trustworthy Sources | Untrustworthy Sources
Ethically Neutral | Ethically Objectionable
Natural | Artificial
Familiar | Exotic
No Historical Associations | Memorable Associations
Less Dread | High Dread
Visible | Undetectable
Immediate Effect | Delayed Effect
Known, Understood | Uncertainty, Variability
Little Media Attention | High Media Attention
Copyright 2002 Phil Rutherford. www.philrutherford.com
To illustrate the utility of this material, a simple example will suffice. If asked, a majority of people will say that they feel a lot safer driving their car than flying in an airplane. They cling to this belief even after being apprised of the fact that driving is statistically much more hazardous than flying, and the likelihood of being in a fatal accident on the highway greatly exceeds that of being in one while airborne. That people feel safer in the more dangerous situation is not so enigmatic when the thinking behind such an assessment is understood. People know that driving a car is a risky venture. But while driving, they feel in control of their vehicle and are in a familiar situation. They are on terra firma rather than up above the clouds. This not only produces less dread, but it seems to offer more options in the event that the situation turns unpredictable. Further, car accidents are rather humdrum whereas fiery air crashes make front-page news. And bumping into another car and being forced off the road is, on reflection, less fear-inducing than being trapped in a terrifying descent that will surely lead to a fiery crash that burns hundreds of bodies beyond recognition.
The vehement opposition to nuclear power is no mystery once the mechanisms of risk perception are understood. Nuclear power has been imposed on populations throughout the world without recourse to referenda. Due to the complex nature of the technology, others are in control of it, and this increases feelings of vulnerability. The benefits are unclear, given that other methods of generating electricity are available, and the safe disposal of radioactive waste is an unsolvable dilemma. Originating from government and big business, and with a long history of cover-ups, nuclear power is perceived as coming from an untrustworthy source. Being intertwined with the production of nuclear and radiological weapons, the technology is perceived as ethically objectionable. Being extremely high-tech, it is viewed as artificial and exotic. It has memorable associations with Three Mile Island and Chernobyl. The idea of a catastrophic nuclear accident instills extreme dread. Released radiation is undetectable and health effects from exposure are delayed, uncertain, and variable. And reportage in the media of accidents, shutdowns, protests, cost overruns and so forth has furthered people’s suspicions of the technology. In a nutshell, the majority of people, due to their inborn psychological processes that come into play when assessing risks, perceive nuclear power as hazardous and unacceptable.
The nuclear industry found itself behind the eight ball as the process of risk assessment began to be delineated. In response, the industry devised a brilliant gambit to woo public opinion and make nuclear power appear less risky. The PR strategy it adopted went something like this: The study of risk perception provides concrete evidence that people can be irrational when assessing risks in their daily lives. When assessing alternatives, they give themselves over to emotion and make choices based on fear that are not in their best interest. When a number of people do this simultaneously, they develop unwise social policy that is not for the common good. To prove this, let’s apply statistical analysis to the host of risks people confront daily. When we do this, we discover that nuclear power is less risky than countless activities that people voluntarily engage in without hesitation.
This appeal to rationality over what is painted as spontaneous, unreflective prejudice is a highly seductive argument. And it was supported with a number of interesting, and sometimes humorous, observations. For instance, Mettler and Moseley in their book Medical Effects of Ionizing Radiation [1] provide information on how various conditions are statistically associated with lifespan shortening. A male who remains unmarried, for example, can expect his life to be shortened, on average, by 3,500 days. A male cigarette smoker will lose approximately 2,250 days from his life. Being 30 percent overweight will reduce lifespan by 1,300 days. Having cancer robs its victims, on average, of 980 days, and a stroke diminishes lifespan by 700 days. Compared with these life-shortening factors, radiation appears downright innocuous. Natural background radiation, according to BEIR 1972, shortens life by eight days, medical x-rays by six days, and reactor accidents by between 0.02 and 2.0 days.
Following a different track, Medical Effects of Ionizing Radiation [1] lists activities encountered in daily life that each increase the chance of death by one in a million. These include the following:
Activity | Cause of Death
Smoking 1 cigarette | Cancer, heart disease
Drinking 1/2 liter of wine | Cirrhosis of the liver
Living 2 days in New York or Boston | Air Pollution
Rock climbing for 1 1/2 minutes | Accident
Traveling 6 minutes in a canoe | Accident
Traveling 10 miles by bicycle | Accident
Traveling 30-60 miles by car | Accident
Flying 1,000 miles by jet | Accident
Flying 6,000 miles by jet | Cancer caused by cosmic radiation
Living 2 months in Denver | Cancer caused by cosmic radiation
Being a man age 60 for 20 minutes | Illness
Eating 40 tsp of peanut butter | Liver cancer caused by aflatoxin B
Eating 100 charcoal-broiled steaks | Cancer from benzopyrene
Living 5 years at the site boundary of a typical nuclear power plant in the open | Cancer caused by radiation
Living 150 years within 20 miles of a nuclear power plant | Cancer caused by radiation
Risk of accident by living within 5 miles of a nuclear reactor for 50 years | Cancer caused by radiation
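The rhetorical force of the table lies in its common unit: every row is calibrated to the same one-in-a-million increment of added risk, so any two rows can be equated at a glance. The toy calculation below assumes only that stated calibration, plus the tacit premise that such tiny risks simply add, and shows the sort of equivalence readers are being invited to draw.

```python
# Each activity in the table above is stated to add the same increment of
# risk: a 1-in-1,000,000 chance of death.  Assuming, as such tables
# implicitly do, that these tiny increments add together, any two rows can
# be traded against each other.

PER_ACTIVITY_RISK = 1e-6  # one in a million, the table's common unit

def cumulative_risk(repetitions):
    """Added probability of death from repeating a one-in-a-million activity."""
    return repetitions * PER_ACTIVITY_RISK

# One row of the table: e.g., 5 years at a plant's site boundary,
# or one cigarette, or 40 tsp of peanut butter.
print(cumulative_risk(1))          # 1e-06

# The same arithmetic applied to a habit rather than a single act:
print(cumulative_risk(365))        # 3.65e-04, one cigarette a day for a year
print(cumulative_risk(20 * 365))   # 7.3e-03, one cigarette a day for 20 years
```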
This data is intended by the nuclear establishment to awaken people to their inherent foolishness, which by implication is the basis for their resistance to nuclear power. The peanut butter eaters and the bicycle riders are silly to object to nuclear power. In the course of their daily lives, they choose activities that carry similar or greater risks than those posed by nuclear power. The routine pleasures of life put people in jeopardy. If they saw clearly, they could not possibly find fault with nuclear power.
This line of reasoning leads to a disturbing conclusion: basically, people are dummies. They don’t understand themselves, and they don’t understand the world they live in. They must be rescued from their follies by science and rationality. An elite body of enlightened policy makers must arise to lead humanity out of ignorance to a new golden age. This perspective is typified in an article that appeared in the Washington Post entitled “Let’s Get Real About Risk” [2]. It was written by David Ropeik, director of risk communication at the Harvard Center for Risk Analysis. Although not mentioning nuclear power directly, it could easily be used in its defense. The author begins by illustrating how much human effort is misdirected:
“Hundreds of thousands of Americans will die this year, deaths that can be prevented. Millions will get sick with preventable illnesses. Billions of dollars and countless hours of human effort will be wasted unnecessarily — all because we are afraid of the wrong things.
In a frenzy of fear we are pouring millions this summer into protecting ourselves from the West Nile virus, and spending only a fraction of that sum on public education encouraging people to wash their hands, which would eliminate far more disease transmission than killing every mosquito in America.
Public and private spending on the cleanup of hazardous waste in America is estimated at $30 billion a year. Hazardous waste is a real problem, but the number of people whose health is at risk because of it is actually quite low. Compare that $30 billion with only $500 million a year on programs to reduce smoking, one of the leading preventable causes of death in America.”
After illustrating the folly in current decision-making, Ropeik diagnoses its cause: fear.
“We could make decisions that are more rational and informed. In many areas, science can identify the physical hazards, tell us how many people are likely to be affected by each one, what various mitigations will cost and how effective we can expect them to be. We can rank risks and remedies and put things in perspective. But we don't. Instead, we make policy based more on fear than fact.
Let's be blunt. This irrational response kills people. In a world of finite resources, we can only protect ourselves from so many things. If we overspend on risks such as pesticides or asbestos, which are real but of relatively low magnitude, we have less to spend on greater threats such as bacterial food poisoning or fossil fuel emissions. As a result, thousands of the people exposed to those higher risks will die.
The usual suspects blamed for bad policy are politics, greed, the media, even the open, manipulatable nature of democracy itself. True, these are all factors in a process that often becomes a battle between competing private agendas rather than an informed search for policies that will serve the greatest common good. But the principal underlying cause of wasteful choices that seek protection from the wrong bogeymen is fear.”
Ropeik then identifies how irrational fear, when embraced by large groups of people, can lead to adoption of irrational public policy:
“But society, with limited resources, must be more rational than that. When individual fears become group fears, and when those groups, organized or not, become big enough or visible enough to put pressure on the government to provide protection from less dangerous threats, we can end up with policies that leave a lot of people in the way of harm from higher risks that we're doing less about.”
For the greater good, the solution to this dilemma is deference to the wisdom of a body of independent experts for the rational assessment of societal risks. Ropeik proposes the creation of a nongovernmental agency to “provide us with credible, trustworthy guidance on risks.” His Risk Analysis Institute would rank hazards according to their likelihood and consequences, and oversee cost-benefit analysis to outline possible solutions and maximize resources to protect the greatest number of people. To assure the objectivity of the institute in promoting rational policymaking, the utopian ideal is presented that funding would have no strings attached, and that the “scientific work would have to be carried out by professionals who are chosen for their education and training, their expertise and reputations for integrity, neutrality and open-mindedness, not for who their political friends are.”
Without question, there is tremendous merit in the idea of injecting rationality and objectivity into the process of risk assessment in order to create effective social policies. However, in the hands of an empowered clique such as the Cult of Nuclearists, risk analysis has been transformed, yet again, into a mesmerizing display of smoke and mirrors. As such, it has become a tool to confound the better judgment of people and do violence to their deep-seated impulse to arrange their lives for the maximum degree of safety, security, and tranquility. The attempt to manipulate the perception of the risks posed by nuclear power is readily understood within the context of how this technology initially evolved. Nuclear weapons were imposed on society by government without any form of democratic debate. As the implications of Hiroshima and Nagasaki burrowed deeply into the collective consciousness, people responded appropriately to these weapons from their inborn processes of risk assessment and, by all criteria, judged them to be unacceptable. However, they lacked the political strength to demand limits to the technology or the foresight to realize that, left to its own devices, the Cult of Nuclearists would assemble before everyone’s eyes the arsenal of Armageddon. For a large sector of humanity, the normal process of managing risk was forever upset. They were victimized and traumatized by this reconfiguration of their familiar landscape. Impotent to change this external menace, people were forced to modify their psychology. They had to integrate into their lives increased feelings of dread and insecurity, fear for the future welfare of their children, anxiety about the precarious fragility of all that made life worth living. For the first time in history, people had to face the horrible possibility that the continuity of life into the future might be irrevocably interrupted. When nuclear power appeared on the landscape, these same feelings became associated with the threat of the accidental release of radiation. By the way people normally go about assessing risk, this attitude was not unjustified.
The crux of the problem of nuclear weapons and nuclear reactors is that the Cult of Nuclearists has always prized these technologies above the psychological well-being of the people of the Earth. They introduced a technology that by all measures was inappropriate to human happiness and safety and remained unmoved by the average person’s response to this technology, i.e., that it was unacceptably risky. Rather than respect this instinctive judgment and work to create a new world order more friendly to the inhabitants of the Earth, the Cult of Nuclearists advanced its own agenda. To this end, they fabricated elaborate deceptions to beguile people’s natural inclinations. This was the motivating impulse for much of the lying and deceit revealed within these pages. When the process of human risk perception began to be clarified, proponents of nuclear weapons and reactors manufactured a strategic response for the purpose of demonstrating how misguided human beings can be when relying upon their native instincts for assessing risk. A new social-psychological paradigm was promoted, centered upon the idea that humans are essentially irrational when assessing certain types of risk. To save them from their folly and guide them to seeing the world in its “true” light, social scientists needed to present the risks of daily life in the cold logic of statistical analysis. By this method, humanity could be freed from its “irrational” fears, limited funds could be apportioned more wisely for addressing “real” hazards, more lives could be saved and the greatest good could be achieved for the greatest number.
At the risk of offending the reader, there is no word in the English language that comes close to characterizing this line of reasoning other than “mind-fuck.” It is a cheap trick designed to belittle and invalidate humanity’s collective perception of the nuclear hazard. Rather than admit to the inappropriateness of their technology, inappropriate to the pervasive human desire for safety, security, and a sense of well-being, the Cult of Nuclearists is attempting to beguile humans into accepting that they, the people themselves, are inappropriate to the technology. According to their argument, human nature as it applies to risk assessment is imprisoning the species in fear and shortsightedness, thus holding society back from progress. People are repelled by nuclear technology only because they don’t see the world aright. The cure for this pervasive nuclear phobia is reeducation by the enlightened perspective of “objectivity.” Once this is accomplished, people will awaken to the realization that nuclear power presents no greater risk to their welfare than a short trip in a canoe or a brief ride on a bicycle.
This argument is hogwash. It is based on the fallacy that the perspective of the social scientist and that of the risk-taker are freely interchangeable. Social scientists use statistical analysis as one window on life in their attempt to discern patterns in human behavior. They objectify life in order to study it. They abstract from all the nuances that are involved in individuals formulating preferences of one course of action over another in order to draw certain generalizations about population dynamics. The perspective of the risk-taker, the one who is at risk, is entirely different. For this person, the assessment of risk is a multifaceted process which takes into account past history, knowledge of the world, expectations, preferences, aspirations, intuitions, physical sensations, appetites, emotions, desires, and so forth. If making choices were an entirely rational process and if knowledge of statistics were sufficient to alter behavior, no one would smoke, no one would be overweight and everyone would wear a seat belt. Obviously, this is not the case.
For the moment, let’s assume that the foregoing statistics are accurate and that there is no difference in the risk to life-shortening between eating 40 teaspoons of peanut butter and living for five years at the perimeter of a nuclear power plant. Knowledge of this fact is not sufficient to change most people’s attitude or behavior. Peanut-butter eaters will continue to eat peanut butter with abandon and real estate values around nuclear power plants will remain in a slump. Why? Because personal risk assessment involves more than simply selecting the objectively safest alternative. It involves the very subjective process of creating within oneself a sense of security and safety. All of us are gamblers in the casino of life. We are constantly exposed to a vast matrix of risks, any one of which could ruin or end our lives. To manage this, we push many risks out of our awareness. Others that are more within our control, we may choose to address so as to reduce the hazard they may produce in our life. We choose to better our odds of avoiding certain types of catastrophe by electing to wear seat belts, stop smoking, or go on a diet. However, these efforts offer no complete assurance that we will not die in a car accident, contract lung cancer, or suffer a heart attack. Constant vulnerability to chance and the unexpected is the reality of life. The psychological cushion to this state of precariousness is the sense of security derived from one’s personal process of risk assessment and management regardless of how accurate it may be objectively.
If statistical knowledge of relative risk lacks the power to supplant most people’s inborn processes of risk assessment, what alternative remains for those intent on creating social policy at odds with the public’s perceptions? The only option is to circumvent these perceptions by ignoring and overriding them. This is the ultimate purpose of the proposed Risk Analysis Institute. “Experts” are to be enlisted to apply their “superior” wisdom and purported “objectivity” to contravene what is characterized as the passions and ignorance of the masses. Sidestepping the annoying pitfalls of having to deal with public opinion, these experts will work directly with policy makers and lawmakers to create a society reflecting their own values and interests. Undisguised, this is social engineering of a new world order by an elite class not accountable to the people who will have to live under the social policies imposed upon them without consensus. The Risk Analysis Institute is a utopian ideal fraught with peril for humanity. This is most clearly illustrated by a living embodiment of such an organization, the ICRP. To all external appearances, this body of experts provides lawmakers throughout the world with objective information on radiation risk. But as we have revealed, behind their facade of purported objectivity, this organization is a whore of the Cult of Nuclearists, bolstering and legitimizing its misdeeds while ruining the health of untold numbers of victims.
The safety of commercial nuclear power plants is a subject besmeared with obfuscation. Consequently, a legitimate avenue of investigation is to question how the statistics of relative risk listed above were derived. If they are based on the “presumed” dosages to the population from the “assumed” levels of radioactive effluents routinely vented into the environment, and if the risk factors of the ICRP are then applied to these dosages, it should be apparent by this point in the discourse that the hazard to health will be greatly understated. As we shall explore in the following chapter, if the statistics of risk are based on the casualty data recorded in the US Radiation Accident Registry, the conclusions reached as to the hazards of nuclear power plants will be nothing less than a mockery to intelligence. Not everyone in the population is equally at risk from discharges of radioactive pollutants from nuclear installations. If the total amount of radiation released into the environment is treated as if it were distributed to the entire population, the presumed risk is vastly underrated. It would be more accurate to examine the risk incurred by those individuals living immediately downwind of nuclear power plants. As will be revealed in Exhibit F, this type of investigation will reveal elevated risks of breast cancer to people living downwind compared to those living upwind of these facilities. Inclusion of this data in comparisons of relative risks would forever tarnish the myth of the harmlessness of nuclear power.
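The dilution problem described above can be made concrete with a toy calculation. All of the figures below are invented for illustration; the only real-world element is the ICRP-style convention of a nominal fatal-cancer risk coefficient on the order of five percent per sievert.

```python
# Toy illustration of how averaging a fixed release over a whole regional
# population understates the risk to the people actually in the plume.
# All numbers are invented for illustration.

collective_dose = 10.0        # person-sieverts attributed to a hypothetical release
risk_per_sievert = 0.05       # nominal ICRP-style fatal-cancer coefficient (per Sv)

# Case 1: the dose is spread, on paper, across an entire region.
regional_population = 10_000_000
averaged_dose = collective_dose / regional_population
print(f"averaged individual risk: {averaged_dose * risk_per_sievert:.1e}")   # 5.0e-08

# Case 2: the same collective dose is assigned to the downwind community
# that actually received the plume.
downwind_population = 5_000
downwind_dose = collective_dose / downwind_population
print(f"downwind individual risk: {downwind_dose * risk_per_sievert:.1e}")   # 1.0e-04

# The collective dose is identical in both cases; the risk attributed to an
# actually exposed individual differs by a factor of 2,000.
```

Whether the published comparisons of relative risk reflect the first calculation or the second is precisely the question raised above.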
Statistics can be easily manipulated to create this or that false impression. For instance, statistical analysis may be used to demonstrate the low risk of life-shortening posed by nuclear weapons, the improbability of another Chernobyl-type accident, or the minuscule hazard posed by the planet’s accumulated radioactive waste. But such fine number-crunching would be brought to naught by the single improbable occurrence of a nuclear war that wiped out 90 percent of the population of the Earth. The safety record of commercial nuclear reactors can be touted ad infinitum, but the low-probability event of a simultaneous breach of containment and loss of coolant to a reactor core would contaminate the entire population of a large metropolis. Stored nuclear waste has yet to cause catastrophic loss of life, but the safety record of today may fail to account for hazards facing an unsuspecting humanity thousands of years in the future. Mathematical probabilities may predict that nuclear accidents are far-fetched and unlikely, but far-fetched and unlikely things happen all the time. The question is not how improbable the risk, but whether or not we can afford to have such a risk in our midst at all. Is the technology worth the risk of the mass casualties that seem so implausible? Rather than go to the statistical tables for answers, we should travel to Belarus and ask people there if another Chernobyl is worth the risk. We should travel to Hiroshima and ask the survivors what they think of America’s defense policy. We should ask sick veterans returning from Iraq whether they think the ICRP’s risk factors for inhaled uranium are accurate.
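The limitation of this kind of number-crunching is visible in two lines of arithmetic. With invented figures, a catastrophe assigned a one-in-a-million annual probability and a toll of millions “averages out” to a handful of statistical deaths per year, exactly the kind of figure that comparative risk tables render benign, while saying nothing about whether society could absorb the single event itself.

```python
# Why expected-value arithmetic can make a catastrophe look benign.
# Both figures below are invented for illustration.

annual_probability = 1e-6          # assumed yearly chance of a catastrophic release
deaths_if_it_happens = 5_000_000   # assumed toll of that single event

expected_deaths_per_year = annual_probability * deaths_if_it_happens
print(expected_deaths_per_year)    # 5.0 "statistical" deaths per year

# Five statistical deaths per year compares favorably with almost any
# everyday hazard, yet the average is taken over many futures in which
# nothing happens and one in which five million people die at once.
```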
Contrary to the beliefs of the pundits within the Cult of Nuclearists, the people of the Earth are not dummies. We recognize the lies for what they are. We are acutely aware that we are living on the brink of nuclear catastrophe. We have witnessed calamitous radiation accidents, and there is nothing that can convince us that these will not happen yet again. Where before we used to protect our children from learning about the birds and the bees, today the horrific secret to be kept from tender ears is that their lives can be incinerated in a microsecond by some deluded idiot. We long to live in a world where we can ride bikes and go canoeing and eat peanut butter sandwiches without being burdened by the thought of having our world ruined by radioactive contamination. We recognize that the Cult of Nuclearists and their policies are an embedded cancer in the body politic. Excising them from our midst may be treacherous because such an operation may kill the host as well. But the people do not have unlimited patience with threat, injustice and deception. Let us hope that the Cult of Nuclearists will quantify that risk as well.
Bibliography
[1] Mettler F.A., Moseley R.D. Medical Effects of Ionizing Radiation. Orlando: Grune and Stratton Inc.; 1985.
[2] Ropeik D. Let’s Get Real About Risk. Washington Post. August 6, 2000. http://www.washingtonpost.com/ac2/wp-dyn?pagename=article&contentId=A41017-2000Aug5&notFound=true