Both the great Truths and the great Falsehoods of the twentieth century lie hidden in the arcane, widely inaccessible, and seemingly mundane domain of the radiation sciences.

Thursday, October 28, 2010

What follows is the continuation, in serial form, of a central chapter from my book A Primer in the Art of Deception: The Cult of Nuclearists, Uranium Weapons and Fraudulent Science.

EXHIBIT F continued:

Under normal circumstances, thyroid cancer is a rare occurrence. After the core of the Chernobyl reactor was scattered to the winds, however, an epidemic of thyroid cancer among children and teenagers broke out in the most affected Soviet territories. For example, Stsjazhko et al. reported in 1995 on the officially validated rate of thyroid cancer in Belarus in the under-15 age group before and after the accident (2.8 million children fell within this group, out of a total population of 9.9 million). In the years 1981 to 1985, approximately 3 cases of thyroid cancer occurred per million children. In the years 1986 to 1990, the number of thyroid cancers had increased to 47 per million, 17 times the pre-accident level. Between 1991 and 1994, 286 cases per million were validated, 102 times greater than before the accident.
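As a rough arithmetic check (a sketch only; the published multiples of 17 and 102 presumably rest on an unrounded baseline rate), the ratios can be recomputed from the per-million figures quoted above:

```python
# Officially validated thyroid cancer rates per million children in Belarus,
# under-15 age group, as quoted from Stsjazhko et al. (1995).
rate_1981_1985 = 3     # approximate pre-accident baseline
rate_1986_1990 = 47
rate_1991_1994 = 286

# With the rounded baseline of 3 per million, the multiples come out near
# the published 17-fold and 102-fold figures.
print(rate_1986_1990 / rate_1981_1985)   # about 15.7
print(rate_1991_1994 / rate_1981_1985)   # about 95.3
```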

This epidemic was beyond the purview of the ICRP risk models. Ignoring the tragedy of the Marshall Islanders, the prevailing view of the cancer-causing potential of internalized iodine-131 was given clear expression in UNSCEAR 1988. The authors stated that their literature review provided “little proof that iodine-131 is carcinogenic in humans and support[ed] the notion that the carcinogenic potential of I-131 beta particles might be as low as four times less than external x-rays or gamma rays” [1]. Here, in a nutshell, is an expression of the corrupted paradigm of radiation effects: external radiation is more hazardous than internal contamination, and the risk to health is greatly diminished if the exposure is chronic rather than acute. According to the ECRR, these two errors were demolished by the high rate of thyroid cancer after Chernobyl. First, internal contamination, not external irradiation, caused the runaway epidemic. Second, chronic low-dose exposure from radionuclides in the environment was the method of delivery. Further, Chernobyl refuted the prevalent idea that a latency period of 10 years or more was required between thyroid exposure and the onset of clinical symptoms. After the Chernobyl explosion, increases in the rate of thyroid cancer became observable within a few years.

To fit the skyrocketing incidence of thyroid cancer to their incorrect models, the radiation protection agencies attempted to massage their data:

“The risk agency community, having had to swallow the facts of the increase, promptly responded by adjusting the doses to as high a level as possible to try and fit the data to the model. The idea was to assume that the children who were affected had been iodine-deficient and therefore their thyroid glands would take up more iodine. This was unsuccessful since doses large enough to fit the cancer data would be so high that the children would have died of radiation sickness” [2].

In his book Wings of Death, Chris Busby provides an excellent example of the type of shenanigans that can infiltrate the field of radiation protection. It is mentioned here because it bears on the accepted risk factor for thyroid cancer and on the reason for the inaccurate predictions made for this endpoint in the wake of Chernobyl. Both BEIR V and UNSCEAR 1988 cite a study by Lars-Erik Holm and colleagues on iodine-131-induced thyroid cancer. (The UNSCEAR document referred to the study as “important evidence.” Lost to many in the fine print was the fact that Holm was one of the authors of UNSCEAR 1988.) The development of the accepted risk factor for thyroid cancer relied heavily on this study. Holm et al. conducted research on a population of 35,000 patients who, between 1951 and 1969, had undergone diagnostic procedures involving injections of iodine-131. In determining the incidence of radioiodine-induced thyroid cancer, the authors made a scientifically questionable procedural decision: they discarded from consideration all cases of thyroid cancer diagnosed within five years of the I-131 injections. They justified this extraordinary step on the basis of the Hiroshima Life Span Study, which claimed that considerable time elapsed between exposure and the clinical expression of thyroid cancer. Assuming this observation to be applicable to all avenues of exposure, the authors concluded that cancers diagnosed within five years of exposure could not be reliably attributed to the radioiodine injections. They proceeded on the unwarranted premise that these cancers had been present prior to the injections but had gone undiagnosed. From a study of the control population, the authors calculated that in a population of 35,000 the expected number of thyroid cancers would be 39.4. After discarding the questionable cancers appearing within five years of injection, 50 cancers were recorded in the study group.
This number was not statistically significant when compared to the control population, and the conclusion the authors arrived at was that the internalized iodine-131 had no effect on the incidence of thyroid cancer. How many cases of thyroid cancer did they need to throw out to reach this conclusion? As Busby reports: “Careful analysis of the paper reveals that 156 extra cancers developed in the group in the first five years but that these were discarded. The true result should have been 156 + 50 = 206 cancers, or five times the control group incidence [3]” [4].
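Busby's arithmetic is simple to verify; a minimal sketch of the figures quoted above:

```python
# Figures from the Holm et al. study as analyzed by Busby.
expected_cancers = 39.4    # expected in 35,000 patients, from control rates
recorded_cancers = 50      # counted after discarding the first five years
discarded_cancers = 156    # excluded as arising within five years of injection

true_total = recorded_cancers + discarded_cancers    # 206
excess_ratio = true_total / expected_cancers         # about 5.2, i.e. five-fold
print(true_total, round(excess_ratio, 1))   # 206 5.2
```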

By this time, the reader needs little coaching to discern the scam being enacted, perhaps unwittingly, by scientists enmeshed in the bastardized system of radiation effects. Holm and colleagues grounded key ideas of their research in the corrupted Hiroshima data of acute, external irradiation that purportedly “proved” that thyroid cancer requires a long latency period. They then imported this “fact” into a study of internal contamination by iodine-131 and used it to justify throwing out 156 cancers from their study group. This permitted them to reach the conclusion that internalized radioiodine does not contribute to excess thyroid cancers. At this point, the radiation protection agencies step in and use this “important evidence” to establish risk factors for iodine-131. In the event of a radiation accident that vents radioiodine into the environment, the radiation protection agencies can refer to what by this time has gained the stature of a canon, in order to bamboozle the population into believing that the public health impact will be much less severe than what actually transpires. In the event that anyone questions the accuracy of these authoritative assessments, they will be referred to the mind-numbing Gordian Knot of indecipherable journal articles, cryptic mathematical models, and unconquerable decrees of the ICRP: the modus operandi of a near-perfect crime. One can only marvel at the sophistication of this debauched edifice, which masterfully conceals mass casualties and death delivered to the people of the Earth by the Cult of Nuclearists.

The severity of the Chernobyl accident caused this corruption to be unveiled. Using data from Belarus that was reported in UNSCEAR 2000, the ECRR calculated that the error in the risk factors of the ICRP for thyroid cancer was about six-fold or more. In confirmation of this conclusion, Wings of Death contains the following observation:

“It is clear, nevertheless, that a major error exists in the accepted risk for thyroid cancer. There are already 450 cancers in the first 10 years for the under-14 age group alone in the areas into which the evacuees [from Belarus] were moved. Only 100 excess thyroid cancers were predicted for all age groups combined in this population for the next 50 years. Thyroid cancer has also increased in adults. In 1993 there were 2,039 registered cases in Belarus (population 10.5 million) and more than 3,000 in the Ukraine (population 53 million) (BMJ, 1993). At minimum the error defined by this is already several hundred per cent; at maximum it is truly enormous, since only 10 years have passed out of the 40 years covered by the prediction. The trend is upward: this error will grow. These predictions were made on the basis of the existing risk factors, so their inaccuracy, already apparent and no doubt to become more obvious over the coming years, indicates that the risk-factor calculations for thyroid cancer, like those for leukemia, are unreliable. Chernobyl represents the most important recent test of these risk factors; it has proved that they are in urgent need of revision”.


[1] United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR). Sources, Effects and Risks of Ionizing Radiation. Report to the General Assembly, New York 1988.

[2] European Committee on Radiation Risk (ECRR). Recommendations of the European Committee on Radiation Risk: the Health Effects of Ionising Radiation Exposure at Low Doses for Radiation Protection Purposes. Regulators' Edition. Brussels; 2003.

[3] Gofman J.W. Radiation-Induced Cancer from Low-Dose Exposure: An Independent Analysis. San Francisco: Committee for Nuclear Responsibility; 1990.

[4] Busby C. Wings of Death: Nuclear Pollution and Human Health. Aberystwyth, Wales: Green Audit Books, Green Audit (Wales) Ltd; 1995.

Monday, October 25, 2010

The Trial of the Cult of Nuclearists: EXHIBIT F continued

EXHIBIT F continued:

Observed health effects after Chernobyl have provided further evidence that the ICRP models are in error. For instance, research conducted in Sweden confirmed a 30% increase in the incidence of cancer between 1988 and 1996 as a result of the fallout from Chernobyl [1]. In this study, doses to the population were estimated on the basis of the deposition of cesium-137 in 450 parishes in northern Sweden, and cancer rates were recorded for the 1,143,182 residents of the area. The 22,409 cases of cancer diagnosed during the nine-year study period represented an excess of 849 cases over what was predicted by ICRP models. According to analysis conducted by the Low Level Radiation Campaign, these excess cancers are 125 times the incidence predicted by the ICRP on the basis of the cesium doses. Because the study concluded only nine years after the accident, the LLRC warns that, given the long latency period prior to the onset of cancer, future diagnoses are likely to demonstrate even greater error in the ICRP models. If the observed effect up to 1996 is representative of the distribution of increased cancer risk throughout the lifetime of the study population, cancer incidence may prove to be more than 600 times that predicted by the ICRP. The LLRC has offered a further interesting observation about the Tondel study:

“The dose response trend calculated by Tondel on the basis of the various levels of cesium deposition is biphasic, not linear. In other words it does not conform with the ICRP dogma that dose and effect are always strictly proportional or “linear.” The Tondel study does not show twice as much dose causing twice as much cancer.

“The doses given by Tondel et al. are calculated from cesium fallout. This may mean nothing since cesium is a gamma emitter which means that its energy deposition (in the form of ionizations) is spatially well distributed in tissue. It is, moreover, soluble and does not form particles. Its health effects are therefore likely to conform with the external irradiation models. However, it is well known that north Sweden received a large amount of fallout in the form of uranium fuel particles. With diameters of less than a few millionths of a meter such particles are highly mobile in the environment and they can be inhaled or swallowed. Once embedded in body tissue they deliver their energy so locally that the few cells immediately next to them are irradiated at very high energies while the rest of the body gets no dose at all. This makes nonsense of the concept of “average dose” – another establishment dogma” [2].
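Working backwards from the LLRC figures quoted above (a sketch, not the LLRC's own calculation), the 125-fold discrepancy implies that the ICRP dose model predicted only a handful of excess cancers in this population of over a million people:

```python
# Figures from the Tondel et al. study as analyzed by the LLRC.
observed_excess = 849     # excess cancer cases observed, 1988-1996
llrc_error_factor = 125   # LLRC: observed excess is 125x the ICRP prediction

# Excess cases the ICRP cesium-dose model would have predicted:
icrp_predicted_excess = observed_excess / llrc_error_factor
print(round(icrp_predicted_excess, 1))   # 6.8
```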


[1] Tondel M., Hjalmarsson P., Hardell L., Carlsson G., Axelson O. Increase of Regional Total Cancer Incidence in North Sweden due to the Chernobyl Accident? Journal of Epidemiology and Community Health. 2004; 58:1011-1016.

[2] Bramhall R. E-mail Circular from the Low Level Radiation Campaign: New Chernobyl Effects Falsify Radiation Risk Model. November 26, 2004.

Thursday, October 21, 2010

The Trial of the Cult of Nuclearists: EXHIBIT F continued

EXHIBIT F continued:

This new body of data on minisatellite mutations provides unequivocal evidence that radiation in the environment can induce alterations in the germ cells of human beings that can then be transmitted to offspring. Due to current limitations in research techniques, the analysis of changes in the mutation rate of other parts of the human genome has not yet been performed. The big question that remains to be answered is how frequently transmittable mutations occur in protein-coding segments of DNA and how often heritable diseases result from these mutations. Here, a whole new field of inquiry lies waiting to be explored. We must remove the blinders to our vision produced by the corrupted Hiroshima study. What has been considered in the past as hereditary disease might in fact be radiation damage to the germ cells of the parents, producing an array of chronic diseases in the next generation. At this point in history, we have no idea what portion of the inheritable diseases suffered by our progeny is being created by the radiation we have scattered throughout the biosphere. The CERRIE Minority Report offers this cautionary note:

“The question before the Committee is not whether such changes [minisatellite mutations] occur but whether they are associated with significant health detriment. In our view, repeat sequence mutations of various types have been associated with recognizable effects in humans, including neurological disorders, mental retardation, malformations, spontaneous abortion, epilepsy, diabetes, and cancers” [1].


[1] CERRIE Minority Report. Minority Report of the UK Department of Health / Department of Environment (DEFRA) Committee Examining Radiation Risk from Internal Emitters (CERRIE). Aberystwyth: Sosiumi Press; 2005.

Monday, October 18, 2010

The Trial of the Cult of Nuclearists: EXHIBIT F continued

EXHIBIT F continued:

In addition to the research conducted on infant leukemia induced by Chernobyl fallout, the ECRR has identified a second body of research that unequivocally confirms that major shortcomings exist in the ICRP model of radiation effects. Again as a result of radiation vented from Chernobyl, data has been collected proving elevated rates of minisatellite DNA mutations among exposed groups. Minisatellites are short, identical segments of DNA that repeat over and over in a long array along a chromosome. These stretches of DNA do not code for the formation of any protein. What distinguishes minisatellites is that they acquire spontaneous repeats through mutation at a known rate, one roughly 1,000 times higher than that of normal protein-coding genes. Dr. Yuri Dubrova, currently at the University of Leicester, first realized that these stretches of DNA could be used to detect radiation-induced genetic mutations by showing that their known rate of mutation had increased subsequent to exposure. With this technique, only small population samples are required to detect a trend in the rate of radiation-induced mutations. Dr. Dubrova first confirmed the accuracy of this methodology in mice. He then set out to investigate radiation-induced mutation in the human germ line — sperm and egg cells — among groups exposed to Chernobyl fallout. That such mutations occurred in fruit flies and mice, and were then passed on to their offspring, had been known since the 1920s. That the same phenomenon occurred in humans had yet to be proven. Human germ line DNA is well protected against acquiring mutations: most damage is immediately repaired, and irreparable damage frequently initiates cell death, preventing mutations from being passed on to the next generation. As a consequence, germ line mutations are rarely detected.
The children of the atomic bomb survivors in Hiroshima and Nagasaki provided no evidence of any significant difference in mutation rates when compared to control groups.

Dr. Dubrova and his colleagues [1,2] studied the rate of minisatellite mutations in families that had lived in the heavily polluted rural areas of the Mogilev district of Belarus after the Chernobyl meltdown. They found the frequency of mutations being passed on by males to their descendants was nearly twice as high in the exposed families compared to the control group families. Among those exposed, the mutation rate was significantly greater in families with a higher parental dose. This finding was consistent with the hypothesis that radiation had induced the germ line mutations. It was the first conclusive proof that radiation produced inheritable germ line mutations in humans. The significance of this line of research was further confirmed by research in Belarus on the germ line mutations induced by Chernobyl fallout in barn swallows [3]. Minisatellite mutations were observed and were accompanied by observable phenotypic alterations in plumage patterns as well as reduced rates of survival.

In 2002, Dr. Dubrova published further research [4] in the journal Science concerning genetic mutation in populations exposed to fallout from atmospheric weapon testing. Between 1949 and 1956, the Soviet Union had detonated a series of aboveground atomic tests at the Semipalatinsk nuclear facility in Kazakhstan. The local population suffered significant radiation exposure throughout this period. The team led by Dr. Dubrova analyzed blood samples from three generations of about 40 families dwelling in the rural district of Beskaragai. They discovered a nearly 80-percent increase in the mutation rate in individuals directly exposed to the fallout in comparison with a suitable non-irradiated control population. The children of affected individuals evidenced a 50-percent increase in minisatellite mutations when compared to the children of non-irradiated parents. After the 1950s, when the practice of atmospheric weapon testing came to an end, the rates of mutation steadily declined.

Minisatellite DNA testing has also been performed on the children of Chernobyl “liquidators,” i.e., those who participated in post-accident cleanup operations. When the offspring of liquidators born after the accident were compared to their siblings born prior to the accident, a sevenfold increase in genetic damage was observed [5]. As reported by the ECRR, “for the loci measured, this finding defined an error of between 700-fold and 2,000-fold in the ICRP model for heritable genetic damage.” The ECRR made this further observation:

“It is remarkable that studies of the children of those exposed to external radiation at Hiroshima show little or no such effect, suggesting a fundamental difference in mechanism between the exposures [Satoh and Kodaira 1996]. The most likely difference is that it was the internal exposure of the Chernobyl liquidators that caused the effects” [6].


[1] Dubrova Y.E., et al. Human Minisatellite Mutation Rate after the Chernobyl Accident. Nature. 1996; 380:683-686 .

[2] Dubrova Y.E., Nesterov V.N., Jeffreys A.J., et al. Further Evidence for Elevated Human Minisatellite Mutation Rate in Belarus Eight Years After the Chernobyl Accident. Mutation Research. 1997; 381:267-278.

[3] Ellegren H., Lindgren G., Primmer C.R., Moeller A.P. Fitness Loss and Germline Mutations in Barn Swallows Breeding in Chernobyl. Nature. 1997; 389(9):583-584.

[4] Dubrova Y. E., et al. Nuclear Weapons Tests and Human Germline Mutation Rate. Science. 2002; 295:1037.

[5] Weinberg H.S., Korol A.B., Kiezhner V.M., Avavivi A., Fahima T., Nevo E., Shapiro S., Rennert G., Piatak O., Stepanova E.I., Skarskaja E. Very High Mutation Rate in Offspring of Chernobyl Accident Liquidators. Proceedings of the Royal Society of London, Series B. 2001; 268:1001-1005.

[6] European Committee on Radiation Risk (ECRR). Recommendations of the European Committee on Radiation Risk: the Health Effects of Ionising Radiation Exposure at Low Doses for Radiation Protection Purposes. Regulators' Edition. Brussels; 2003.

Thursday, October 14, 2010

The Trial of the Cult of Nuclearists: EXHIBIT F continued

EXHIBIT F continued:

The significance of the infant leukemia clusters in the wake of the Chernobyl accident must not be lost on the reader. Radiation was delivered to developing fetuses through their mothers breathing and eating radionuclides released thousands of miles away. Levels of radiation in the environment where these women lived, declared by the radiation protection community to be below regulatory concern, adversely affected the development of their babies. This evidence definitively demonstrates that, at least for infant leukemia, the ICRP model is wrong. The model, based on instances of acute, high-dose exposure to external radiation, fails to adequately account for illness induced by chronic low-dose exposure from decaying radioisotopes lodged within the interior of the human body. The fact that childhood leukemia occurred at a rate greater than predicted by the risk estimates derived from the ICRP model signifies that populations are incurring more illness from low-level radiation in the environment than the radiation protection community wants us to believe. An important corollary of this conclusion must never be forgotten. In the aftermath of many radiation releases, good epidemiological evidence is not always available. Consequently, the radiation protection agencies assess the impact to public health by turning to their models, allowing those models to inform the public of the cost being paid in eroded health and death. When these models are flawed, they serve to cover up the true incidence of radiation-induced illness foisted on the population. Corrupted science becomes an accessory to murder. This is the fraud for which a guilty verdict is being sought against the Cult of Nuclearists.

The post-Chernobyl infant leukemia cohorts provide evidence that developing fetuses incur genetic damage from low-level radiation from internal emitters absorbed by their mothers. Although not proven, this evidence suggests that other types of genetic illnesses may likewise be traced to exposure in the womb to levels of internal emitters currently deemed inconsequential. In support of this hypothesis, Busby and Scott Cato cite evidence of other in utero effects in the immediate aftermath of Chernobyl [1]. Data obtained from the UK Office of Population Census and Surveys provides evidence of babies with a very low birth weight — less than 1,500 g (approximately 3.3 pounds) — born in Wales just after the accident. These births peaked between January 1987 and January 1988. (The Chernobyl accident occurred on April 26, 1986.) This evidence gives further credence to studies that demonstrated increased levels of infant mortality following exposure to fallout during the period of atmospheric weapon testing. In the light of these findings, it is essential to recall that, according to ICRP models, the radiation released into the environment by humans has not been responsible for producing any fetal deaths, stillbirths, or deaths to infants. As the ECRR notes:

“The ICRP only considers heritable effects which are measurable in phenotype after birth e.g. congenital defects and perhaps increases in clinically diagnosed heritable genetic diseases. Thus fetal death and infant mortality are not addressed as radiation exposure outcomes by ICRP.”


[1] Busby C, Scott Cato M. Increases in Leukemia in Infants in Wales and Scotland Following Chernobyl: Evidence for Errors in Statutory Risk Estimates. Energy and Environment. 2000; 11(2):127-139.

Monday, October 11, 2010

The Trial of the Cult of Nuclearists: EXHIBIT F continued

EXHIBIT F continued:

The infant leukemia produced by Chernobyl confirms that radioactive pollutants are the likely cause of the childhood leukemia reported in the vicinity of Sellafield and of the other main sources of radioisotope pollution in Europe. Gardner et al. [1] have confirmed a 10-fold increase in childhood leukemia near Sellafield. In proximity to the Dounreay reprocessing plant in Scotland, an eight-fold excess has been observed [2]. A 15-fold excess in childhood leukemia has been reported near La Hague in France [3,4]. Near the nuclear facility of Harwell in Oxfordshire and the Atomic Weapons Establishment at Aldermaston in Berkshire, a two-fold excess in childhood leukemias was discovered [5].

Rather than admit that the risk factors of the ICRP model are in error, representatives of the nuclear establishment in Europe have entrenched themselves in the position that the research is in error and that the infant leukemia clusters are a fabrication. How do they defend this position? They say that the “doses” in the vicinity of the studied nuclear facilities are simply too low to be responsible, based on the “accepted” risk models of the ICRP.

“All the analyses of causality in the case of nuclear site clusters rely exclusively on the ICRP risk model to show that the calculated doses to the children or their parents were insufficient to have been the cause of the disease since the linear ICRP model did not predict the leukemias or cancers” [6].

According to the analysis of the ECRR, the numerous studies of childhood leukemia clusters in many parts of Europe confirm errors in the risk estimates of the ICRP models. When the doses to the populations living in proximity to these installations are plugged into the model, wide discrepancies emerge between the expected number of cases of childhood leukemia and those actually observed. A 100- to 300-fold error in the risk estimates is evidenced by the leukemia clusters around Sellafield. A 100- to 1,000-fold error is observed from the clusters around Dounreay in the UK and La Hague in France. And a 200- to 1,000-fold error is apparent from studies of Aldermaston/Burghfield, Hinkley Point, Harwell and Chepstow in the UK, Kruemmel and Julich in Germany, and Barsebaeck in Sweden. From the 11 studies which it cites, the ECRR calculates that the probability that the excess leukemia is due to coincidence, rather than being directly related to radioisotope pollution, is less than 0.000000000001 (one in a million million).

“The confirmation of cancer and leukemia clusters in children living near nuclear sites has put considerable pressure on the scientific models of the ICRP and led to a dissonance between the model and observation that cannot be accommodated within a scientific paradigm” [6].
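The ECRR's combination method is not spelled out here, but the order of magnitude of its probability figure is easy to motivate under an independence assumption: even modest per-site chance probabilities, multiplied across eleven studies, fall below one in a million million. A purely hypothetical sketch (the 8% value is illustrative, not the ECRR's):

```python
# Hypothetical illustration: suppose each of the 11 cited site studies had
# an 8% probability of showing its leukemia excess by chance alone.
# Assuming independence, the joint probability of all 11 excesses is the
# product of the individual probabilities.
p_per_site = 0.08       # illustrative value only, not taken from the ECRR
n_studies = 11
joint_p = p_per_site ** n_studies   # about 8.6e-13

# Already below the quoted bound of one in a million million (1e-12).
print(joint_p < 1e-12)   # True
```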

In 2007, the European Journal of Cancer Care published an article that further strengthened the conclusions reached by the ECRR. In “Meta-Analysis of Standardized Incidence and Mortality Rates of Childhood Leukemia in Proximity to Nuclear Facilities” [7], Baker and Hoel confirmed that rates of leukemia in children are elevated near nuclear installations. Reviewing seventeen studies covering 136 nuclear sites in the UK, Canada, France, the USA, Germany, Japan and Spain, the authors found that, depending on the distance of the child's home from the nuclear facility, death rates from leukemia for children up to the age of nine were elevated between five and twenty-four percent. For children and young adults aged zero to twenty-five, increased death rates ranged from two to eighteen percent. Regarding the incidence of leukemia, rates were elevated between fourteen and twenty-one percent in children zero to nine years old. When the age group zero to twenty-five was considered, the incidence rate of leukemia was elevated between seven and ten percent. Exercising caution, the authors couched their conclusions with this observation: “The meta-analysis was able to show an increase in childhood leukemias near nuclear facilities, but does not support a hypothesis to explain the excess.” Relevant to the thesis of this chapter was the observation by Baker and Hoel that the dose-response studies they reviewed did not show excess rates of leukemia near nuclear facilities. In other words, the current dose-response model fails to accurately depict reality.


[1] Gardner M.J., Hall A.J., Snee M.P., et al. Methods and Basic Data of Case-Control Study of Leukemia and Lymphoma Among Young People near Sellafield Nuclear Plant in West Cumbria. British Medical Journal. 1990; 300:29-34.

[2] Heasman M.A., Kemp W.I., Urquhart J.D., Black R. Childhood Leukemia in Northern Scotland. Lancet. 1986; i:266.

[3] Viel J.F., Poubel D., Carre A. Incidence of Leukemia in Young People around the La Hague Nuclear Waste Reprocessing Plant: A Sensitivity Analysis. Statistics in Medicine. 1996; 14: 2459-2472.

[4] Viel J.F., Richardson S., Danel P., Boutard P., Malet M., Barrelier P., Reman O. Carré A. Childhood Leukemia Incidence in the Vicinity of La Hague Nuclear-Waste Reprocessing Facility (France). Cancer Causes and Control. 1993; 4(4):341-343.

[5] Busby C., Scott Cato M. Death Rates from Leukemia are Higher than Expected in Areas Around Nuclear Sites in Berkshire and Oxfordshire. British Medical Journal. 1997; 315:309.

[6] European Committee on Radiation Risk (ECRR). Recommendations of the European Committee on Radiation Risk: the Health Effects of Ionising Radiation Exposure at Low Doses for Radiation Protection Purposes. Regulators' Edition. Brussels; 2003.

[7] Baker P.J., Hoel D.G. Meta-Analysis of Standardized Incidence and Mortality Rates of Childhood Leukaemia in Proximity to Nuclear Facilities. European Journal of Cancer Care. 2007; 16(4):355-363.

Thursday, October 7, 2010

The Trial of the Cult of Nuclearists: EXHIBIT F


The Cult of Nuclearists stands accused of perpetrating a fraud against the entire human race. Were the prosecution to rest its case at this point, the evidence presented in Exhibits A through E might easily be dismissed as toothless, theoretical argument. Thus, before concluding, indisputable proof must be submitted to substantiate the charge that in many cases the risk factors for radiation-induced disease are in error and that the science of radiation effects has been intentionally corrupted. The information presented here will bear witness that the radiation protection community has allowed some monumental flaw to persist in current approaches to radiation safety, whether through perpetuating defective models, a basic misunderstanding of radiation effects, ineffectual oversight of the true extent of population exposure, insufficient epidemiological investigation, or intentional malfeasance. When it is proven that levels of radiation in the environment deemed “permissible” are ruining human health, the science of radiation protection as currently practiced will stand exposed as counterfeit and duplicitous. This single crime has sired millions more, for it has given license to government and industry to deploy weapon systems and technologies that contaminate the Earth, invisibly sickening and killing untold numbers of unsuspecting victims.

According to the ECRR, there exists unequivocal evidence within the public domain that proves that the ICRP model of radiation effects is plagued by fundamental errors with regard to low levels of internal contamination. These errors lead to an underestimation of health detriment in the wake of a radiation release. The clearest example of these deficiencies surfaced after the accident at Chernobyl in 1986. As the clouds of fallout wafted around the planet, most governments broadcast reassurances to their anxious citizens that there was no cause for concern, that expected doses would be too low, based on current standards of radiation protection, to be medically significant. In most locales throughout the world, caution was not advised and people were informed that it was perfectly safe to continue to consume fresh meat and produce, dairy products, and unfiltered water from surface sources. This lackadaisical approach to radiation safety allowed the unnecessary internal contamination of unsuspecting bystanders and produced elevated rates of illness in many populations. What came to light in years subsequent to the accident was that children who were exposed to Chernobyl fallout while still in the womb experienced an elevated risk of developing leukemia by the time of their first birthday. In countries where unimpeachable data was collected for levels of fallout deposited in the environment, doses to the population, and the incidence of childhood leukemia, an unmistakable, uniform trend emerged: the cohort of children born during the 18-month period following the accident suffered increased rates of leukemia in their first year of life compared to children born prior to the accident or to those born subsequent to the accident after the level of possible maternal contamination had sufficiently diminished.
This was confirmed in five studies conducted independently of one another: in Scotland [1], Greece [2], the United States [3], Germany [4], and Wales [5]. In calculations prepared by the ECRR, the probability that the increased incidences of leukemia appearing in five different countries during the period of heaviest fallout from Chernobyl were a chance occurrence was less than 0.0000000001 (one in 10 billion). Low levels of internal exposure from Chernobyl were the indisputable cause of the childhood leukemia clusters.

In the UK, the National Radiological Protection Board measured and assessed the doses received by the populations of Wales and Scotland. Through environmental monitoring, they compiled data on the levels of Chernobyl fallout in the air, on the ground, and in food, milk, and water. Based on this information, they estimated the average level of exposure for members of the population. Plugging these dosages into their models of radiation effects, they calculated that no measurable harm was expected in the UK from the fallout of Chernobyl. To confirm or refute this assessment, Dr. Chris Busby and Molly Scott Cato undertook an investigation of the accuracy of the risk estimates of the NRPB as they applied to infant leukemia. Drawing upon the post-Chernobyl data collected by the NRPB and applying to it risk estimates for radiation-induced infant leukemia based on ICRP models previously published by the NRPB, they compared the expected number of cases of infant leukemia to the known incidence of childhood leukemia in one-year-olds born in the 18 months after the accident. This investigation was published under the title of “Increases in Leukemia in Infants in Wales and Scotland Following Chernobyl: Evidence for Errors in Statutory Risk Estimates.” What Busby and Scott Cato discovered humiliated the pronouncements of the NRPB. The incidence of infant leukemia in the combined cohorts of Wales and Scotland exceeded that predicted by 3.8 times. According to the authors, “Applying ICRP's risk factors to known levels of contamination from Chernobyl reveals 100 times less infant leukemia than actually found” [emphasis added] [5]. (As this cohort ages, further incidences of leukemia may prove that the accepted risk factors are even further off the mark.) The authors examined an alternative explanation, that the leukemias did not result from fetal exposure in the womb but from preconception exposure to radiation by the fathers. 
Under this scenario, the accepted risk factors were in error by approximately 2000 times. Simply stated, the NRPB models were proven to be in error. They substantially underestimated the hazard of the low levels of Chernobyl fallout on the health of developing children in utero. As stated by the ECRR:

“The committee accepts that the infant leukemia results represent unequivocal evidence that the ICRP risk model is in error by a factor of between 100-fold and 2000-fold for the type of exposure and dose, the latter figure allowing for a continued excess risk in the cohort being studied. The committee notes that it will be necessary to follow the cohort as it ages” [6].

Richard Bramhall of the Low Level Radiation Campaign analyzed the data on infant leukemia in Wales and Scotland after Chernobyl presented in the paper written by Busby and Scott Cato [7]. He made the following observation which further condemns the accepted models for radiation-induced childhood leukemia:

“In the case of infant leukemia, doses from Chernobyl should have produced far less than one additional case in the populations of Wales and Scotland. (To spare you the mental anguish of trying to imagine a fraction of a case of leukemia, I can tell you that all this means is that you'd have to investigate the cancer registrations for a population more than 50 times as big in order to expect even a single baby with leukemia caused by the radiation.)

But Busby and Scott Cato looked at the figures and found that the rate had jumped quite sharply — 14 babies were diagnosed in the two years following Chernobyl. The average in a two-year period before it was 4.2, so finding 14 meant there were 9 or 10 extra cases.

We don't know exactly how the radioactivity made these babies ill.

Was it because it crossed their mothers' placentas?

Or because it affected them after they were born?

Or because the dose to their fathers' balls had mutated the sperm before they were even conceived?

There are different risk factors for these different types of exposure routes.

After doing some simple arithmetic with the figures in Busby and Scott Cato's paper we can display the implied errors like this:

If the damage was done by the placenta-crossing dose, NRPB's prediction was about 72 times too small;

if it was the postnatal effect, the prediction was 132 times too small;

and if it was the preconception dose to the fathers' testes, NRPB was out by a whacking 2,390”.
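Bramhall's arithmetic can be sketched in a few lines of Python. The observed figures (14 cases against a baseline of 4.2) come from the passage above; the model-predicted excesses are back-calculated assumptions, chosen so the ratios reproduce his quoted factors, since the paper's own predicted values are not reproduced here.

```python
# Sketch of Bramhall's arithmetic. Observed figures are from the text;
# the model-predicted excess cases are back-calculated assumptions.
observed = 14        # infant leukemia cases in the two years after Chernobyl
baseline = 4.2       # average cases in a comparable two-year period before
observed_excess = observed - baseline   # roughly 9.8 extra cases

# Hypothetical NRPB-model predictions of the excess, by exposure route
# (values assumed so that observed/predicted matches Bramhall's factors).
predicted_excess = {
    "placenta-crossing dose": 0.136,
    "postnatal dose": 0.074,
    "preconception dose to fathers": 0.0041,
}

for route, pred in predicted_excess.items():
    factor = observed_excess / pred   # how far the prediction fell short
    print(f"{route}: prediction too small by ~{factor:.0f} times")
```

Run as-is, this reproduces the approximate 72-, 132-, and 2,390-fold discrepancies quoted above.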

The Low Level Radiation Campaign [8] published an accompanying graph to visually depict the disparity between the established risk factors for infant leukemia and the actual incidence of the disease from the five separate studies of the post-Chernobyl environment. The vertical axis of the graph represents the percentage increase in cases of infant leukemia in the 20 months following the accident compared to the period before April 26, 1986, and the period after January 1988. The horizontal axis represents the doses, in millisieverts, received by the exposed population. It is important to note that these doses were derived from environmental monitoring of cesium fallout. Cesium, which emits highly penetrating gamma rays, is relatively easy to detect, and its deposition over wide areas can thus be easily mapped. Monitoring this radionuclide provided investigators with a streamlined method for estimating dosages to the exposed populations. According to the LLRC, however, this methodology may actually be flawed when determining the health effects produced from other radionuclides in the environment:

“But the very fact that it [cesium] is so penetrating means that its energy deposition (in the form of ionizations) is spatially well distributed in tissue, so its health effects are likely to conform with the external irradiation models. It is, moreover, soluble and does not form particles. The Chernobyl reactor fire produced other isotopes (including strontium-90) as well as microscopic particles of reactor fuel which traveled across Europe and beyond, exposing everyone in the path of the cloud to inhalation and ingestion. There is no reason why the health effects should conform with expectations based on cesium deposition.”

The LLRC emphasizes that the doses shown in the graph, between 0.02 and 0.2 millisieverts, represent levels below the annual exposure from natural background radiation. The implication is that “dose” at this low level might not mean anything at all and that health detriment is produced by extremely low levels of internal contamination by radionuclides. Further, the infant leukemia data suggests that, far from being innocuous, natural background radiation may be the causative agent for some small fraction of human cancers. In the graph, the dotted line just above the horizontal axis represents the expected increase in infant leukemia according to currently accepted ICRP models based on exposure to external radiation. As the LLRC notes, the dotted line

“...slopes up towards a point representing a 40% increase at a dose of 10 millisieverts (This is five times natural background, and the graph would have to be almost a meter wide to show it). The origin of this yardstick is cancer deaths in children after their mothers had been X-rayed during their pregnancy.”

The findings from Chernobyl flatly disprove the validity of this model. Doses much smaller than 10 millisieverts produced much greater increases in infant leukemia than were expected based on the yardstick mentioned in the quotation. Babies in Greece received a dose of only 0.2 millisieverts, and yet a 160% jump in the number of cases of infant leukemia was demonstrated there. Similarly, babies in Germany receiving a dose of 0.071 millisieverts showed an increased incidence of 48%. In Wales and Scotland, the doses were 0.08 millisieverts and the incidence of infant leukemia jumped over 200%.
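Using only the figures in the preceding paragraphs, the mismatch can be checked with simple arithmetic: the quoted yardstick implies a linear slope of about 4% extra infant leukemia per millisievert, which predicts almost nothing at the doses involved. A minimal sketch:

```python
# Linear yardstick from the text: ~40% increase at 10 mSv of external dose.
slope_pct_per_msv = 40 / 10   # about 4% extra incidence per millisievert

# (dose in mSv, observed % increase in infant leukemia), from the text.
observations = {
    "Greece": (0.2, 160),
    "Germany": (0.071, 48),
    "Wales & Scotland": (0.08, 200),
}

for place, (dose_msv, observed_pct) in observations.items():
    expected_pct = slope_pct_per_msv * dose_msv
    ratio = observed_pct / expected_pct
    print(f"{place}: expected ~{expected_pct:.2f}% rise, observed {observed_pct}%"
          f" ({ratio:.0f} times larger)")
```

For Greece, this yields an expected rise of about 0.8% against an observed 160%, a two-hundred-fold discrepancy.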

Richard Bramhall of the Low Level Radiation Campaign has commented on the infant leukemia studies and compared the doses received from Chernobyl to those received by the residents of Seascale living near the Sellafield nuclear-fuel reprocessing facility. If the lower doses from Chernobyl produced elevated rates of infant leukemia, then this is indisputable evidence that the higher doses to the population from Sellafield pollution could have produced the cluster of infant leukemia in the vicinity of Seascale. Further, when the actual number of cases of infant leukemia is compared to that predicted by the currently accepted risk factors, the glaring inaccuracies of current models come sharply into focus. According to Bramhall:

“In the parts of the UK mainly affected by Chernobyl fallout, the dose was about 80 microSieverts (i.e. 1250 times smaller than at Seascale); two separate studies showed [for infant leukemia] a 3-fold excess (Scottish infants) and a 3.6-fold excess (Scottish and Welsh infants combined). The implicit error in conventional risk factors is roughly 720-fold. In Germany there was a 1.6-fold excess and the dose was 71 microSv (1400 times smaller than at Seascale). Implied error 450-fold. In Greece, there was a 2.6-fold excess and the dose was 280 microSv (350 times smaller than Seascale). Implied error 300-fold. In UK data obtained by CERRIE, there was a 1.4-fold excess and the dose was 40 microSv (2500 times smaller than Seascale). We believe that these findings stack up to undermine ICRP's credibility.”
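The dose ratios in Bramhall's comparison are internally consistent with a single Seascale reference dose of roughly 100 millisieverts; that reference value is a back-calculated assumption here, not a figure given in the passage. A quick check:

```python
# Assumed Seascale reference dose (~100 mSv), back-calculated from the
# ratios Bramhall quotes; the Chernobyl doses below are from the text.
seascale_usv = 100_000   # microsieverts (assumed)

chernobyl_doses_usv = {
    "UK (Wales/Scotland)": 80,
    "Germany": 71,
    "Greece": 280,
    "UK (CERRIE data)": 40,
}

for place, dose in chernobyl_doses_usv.items():
    print(f"{place}: Seascale dose about {seascale_usv / dose:.0f} times larger")
```

This reproduces the quoted factors of 1,250, roughly 1,400, roughly 350, and 2,500.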

Earlier in this chapter, it was mentioned that representatives of the Cult of Nuclearists vehemently deny that nuclear pollution from the Sellafield reprocessing facility is responsible for the cluster of childhood leukemia found in the nearby community of Seascale. Leukemia in the 0-14 year-old age group in Seascale shows a 12-fold excess compared with the rate of the disease for the UK as a whole. According to COMARE, on the basis of current models, the doses to the population were 300 times too small to be responsible for the observed incidence of leukemia. But look what the post-Chernobyl data has to say about this. It confirms that current models are incorrect to approximately this margin of error.


[1] Gibson B.E.S., Eden O.B., Barrett A., Stiller C.A., Draper G.J. Leukemia in Young Children in Scotland. Lancet. 1988; 2(8611):630.

[2] Petridou E., Trichopoulos D., Dessypris N., Flytzani V., Haidas S., Kalmanti M.K., Koliouskas D., Kosmidis H., Piperolou F., Tzortzatou F. Infant Leukemia After In Utero Exposure to Radiation From Chernobyl. Nature. 1996; 382:352-353.

[3] Mangano J.J. Childhood Leukemia in the US May Have Risen Due to Fallout From Chernobyl. British Medical Journal. 1997; 314:1200.

[4] Michaelis J., Kaletsch U., Burkart W., Grosche B. Infant Leukemia After the Chernobyl Accident. Nature. 1997; 387:246.

[5] Busby C., Scott Cato M. Increases in Leukemia in Infants in Wales and Scotland Following Chernobyl: Evidence for Errors in Statutory Risk Estimates. Energy and Environment. 2000; 11(2):127-139.

[6] European Committee on Radiation Risk (ECRR). Recommendations of the European Committee on Radiation Risk: the Health Effects of Ionising Radiation Exposure at Low Doses for Radiation Protection Purposes. Regulators' Edition. Brussels; 2003.

[7] Bramhall R. Averaging -- ICRP's Fatal Flaw. Adapted from a talk given to a Welsh Anti-Nuclear Alliance meeting in Chepstow, Wales, February 23, 2001.

[8] Low Level Radiation Campaign (LLRC). Infant Leukemia After Chernobyl. Radioactive Times: The Journal of the Low Level Radiation Campaign. 2005; 6(1):13.

Monday, October 4, 2010

The Trial of the Cult of Nuclearists: SCAM NUMBER THIRTY-NINE

What follows is the continuation, in serial form, of a central chapter from my book A Primer in the Art of Deception: The Cult of Nuclearists, Uranium Weapons and Fraudulent Science.

SCAM NUMBER THIRTY-NINE: Use the risk factors to structure the perception of the health consequences of a radiation release.

In Edgar Allan Poe’s short story, The Masque of the Red Death, all the influential people of a country assemble for a masquerade ball in the castle of a nobleman. Outside, a plague is ravaging the less fortunate population. Secure in their presumption that they are immune to the tribulations taking place beyond their walls, all are horrified to discover, when they remove their masks, that Death has been an uninvited guest in their midst throughout the entire gathering. This, their final realization, marks the moment of their demise.

For purposes of this discussion, we must ask what costume, today, is Death wearing? Death is disguised by the risk factors published by the radiation protection agencies. We fail to recognize Death in our midst because it is so craftily concealed. Before this discussion proceeds, a single point needs to be hammered home. When estimates are manufactured for the number of people injured by nuclear weapon testing, how are the figures computed? On the basis of the risk factors! When a radiation accident takes place, what is used to determine the likelihood of illness to those dwelling downwind? The risk factors! Before commercial nuclear power plants are licensed, what criteria are used to determine the amount of radionuclides that can be legally discharged and the likely health effects of these to the surrounding population? The risk factors! For people employed in the nuclear industry, how is potential hazard to their health estimated? The risk factors! When modeling different accident scenarios at radioactive waste repositories, how is health detriment of those potentially exposed determined? The risk factors! When computing the possible hazards of a breach of containment accident during transport of radioactive materials along highways or railroad lines, how are possible casualty figures derived? The risk factors! On what basis is the hazard to health estimated from incorporating low-level waste into consumer products? The risk factors! When cancer patients receive radiation therapy, how are their chances for another cancer being induced by their therapeutic dose of radiation calculated? The risk factors! How are hazards to our own troops or enemy civilians evaluated when designing and deploying uranium weapons? The risk factors! When estimating collateral injury to the surrounding population from the proposed deployment of nuclear bunker-buster bombs, what information is necessary for such calculations? The risk factors! 
How are the number of radiation deaths produced in the varying scenarios of nuclear war fighting during World War III determined? The risk factors! The risk factors legitimize the entire nuclear enterprise. Human beings tolerate technologies that cause radiation exposure solely on the basis of their belief that this exposure represents minimal risk.

Scam Number Thirty-Nine is the preeminent scam, the reason for being of all the other scams. It lies at the heart of all the mischief that has infiltrated and corrupted the science of radiation protection. By the elaborate swindle deconstructed within these pages, the Cult of Nuclearists has fabricated inaccurate risk factors and then used these inventions to veil its misdeeds before the public. To fully appreciate the insidious role played by the risk factors in blinding humanity to true radiation effects across populations, one must get a feel for the profound indeterminacy that accompanies a radiation release. Due to the nature of the phenomenon, the impact of vented radioactivity on public health is clouded in ambiguity. Once liberated, radioactive atoms invisibly migrate through the environment at the whim of ever-changing meteorological and geophysical forces. From their point of origin to their ultimate abode, no one knows their fate. Extensive environmental monitoring can provide a map of patterns of dispersal and potential avenues for contamination of the food chain, but this in itself will not divulge who was contaminated and to what extent. Those contaminated will never know they have absorbed radiation that may undermine their health. The most meticulous scrutiny will never disclose the fate of each radioactive atom as it courses through their bodies. When an atom undergoes radioactive decay, no one will witness the molecular consequences of the event or the possible genetic damage inflicted on a cell. When a cancer develops decades later, the victim will never realize that he or she was a casualty of a radiation accident.

With the exception of incidents that produce acute radiation syndrome, radiation injury in the wake of a radiation release is delayed and invisible. Only years or decades after an exposure event do indications of injury begin appearing, if anyone is bothering to look for them, in the form of an increased incidence of naturally occurring diseases. In the aftermath of a Chernobyl-type accident, perhaps the first indications of harm to a population are a growth in the number of miscarriages, stillbirths and birth defects. Increased rates of leukemia among children who were in the womb during or immediately after the event may begin appearing during childhood or adolescence. Among those who were children at the time of the accident, the dietary absorption of radioiodines will increase the number of thyroid abnormalities and thyroid cancers diagnosed within a few years of exposure. The next disease that may be identifiable as radiation-induced is leukemia throughout the population, with rates beginning to climb perhaps as soon as five years after the event, and continuing to climb as the population ages. Increases in the frequency of other types of cancer may go unnoticed for decades due to their long latency periods.

The conundrum facing the epidemiologist is how to determine the rates of those illnesses in the population which are radiation-induced against the pool of identical illnesses that occur naturally or from other environmental toxins. Due to normal statistical fluctuations in the frequency of these diseases over time, trends are not easily identifiable, or if they are, may require the passage of decades for meaningful elucidation. In some instances, what further complicates assessing the health consequences of a radiation accident is the sparsity of accurate data. Particularly in underdeveloped countries, illnesses may be misdiagnosed, causes of death may not be properly identified or may go unrecorded, and statistics on morbidity and mortality may not be gathered or may remain incomplete. Not to be overlooked is the politically motivated corruption of accurate data sampling. As noted in Scam Number Twenty-One, cancer registries are susceptible to fraud, or as in the case of Ukraine after Chernobyl, Soviet authorities forbade doctors from including leukemia in their diagnoses. Finally, definitive and indisputable conclusions of radiation effects on populations are a rarity among epidemiological studies. Due to political clashes between pro-nuclear and anti-nuclear factions, studies angering one camp are routinely challenged and refuted by researchers of the opposing camp. Controversies inevitably erupt in the wake of studies that either underestimate or overestimate the number of radiation-induced casualties. As opposing camps fight to a standoff, consensus opinion is never achieved, and the public is left in bewilderment as to what really is the truth.

Given the formidable array of forces that delay or prevent a clear-cut assessment of the public health consequences of a radiation release, how do human beings in the immediate aftermath of environmental contamination arrive at an understanding of what has taken place? What tools do they have at their disposal for rapidly interpreting the event’s impact? Public anxiety demands timely information. People are not going to wait patiently for decades to see if their health has been compromised. They want immediately to know how much radiation has been liberated into the environment, in what direction it dispersed, and if they should evacuate. They want to know about the safety of their food and water supply. They want to know who was exposed, what were their dosages, and what are the risks these dosages pose for initiating radiation-induced illnesses. How are answers to these pressing questions derived?

By this time, the answer to this fundamental question is self-evident: the risk factors! These are the lenses through which the ambiguities of a radiation emergency are brought into focus. They are the instrument used to structure the perception of a radiation release in the public mind.

As the history of radiation accidents has repeatedly demonstrated, the first response of representatives of the Cult of Nuclearists to a radiation emergency is to downplay or completely discount any potential threat. By this response, they attempt to avert panic, discourage social unrest and preserve confidence in the Cult’s long-term nuclear agenda. To reinforce faith in the safety of nuclear technology, interpreters of the event — most often government spokesmen, apologists for the nuclear industry and media personalities — grab public attention and offer a sanitized version of the incident. Although radiation effects are profoundly difficult to discern and may take decades to decipher, these interpreters fabricate an instantaneously clear picture of what has transpired. This concoction, to attain credibility and be above suspicion, requires grounding on accepted scientific principles. This is where the radiation protection agencies enter into the scheme. Their science is recruited to legitimize the version of reality being invented. Elevated to the status of oracle, the risk factors are employed to divine the health consequences to the contaminated population. Following the protocols published by the radiation protection agencies, researchers mathematically model the radiation release. Based on estimates of the amount of radiation dispersed, the radionuclides involved, their chemical forms, prevailing weather patterns, dietary habits of the population, the number of people exposed and so forth, dosages to the exposed population are reconstructed. On the basis of these assigned dosages, the potential types of illness and their frequency can be predicted based on the established risk factors. Without having to wait for decades to investigate what actually happened, a picture can be painted within hours or days of what supposedly will likely happen. 
Needless to say, the correctness of these speculations is wholly dependent on the accuracy of the assigned dosages and the fidelity of the risk factors.

The devilment lying at the heart of this elaborate charade is the authority bestowed upon the risk factors to accurately predict radiation effects. Consecrated by the high priests of the radiation protection community, the risk factors have been elevated to inviolable law. They are credited with the power of prophecy, foretelling the limits of the health consequences of released radioactivity. This point is essential to grasp. By sleight of hand, the portrait of a radiation event is painted by the risk factors. This is the image that reaches the public’s awareness and shapes perception of the event. Distracted by this facsimile, the uninitiated fail to notice that the actual health toll remains undetermined or may be woefully out of sync with the whitewashed imitation.

The Cult of Nuclearists has built its castle upon the risk factors. To mollify concerns when radiation is released, the Cult of Nuclearists desperately requires an unassailable tool by which to paint a benign image of the event in the public consciousness. Groomed specifically for this purpose by the corrupted radiation protection agencies are the risk factors. These carefully crafted mathematical fictions are propaganda instruments designed to reassure a wary public that released radiation is no cause for alarm. They are the mask that disguises the plague unleashed upon the earth. The risk factors structure the perception that the guardians of radioactivity are adequately protecting the welfare of humanity. The public tolerates their mismanagement and mishandling of radioactive material based on their limited understanding of radiation effects and their trust in the accuracy of estimates of risk presented in the popular media.

This mischievous method of damage control is easily seen in the way that the radiation protection agencies are attempting to sanitize the Chernobyl catastrophe. By their approach, a dose is fabricated for a defined population, the risk factors are applied to this dose, and presto, the health toll of the accident immediately materializes out of nothing. To quote the ECRR:

"UNSCEAR 1993 gives the total committed effective dose from the Chernobyl accident to the world population as 600,000 person Sieverts. The ICRP risk factor of 0.05/Sv would predict 30,000 fatal cancers in the world from this; as UNSCEAR 2000 points out, such an increase would be statistically invisible."

As an exercise in epistemology, it is worth analyzing the meaning of this statement. UNSCEAR is not declaring that 30,000 fatal cancers will be produced from the accident at Chernobyl. What they are saying is something entirely different. They are saying something about their models, not reality. They are declaring that, according to their premises, the number of cancer fatalities that emerges at the other end of their equations is 30,000. BIG DEAL! We could start with different premises, apply other models, and arrive at different conclusions. In this manner, Gofman predicts 970,500 fatal cancers from external exposure to the single radioisotope cesium-137 released from Chernobyl. And the ECRR, employing its own models, predicts that over the next 50 years, in Belarus alone, an excess of 1,200,000 fatal cancers will occur, and worldwide, the total will reach 6,000,000. The conflict between different researchers is over models, not reality.
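The arithmetic behind UNSCEAR's 30,000 figure is nothing more than a linear risk factor multiplied by a collective dose, as a two-line sketch makes plain:

```python
# UNSCEAR's figure is a model output: collective dose times a linear risk factor.
collective_dose_person_sv = 600_000    # UNSCEAR 1993 estimate for Chernobyl
icrp_fatal_risk_per_sv = 0.05          # ICRP fatal-cancer risk factor

predicted_fatal_cancers = collective_dose_person_sv * icrp_fatal_risk_per_sv
print(f"{predicted_fatal_cancers:.0f} predicted fatal cancers")  # prints 30000
```

Different premises (Gofman's, or the ECRR's) fed through the same kind of multiplication yield figures tens to hundreds of times larger, which is precisely the point: the output is a property of the model, not of reality.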

So, how many fatal cancers will REALLY occur as a result of the Chernobyl accident? No one on the face of the Earth has a clue!

Given this indeterminacy, the previous question needs to be reformulated: Who is in possession of the most trustworthy models for predicting radiation effects from Chernobyl? The Cult of Nuclearists ardently strives to convince the world that it is the ICRP, NCRP, NRPB, UNSCEAR, BEIR, and so forth. These are the organizations that have been sponsored and financed by the nuclear establishment and upon whom eminence and respectability have been conferred. Their version of reality is the one designed to be accepted by all inquirers. However, as we shall see in Exhibit F, when contaminated populations are investigated epidemiologically rather than mathematically, the rate of radiation-induced illness is greater than that forecast by the risk factors. This unfortunate intrusion of reality is the Achilles heel of the whole corrupted science of radiation effects and the slayer of the false models that have been intentionally crafted to underestimate the extent of injury suffered by humanity from nuclear pollution.

The risk factors have become so enthroned as the diviners of biological effects that they are frequently called upon to testify against observable health consequences that flatly contradict their accuracy. An excellent example is reported by Busby in Wings of Death. In the mid-1980s, the Committee on Medical Aspects of Radiation in the Environment (COMARE) concluded that radiation was not responsible for the confirmed leukemia cluster in the vicinity of the Sellafield nuclear installation first reported by Yorkshire TV. Despite the fact that the incidence of leukemia in the area was 10 times the national average, the committee insisted that radiation was not the causative agent. They justified their conclusion on the basis of the risk factors. Essentially, they said that given the presumed dosages, the observed leukemias could not possibly be radiation-induced because the risk factors did not predict them. In this instance, on the basis of the risk factors alone, radiation was absolved of the responsibility of contributing to the obvious illness in the population. The committee was forced into upholding this dubious conclusion by an embarrassing dilemma. Confronted with the leukemia cluster, they were cornered into having to entertain one of two reasonable but politically unacceptable explanations. One, the dosages to the population were greater than modelled, perhaps due to unreported ventings of radiation from the facility. To endorse this conclusion would have called into question Sellafield’s operating procedures. Two, the risk factors were in error. This determination would have compromised the credibility of the radiation protection agencies. To launder a potential threat to the credibility of the Cult of Nuclearists, the committee was left with no politically correct alternative other than using the risk factors to “prove” that the leukemia cluster was not caused by radiation exposure.

This issue is far from being just an intellectual game. It has real world repercussions that impact on human health. For instance, when the radioactive plume from Chernobyl was circling the Earth, citizens in the UK and the US received no warning of possible contamination to their food supply. This cavalier attitude was justified on the basis that the assumed accumulated dosages would be too low and that the risk factors applied to these dosages predicted that no threat to health existed. Evidence later surfaced that this presumption was woefully in error. In Deadly Deceit, Gould and Goldman provide convincing evidence that Chernobyl fallout was responsible for increased infant mortality in the US and significant increases in the death rate of the very old and those suffering from infectious diseases whose immune systems had been previously compromised. As will be revealed in Exhibit F, indisputable evidence also exists of an increased incidence of childhood leukemia in the US and the UK from the Chernobyl fallout which has been deemed by officials to have produced dosages too low to warrant concern.

The devastation wrought by depleted uranium upon the health of veterans and enemy noncombatants is destined to expose the lies buried within the science of radiation effects. All the major defenses of DU weaponry penned to date have been based on the models upheld by the radiation protection agencies. Researchers calculate the amount of energy deposited in tissue by different quantities of internalized uranium. The derived doses are then “proven” to be of no consequence to health, an opinion based ultimately on the data from Hiroshima and the resulting risk factors developed by the radiation protection agencies. This methodology cleverly omits one essential ingredient: actual epidemiological studies of groups exposed to depleted uranium who subsequently developed illnesses. Here again, the risk factors are being used as a smokescreen to draw attention away from any possible connection between radiation exposure and real illnesses suffered by real people. This game is played to convince all inquirers that depleted uranium is harmless. Given the rules of the game, this is the inevitable and logical conclusion. But the rules are about to change. Once people awaken to the fact that the science of radiation effects has been intentionally corrupted, all conclusions as to the supposed harmlessness of low-level radiation in the environment will have to be reexamined.

“According to estimates of risk published by the radiation protection agencies, dosages to the population were too low to warrant concern!” Tirelessly, this refrain echoes around the world in the wake of every disclosed radiation release. Yes, we are told, mutagens and carcinogens have taken flight upon the winds, but no hazard exists, no one need be concerned. This carny game, played craftily for decades, is now an open book. The purpose of the Hiroshima Life Span Study is to define and delimit radiation effects in man. As this study matures, the data is continually massaged to produce conclusions acceptable to the Cult of Nuclearists. The types of illnesses observed in the Japanese study population and their frequency then become the basis for the risk factors developed by the radiation protection agencies. The Cult of Nuclearists then sponsors studies designed to produce evidence that confirms the accuracy of the risk factors. Investigators who produce results calling into question the veracity of the risk factors are vilified and marginalized, their work discredited and discounted for being outside the mainstream of “accepted” radiation science. Battle-lines form along any front that attempts to prove that more illness is produced in a population than predicted by the risk factors. As long as the risk factors are upheld as an accurate depiction of reality, the swindle succeeds. When a new radiation event takes place, the tried and true damage control mechanism is activated. From the smorgasbord of scams rehearsed in this Exhibit, representatives of the Cult of Nuclearists pick and choose those most applicable to the situation. Artfully mixing together any number of the dosage scams, they contrive dosages for the exposed population that appear innocuous. By then applying the risk factors to these dosages, they “prove” that harm to public health was negligible or nonexistent. The hoax is airtight.

A simple test should suffice to prove the truth or falsity of this allegation. If the Hiroshima Life Span Study is in fact honest, and if its findings can be applied to instances of internal contamination by radionuclides, and if the models of radiation effects promulgated by the radiation protection agencies faithfully mirror reality, then the risk factors should accurately forecast, within the limits of acceptable statistical error, the incidence of cancer in contaminated populations. If this is the case, no significant discrepancy should arise between the number of cancers predicted by the ICRP models and the actual number uncovered by epidemiological investigation. However, if the risk factors are shown to be inaccurate, what then? What if greater numbers of casualties are produced than those calculated by the accepted models of the radiation protection community? If evidence exists to this effect, then the whole house of cards of the Cult of Nuclearists comes tumbling down. It will prove that the risk factors, rather than being a tool in the service of truth, are being used as an instrument of deception.
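The comparison described here can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only, not the author's method: it uses the Belarus thyroid cancer figures quoted earlier (roughly 3 validated cases per million children in 1981–1985 versus 286 per million in 1991–1994, in a population of 2.8 million children under 15) and a simple normal approximation to Poisson counting statistics to show how far the observed case count lies from the baseline prediction.

```python
import math

def excess_vs_baseline(rate_before_per_million, rate_after_per_million,
                       population_millions):
    """Compare the observed case count against the count predicted by the
    pre-accident baseline rate. Uses the normal approximation to the
    Poisson distribution: the standard deviation of a Poisson count is
    the square root of its mean (reasonable when the mean is not tiny)."""
    expected = rate_before_per_million * population_millions
    observed = rate_after_per_million * population_millions
    z = (observed - expected) / math.sqrt(expected)
    return expected, observed, z

# Belarus, children under 15 (2.8 million): ~3 thyroid cancers per million
# before the accident vs 286 per million in 1991-1994, as quoted above.
expected, observed, z = excess_vs_baseline(3, 286, 2.8)
print(f"expected ~{expected:.1f} cases, observed ~{observed:.1f}, z ~{z:.0f}")
```

On these figures the observed count sits hundreds of standard deviations above the baseline prediction, which is the kind of "significant discrepancy" the test in this paragraph is meant to detect; a model whose risk factors were accurate would leave no such gap.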