From the Spectral to the Spectrum: Radiation in the Crosshairs

Feature

Jeanne Goldberg

Skeptical Inquirer Volume 42.5, September / October 2018


Wilhelm Conrad Roentgen shocked the world in 1895 when he produced the first radiograph of the human body: an X-ray of his wife’s hand. Whether intentional or not, the fact that he chose her hand for this epic experiment was fitting because the hand has, through the ages, been an important cultural symbol. In this case, the radiograph of Anna Roentgen’s hand signaled a landmark scientific achievement that was rapidly adopted in many countries.

Roentgen’s discovery occurred during the Victorian Era, a period that also produced the telegraph, photography, the electric light bulb, the telephone, and radio, to name just a few innovations. The Victorian Era, extending from 1837 to 1901, was characterized at its beginning by the strong, pervasive influence of the Church of England. The tension between religion and science was minimal in the early years. As Aileen Fyfe noted, “… religious faith and the sciences were generally seen to be in beautiful accordance … mediated by some form of theology of nature” (Fyfe 2012). It was an era of increasingly robust scientific activity, however, and as the twentieth century approached, the tension between science and religion intensified. Some scientists began to express their opposition to Christian dogma, while a culture of scientific romanticism simultaneously thrived. The fact that the general public often didn’t understand the scientific bases for new discoveries contributed to suspicion and paranormal explanations.

Interestingly, Roentgen’s discovery of radiography provided one of the focal areas of convergence between the scientific discipline of physics and this psychic, romantic, paranormal, pseudoscientific movement. Keith Williams, in the article “Ghosts from the Machine: Technologisation of the Uncanny in H. G. Wells,” describes the ambivalent feelings of Victorians (the anxieties in addition to the optimism) as scientific discoveries were made. Knowledge of the electromagnetic spectrum enabled the understanding of visible light and color; however, the fact that some of the waves on the spectrum were invisible and could penetrate solid substances (e.g., X-rays) kindled a combination of fascination, fear, and “otherworldly” images. Williams refers to H. G. Wells’s 1897 book The Invisible Man and his other writings as examples of a “strange liaison between spiritualism and science.” In fact, the Society for Psychical Research (SPR), founded in 1882, whose membership consisted of scientists and philosophers, was an example of this type of liaison (Williams 2010).

Bettyann Holtzman Kevles states in Naked to the Bone: Medical Imaging in the Twentieth Century that “it is a historical irony that the discovery of rays that could penetrate clothing and skin and leave an image of living bones appeared in the most inhibited period in Western history, an era whose name Victorian has become synonymous with extreme sexual repression and bodily shame … the discovery of X-rays was one of the nails in the coffin of Victorian prudery” (Kevles 1998).

That historical account is intriguing, but it is also relevant to public opinion concerning the various types of radiation that play central roles in our lives today. Radio, TV, mobile phones, microwave ovens, smoke detectors, and X-rays represent scientific applications of the electromagnetic spectrum that have transformed our lives. Although these technologies are widely used, the public has expressed concern from time to time about the safety of cell phones and electromagnetic field radiation.

Briefly, electromagnetic radiation can be described as photons, massless particles traveling in a wave pattern at the speed of light. The electromagnetic spectrum extends progressively from relatively low-energy radio waves through microwave, infrared, visible, ultraviolet, and X-ray radiation to gamma rays at the high-energy end.

The simple word radiation applies to any spreading out or transmission of energy, but to many individuals it evokes the more specific term ionizing radiation. X-rays and gamma rays at the high-energy end of the electromagnetic spectrum (along with alpha and beta particles, which are not electromagnetic waves) are capable of ejecting electrons from their orbits around atoms or molecules, thereby creating ions. The radiation produced by X-ray machines is differentially absorbed in human tissues, and in different materials when used for industrial purposes, thereby generating images (NASA 2013).

Without delving into a detailed discussion of the specific units used to measure radiation absorption in humans, suffice it to say that the sievert (Sv) is a basic unit that reflects relative biological effectiveness (RBE), the sensitivity of tissues to radiation exposure averaged over the body (Gregersen 1998). The sievert is roughly equivalent in biological effectiveness to a radiation absorbed dose of one gray (100 rads) of gamma radiation. The average American receives about 6.2 millisieverts (mSv) per year (U.S. NRC 2017).

Half of the radiation dose (3 mSv) that the average American receives each year is natural background radiation, originating mostly from radon but also from cosmic rays and even our food and water. A coast-to-coast airplane trip can add 0.03 mSv to our exposure, and living on the high plateaus of New Mexico or Colorado can add about 1.5 mSv annually compared with living near sea level (Radiology Info 2017).
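The dose figures quoted above are simply additive, so a back-of-the-envelope annual tally is easy to sketch. The function below is illustrative only, using the article’s round numbers, and is not a substitute for real dosimetry:

```python
# Rough annual dose tally (mSv), using the round figures quoted above.
BACKGROUND_MSV = 3.0      # natural background: radon, cosmic rays, food, water
PER_FLIGHT_MSV = 0.03     # one coast-to-coast airplane trip
HIGH_ALTITUDE_MSV = 1.5   # extra annual dose for living on a high plateau

def annual_dose_msv(flights: int = 0, high_altitude: bool = False) -> float:
    """Estimate an annual dose in millisieverts from the contributions above."""
    dose = BACKGROUND_MSV + flights * PER_FLIGHT_MSV
    if high_altitude:
        dose += HIGH_ALTITUDE_MSV
    return round(dose, 2)

print(annual_dose_msv())                                # sea level, no flights
print(annual_dose_msv(flights=10, high_altitude=True))  # frequent flyer in Denver
```

For the second case, ten flights (0.3 mSv) plus the altitude premium (1.5 mSv) bring the total to 4.8 mSv, still below the 6.2 mSv national average once medical exposures are counted.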

Not surprisingly, public perceptions of the risks and benefits of ionizing radiation have been influenced by many factors. Hiroshima and Nagasaki were monumental events, of course, and the destructive capability of the atom was forever etched in the public mindset. The emotional overlay of the events dominated rational analysis. For example, although the average dose to the exposed individuals was approximately 200 mSv (Hendee and O’Connor 2012), there was limited public recognition that this large dose was delivered all at once, not in smaller doses distributed over a longer period of time.

More recent nuclear disasters at Chernobyl and Fukushima have reinforced the public’s concerns regarding medical diagnostic imaging studies, nuclear power, and even food irradiation. Individuals inappropriately extrapolate the risks of exposure to high levels of radiation from a nuclear disaster to the risks of undergoing a low-dose X-ray imaging study. Yet undergoing an imaging study can lead to an unsuspected diagnosis and the detection of disease at a curable stage; avoiding one out of misplaced fear can forfeit that opportunity.

In spite of these concerns, the value of plain X-rays, computed tomography (CT), and other imaging studies that use ionizing radiation is widely recognized by the American public. In the United States, 800 million radiologic exams are performed annually (Ip 2017). Medical and dental X-ray procedures account for approximately 90 percent of the population dose from artificial sources of radiation (Phenomenon Called Radiation 1979). In 2006 the yearly per capita average radiation dose in the United States was 6.2 mSv, compared to a dose of 3.6 mSv in the early 1980s, with medical radiation responsible for the increase (Hendee and O’Connor 2012).

Of interest are the approximate radiation doses from imaging studies and the approximate length of time that would be necessary for one to receive the comparable dose from natural background radiation (see Figure 1) (Radiology Info 2017).

In an excellent review article published in 2012 in Radiology titled “Radiation Risks of Medical Imaging: Separating Fact from Fantasy,” William Hendee and Michael O’Connor describe a series of scientific studies that the U.S. National Academy of Sciences commissioned over the past six decades, the BEIR (Biological Effects of Ionizing Radiation) reports. These studies focused mainly on the Japanese atomic bomb survivors, but persons working in nuclear industries, those exposed to medical radiation, and those exposed during the Three Mile Island and Chernobyl events were also studied. The authors examine the LNT (linear no-threshold) mathematical risk model, which has been in use since the 1920s to predict radiation effects. Briefly, the LNT model posits a linear dose-response relationship between radiation dose and carcinogenesis in tissues, with no threshold below which the risk falls to zero. The implication is that even the tiniest dose of radiation has carcinogenic potential.
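The LNT model’s central assumption can be written as a single proportionality (the symbols here are generic notation, not drawn from the BEIR reports themselves):

```latex
R(D) = \alpha D, \qquad R(0) = 0,
```

where $R(D)$ is the excess cancer risk at dose $D$ and $\alpha$ is a fitted slope. Because the line passes through the origin, there is no threshold dose below which the risk vanishes, which is precisely the assumption that critics of the model dispute.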

In 1946, the Nobel Prize in Physiology or Medicine was awarded to Hermann Muller, who had studied radiation’s effects on mutations (structural alterations of genes) in fruit flies. Even at that time, however, the LNT model’s validity was questionable, because there appeared to be a threshold below which mutations did not occur; the underlying premise of the LNT model was therefore compromised. Muller nonetheless persuaded the BEIR Committee to adopt the LNT model in the face of these doubts. The authors describe the inappropriateness of extrapolating the dose-response curve from the large dose that Japanese bomb survivors received in one event downward to the low doses delivered over time to individuals via medical imaging procedures, and they point out the resultant irrational fears and anxieties spread through the general public by hyperbolic media coverage (Hendee and O’Connor 2012).

In a separate Radiology review of the LNT model, Maurice Tubiana and others state that scientific advances in the understanding of radiation biology and carcinogenesis have rendered the LNT model obsolete. They reviewed studies demonstrating that the laboratory markers of DNA damage from ionizing radiation in use today are nonspecific and that at low doses cellular defenses against the carcinogenic effects of radiation are especially strong. They concluded that in humans “there is no evidence of a carcinogenic effect for acute irradiation at doses less than 100 mSv and for protracted irradiation at doses less than 500 mSv.” Although the subject remains controversial, they reject the LNT model outright (Tubiana et al. 2009).

The Health Physics Society and the American Association of Physicists in Medicine (AAPM) have promoted a risk/benefit approach to diagnostic imaging. The AAPM discourages making predictions of hypothetical cancer incidence and deaths from low dose radiation by using epidemiological, population-based data derived from large populations that have received a single large dose of radiation. Their opinion is that the “risks of medical imaging at patient doses below 50 mSv for single procedures or 100 mSv for multiple procedures over short time periods are too low to be detectable and may be nonexistent” (Hendee and O’Connor 2012). Additional organizations that share this view include the International Organization of Medical Physicists, the UN Scientific Committee on the Effects of Atomic Radiation, and the International Commission on Radiological Protection (Calabrese and O’Connor 2014).

Even in view of the above, however, there is legitimate concern about keeping radiation doses as low as possible for medical imaging procedures, and this has led to the development of enforceable safety guidelines. The potential impacts of ionizing radiation on children are also carefully considered. Organizations representing radiologists in the United States, such as the Radiological Society of North America (RSNA) and the American College of Radiology (ACR), promote the responsible use of ionizing radiation in medical imaging procedures, and various state and federal governmental agencies have roles in overseeing safety and quality assurance.

Superstition and fear concerning radiation can be harmful to individuals, but on a global scale they can be catastrophic. In an excellent historical review titled “Fear of Radiation Is Killing People and Endangering the Planet Too,” Theodore Rockwell, a nuclear energy pioneer who had worked on the Manhattan Project and was instrumental in the development of the Navy’s nuclear propulsion program, focused not only on the potential harm to individuals who avoid diagnostic imaging studies but also on larger population-based hazards. Rockwell’s global perspective on the impacts of irrational fears of radiation, published in 1998, was prescient. He was discouraged that coal-fired power plants continued to pollute the air with harmful particulate matter, contributing to tens of thousands of American deaths from respiratory problems, when nuclear facilities could have provided clean, safe energy. The contributions of coal-fired plants to global warming, smog, acid rain, and other effluents distressed him. He was also appalled by the number of Americans sickened and killed by contaminated food, which could have been freed of pathogens by food irradiation. Rockwell was likewise dismissive of the LNT model, pointing out that we “live in a sea of natural radioactivity” and that the DNA damage that occurs normally, for example as a result of free radicals in metabolic processes, greatly exceeds (by a factor of ten million!) that from natural radiation (Rockwell 1998).

In 1999, Paul Kurtz, the founder of the Center for Inquiry, published a special report titled “Fears of the Apocalypse: The Escape from Reason.” This article focused on fears, superstitions, and hysteria surrounding the coming of the millennial year 2000. He clearly described prevalent forecasts of unprecedented optimism regarding prosperity and technological triumphs, but he also addressed the gloom-and-doom secular and religious millennial prophecies. Regarding the former, he stated: “Overhanging all of this is the sword of Damocles—nuclear energy ... . Anything related to radiation was considered diabolical” (Kurtz 1999).

In his book The Rise of Nuclear Fear, Spencer Weart describes imagery in the public mind dating back to biblical times of a powerful source of energy with huge beneficial potential on one hand or destructive potential on the other hand (i.e., “hideous death or miraculous life”) (Weart 2012).

David Ropeik has contributed a fascinating history of this schizophrenic view of nuclear energy among the American public. He attributes the public’s great expectations for radiation at the beginning of the twentieth century to recognition of the progress that had characterized the scientific and industrial revolutions. A story about a small quantity of uranium’s potential to “propel a steamship across the ocean” was typical, and most media coverage shaping the public’s views was supportive of radiation. Simultaneously, however, a few stories emerged concerning radiation’s darker aspects. Then Hiroshima and Nagasaki shocked the world, instantly (“in a flash”) creating an image of the apocalypse and strongly reinforcing the public’s fears of radiation. These events signaled, or at least paralleled, the beginning of the erosion of public “faith in science and modern technology.” Although later studies revealed that the lifetime cancer mortality rate increased by less than one percent among the Japanese bomb survivors and that no biological effects appear to have occurred in those individuals who received less than 110 mSv, the fearsome images of the bombs would always be a potent reminder of the destructive potential of the atom (Ropeik 2012).

Nuclear energy development faced multiple challenges in the years following World War II. Despite President Eisenhower’s Atoms for Peace program, an attempt to promote nuclear energy, the environmental movement raised concerns about radioactive fallout, and the AEC (Atomic Energy Commission) mishandled public doubts, fueling opposition to nuclear power development. Legal barriers were erected as well. Ropeik also describes antinuclear rallies led by respected intellectuals such as Albert Einstein and Bertrand Russell; although these focused on banning nuclear weapons, they had the effect of decreasing support for nuclear energy plants. In Ropeik’s opinion, the next iconic nuclear event, Three Mile Island in 1979, essentially ended support for nuclear plant development in the United States (Ropeik 2012).

The Fukushima disaster in 2011 represented a confluence of unfortunate circumstances and was a focus of intense, even hysterical, reporting. In “Radiation Superstition,” Robert Hargraves referred to a UN scientific committee’s findings that none of the workers at the plant who died had been killed by radiation poisoning (Hargraves 2013). Although no deaths or cases of radiation sickness could be documented, more than 100,000 people were evacuated from their homes, and there were more than 1,000 deaths associated with the evacuation. Suicide rates were higher than normal. This is tragic, since experts have estimated that radiation levels in some evacuated areas were acceptably low and that, overall, the expected lifetime dose that people living near Fukushima received from the event was less than 10 mSv, compared with their expected lifetime dose of 170 mSv from natural background radiation (World Nuclear Association 2017).

The extreme measures taken in Japan have had economic consequences in addition to feeding the public’s fears and superstitions concerning radiation. Enormous sums have been spent keeping citizens from returning to their homes. In addition, Japan is building dozens of new coal-fired plants to replace nuclear power, and liquefied natural gas is being imported in greater quantities, negatively affecting Japan’s balance of trade (Puko 2018).

Ann Stouffer Bisconti has studied factors affecting public attitudes toward nuclear energy, and she considers the following to be critically important: (1) imagery associated with nuclear energy, (2) accidents, (3) energy needs, (4) proximity of an accident or of a proposed plant, (5) perception of control, (6) political affiliation, and (7) understanding of nuclear energy principles. Many people don’t understand the physical principles of nuclear energy and therefore feel intimidated by it. One could make the point that many don’t fully understand the basic principles of computers or even automobiles either, and yet they don’t fear them. The obvious difference, of course, is our “hands-on” experience with computers and cars. They are “up close and personal” to us, and we regard them as generally safe, essential elements of our lives.

So where do we stand now regarding nuclear energy? Over the past three decades, public opinion has steadily grown more favorable. According to Bisconti, 49 percent of the public favored nuclear energy in 1983, and in 2016, 65 percent favored it. She found that favorable opinions were based on perception of energy needs, the environmental benefits of nuclear energy, and low cost (once a plant is functioning). Negative opinions were predominantly based on safety concerns (accidents and radiation leaks). In addition, a focus on the renewable energy sources of solar and wind and a drop in gasoline prices have led many to set nuclear energy aside. And although many people expressed support for nuclear energy, they did not support nuclear plant development in their own communities: the “not in my backyard” philosophy (Bisconti 2017).

Why do radiation fears and superstitions matter in the big picture—the global picture? They matter because they contribute to a global existential threat, climate change. With the advent of alternative sources of energy such as solar, wind, and other new technologies on the horizon, there is some promise that our carbon dioxide levels, the highest in over 800,000 years, can be addressed (PBS.org 2018). Before this can occur, however, the pernicious narrative of fear and superstition surrounding radiation must be revealed, understood, and disentangled from the realm of pseudoscience.



References