The Conspiracy Meme

Column

Ted Goertzel

Volume 35.1, January/February 2011

Conspiracy theories are easy to propagate and difficult to refute. Having long flourished in politics and religion, they have also spread into science and medicine. It is useful to think of conspiracy theorizing as a meme, a cultural invention that passes from one mind to another and thrives, or declines, through a process analogous to genetic selection (Dawkins 1976). The conspiracy meme competes with other rhetorical memes, such as the fair debate meme, the scientific expertise meme, and the resistance to orthodoxy meme.

The central logic of the conspiracy meme is to question, often on speculative grounds, everything the “establishment” says or does and to demand immediate, comprehensive, and convincing answers to all questions. Unconvincing answers are taken as proof of conspiratorial deception. A good example is the film Loose Change 9/11: An American Coup (Avery 2009), which started out in 2005 as a short fictional video about the World Trade Center attacks but was marketed as if it were a truth-seeking documentary. The 2005 video went viral on the Internet and has been viewed by over ten million people. Loose Change raises a long series of questions illustrated by tendentious information, such as the fact that the fires in the World Trade Center were not hot enough to melt steel. But no one had claimed that the steel had melted, only that it had been weakened enough by the heat to collapse, which it had. The video presents the fact that the U.S. Internal Revenue Service (IRS) is keeping certain people's tax returns secret, set to an ominous musical background suggestive of evildoing, despite the well-known fact that the IRS keeps everyone's tax returns secret.

When an alleged fact is debunked, the conspiracy meme often just replaces it with another. One of the producers of Loose Change, Korey Rowe, stated, “We don't ever come out and say that everything we say is 100 percent [correct]. We know there are errors in the documentary, and we've actually left them in there so that people [will] discredit us and do the research for themselves” (Slensky 2006).

When the conspiracy meme is reinforced by a regular diet of “alternative” videos and one-sided literature, it can become a habitual way of thinking. People who believe in one conspiracy are more likely to believe in others (Goertzel 1994; Kramer 1998). A young self-declared conspiracy theorist challenged me to debate one conspiracy theory per week with him, including theories about genetically modified (GM) foods, vaccine neurotoxins, AIDS, and September 11, 2001. He expressed his “true belief” that there is a “kernel of truth” in almost every conspiracy theory and claimed that once you understand the kernel, all you have to do is “connect the dots to make a picture.”

Conspiracy theorists have connected a lot of dots. The ninety-two conspiracy theories described in a recent handbook (McConnachie and Tudge 2008) range in topic from Tutankhamen and the curse of the pharaoh, the Protocols of the Elders of Zion, and satanic ritual abuse to the alleged scheming of the Council on Foreign Relations, the Trilateral Commission, and the British royal family. Other theories involve religious cults, alien abductions, or terrorist plots. Some are merely amusing, but others have fueled wars, inquisitions, and genocides in which millions of people died.

Scientific and technological conspiracies often allege the misuse of science by government, the military, or large corporations, and they include bizarre claims that the military suppressed technology that could make warships invisible, automobile or oil companies possess hidden technology that can turn water into gasoline, and the military is secretly in cahoots with space aliens. Conspiracy theorists have argued that the AIDS virus was deliberately created as part of a plot to kill black or gay people, the 1969 Moon landing was staged in a movie studio, and dentists seek to poison Americans by fluoridating public water supplies. Other theorists claim that corporate officers and public health officials suppress evidence that preservatives in vaccines cause autism and silicone breast implants cause connective-tissue disease (Specter 2009; Wallace 2009).

Conspiracy theories include claims that a major drug company hid reports stating that its leading anti-inflammatory drug caused heart attacks and strokes (Specter 2009) and that environmental scientists have conspired to keep refereed journals from publishing papers by researchers skeptical that global warming is a crisis (Hayward 2009; Revkin 2009). There are many theories about physicians or drug companies conspiring to suppress non-mainstream medical treatments, vitamins, and health foods. One author alleges that big business and the medical establishment conspired to obstruct the search for a cure for AIDS so that they could sell their ineffective drugs and treatments (Nussbaum 1990).

Many of these theories are clearly absurd, but some are plausible and others actually contain elements of truth. How can we distinguish among the amusing eccentrics, the honestly misguided, the avaricious litigants, and the serious skeptics questioning a premature consensus? With scientific claims, the only definitive answer is to reexamine the original research data and repeat the experiments and analysis. But no one has the time or the expertise to examine the original research literature on every topic, let alone repeat the research. As such, it is important to have some guidelines for deciding which theories are plausible enough to merit serious examination.

One valuable guideline is to look for cascade logic in conspiracy arguments (Sunstein and Vermeule 2008). This occurs when defenders of a conspiracy theory find it necessary to implicate more and more people whose failure to discover or reveal the conspiracy can be explained only by their alleged complicity. Another guideline is to look for exaggerated claims about the power of the conspirators, claims that are needed to explain how they were able to intimidate so many people and cover their tracks so well. The more vast and powerful the alleged conspiracy, the less likely it is that it could have remained undiscovered.

For example, the claim that the Moon landing in 1969 was a hoax implies the complicity of thousands of American scientists and technicians, as well as that of Soviet astronomers and others around the world who tracked the event. It is extremely implausible that such a conspiracy could have held together. On the other hand, the theory that a few individuals in Richard Nixon's campaign conspired to break into their opponents' offices in the Watergate building was plausible and proved worth investigating. Similarly, the theory that a group of climate scientists conspired to suppress research that they believed to be misleading and harmful to public policy is plausible and worth investigating, despite the small likelihood that such a conspiracy would remain undetected for long.

Definition of ‘Conspiracy’

The conspiracy meme works because conspiracies do exist in the real world. Claims of conspiracy cannot be reflexively dismissed, but they are difficult to test because lack of evidence can be interpreted as proof of how cleverly the conspirators have hidden it. The first step in testing claims of conspiracy is to establish precisely what is being claimed. There is no single accepted definition of “conspiracy,” and people apply the term differently depending on their point of view. The Oxford English Dictionary defines a conspiracy quite loosely as “an agreement between two or more persons to do something criminal, illegal, or reprehensible.” There are legal definitions of criminal conspiracy, but whether something is “reprehensible” is in the eye of the beholder. When Hillary Clinton protested that her husband was the victim of a “vast right-wing conspiracy” and Lyndon Johnson accused the media and liberal activists of a “conspiracy” to oppose his Vietnam War policies, these claimants were intentionally vague as to whether they referred to illegal or merely reprehensible behavior (Kramer and Gavrieli 2005). Any group of people organizing for a cause the speaker does not like may be denounced as “conspirators.”

But the word conspiracy also usually implies something that is secret and hidden. Pigden (2006, 20) defines a conspiracy as “a secret plan on the part of a group to influence events in part by covert action.” Conspiracies so defined certainly do take place, and it may be that the most successful ones are never discovered. They include the (failed) conspiracy to assassinate Adolf Hitler; the September 11, 2001, terrorist attacks; and the Watergate conspiracy. But the term “conspiracy theory” usually refers to claims that important events have been caused by conspiracies that have heretofore remained undiscovered (Coady 2006). The claim that the World Trade Center was attacked by al-Qaeda would not be a conspiracy theory in this sense, but the claim that it was attacked by Israeli agents or that American authorities knew about it in advance would be. There is no chance of getting agreement on an “official” definition, but people alleging conspiracy should be challenged to be clear about their meaning.

The conspiracy meme flourishes best in politics, religion, and journalism, where practitioners can succeed by attracting followers from the general public. It isn't essential that practitioners actually believe the theory; they may just find it plausible and useful to raise doubts and discredit their competitors. But this strategy should not be enough for scientists. Scientific findings are just that: findings, not speculations about undiscovered goings-on. These findings must be replicable by other scientists.

In their routine work, scientists have little use for the conspiracy meme because success in scientific careers comes from winning grant applications and publishing significant findings in peer-reviewed journals. Attacking other scientists as conspirators would not be helpful for most scientists' careers, however frustrated they may be with referees, editors, colleagues, or administrators who turn down their manuscripts or grant proposals or deny them tenured jobs. But the conspiracy meme may be useful for scientists who are so far out of the mainstream in their field that they seek to appeal to alternative funding sources or publication outlets. The conspiracy meme also occasionally surfaces when a scientist's mental health deteriorates to the point that he or she loses touch with reality.

Trial lawyers, on the other hand, have a great deal of use for the conspiracy meme because they succeed by convincing juries. It is part of the standard repertoire of memes they use to discredit evidence offered by “experts” of all kinds, including scientists. Lawyers focus on the motivations of the experts, on who hired them, on what they are being paid for their testimony, and so on. They also seek out an “expert” who will testify on their side, implying that expertise is for sale to the highest bidder and that opinion is divided on the issue in question. The rewards can be very great if a class-action lawsuit results in a settlement against a wealthy corporation.

Vaccine Conspiracies

Conspiracy theories about vaccines were given a tremendous boost when the esteemed medical journal the Lancet published a study reporting a hypothesized link between the measles-mumps-rubella (MMR) vaccine and autism (Burgess et al. 2006). The media highlighted the story despite the study's very small sample size and speculative causal inferences, and the public reaction was much larger than medical and public health authorities anticipated. Reasons for the public reaction included resentment of pressure on parents, distrust of medical authorities, and the potentially catastrophic nature of possible risk to a vulnerable population. There was also the potential for large class-action settlements in favor of parents who believed their children were injured by the vaccines, some of whom desperately needed help to care for autistic children.

The result was a decline in the proportion of parents having their children vaccinated and a subsequent increase in disease, especially in the United Kingdom. The authorities responded by citing findings from large epidemiologic studies, but much of the press coverage highlighted anecdotal accounts and human-interest stories. Recovery of public confidence in vaccination may be due more to revelations of conflicts of interest on the part of the physician who published the original article (which was eventually retracted by the journal) than to the overwhelming evidence for the lack of a relationship between vaccination and autism rates.

Conspiracy theorists typically overlook lapses in logic and evidence by their supporters, but they are quick to pounce on any flaw on the part of their opponents. When a leading Danish vaccine researcher was accused of stealing funds from his university, the vaccine conspiracy theorists seized on the news. Robert F. Kennedy, Jr., son of a former U.S. Attorney General, used the occasion to denounce the “vaccine cover-up” on the influential blog Huffington Post (Kennedy 2010). He explained away the research findings on vaccines and autism on the grounds that there had been a change in the Danish law and the opening of a new autism clinic. He criticized vaccine researchers for receiving money from the Centers for Disease Control and Prevention (CDC) for their studies and for “being in cahoots with CDC officials intent on fraudulently cherry-picking facts to prove vaccine safety.” But if the CDC had not funded this research, largely in response to popular concerns, vaccine opponents would have denounced it for not doing so.

Genetically Modified Food Conspiracies

Public alarm about GM foods was aroused when a scientist, Árpád Pusztai, claimed in a television interview that rats had suffered intestinal damage due to eating GM potatoes (“Genetically modified” 2010; Enserink 1999). The finding was clearly preliminary; there were only six rats in each of two groups, and one group was fed GM potatoes for only ten days. The reported effects on the rats were minor, but the study received tremendous publicity because it fed into fears that had long been cultivated by environmentalist and anti-capitalist social movements. As the controversy progressed, questions were raised about the integrity of the study, leading Pusztai to leave his research institute. But anti-GM activists denounced criticisms of the research as a conspiracy and circulated a petition among scientists supporting Pusztai's rights. Finally, the Lancet agreed to publish the study, which until then had not appeared in any refereed journal. The editors sent it to six reviewers, only one of whom opposed publication. But one of the reviewers who favored publication said he “deemed the study flawed but favored publication to avoid suspicions of a conspiracy against Pusztai and to give colleagues a chance to see the data for themselves” (Enserink 1999).

By releasing his findings on television, Pusztai received extraordinary attention for a study that otherwise might never have been accepted by a leading scientific journal. At least, that was the opinion of the editor of a competing journal who asked “when was the last time [the Lancet] published a rat study that was uninterpretable? This is really lowering the bar” (Enserink 1999). Releasing controversial findings on the Internet or through press releases is justified as a way of making important discoveries available quickly, but it also serves to circumvent the normal scientific review process. Sometimes these “findings,” such as the claim that the decline in crime in the United States in the 1990s was due to the legalization of abortion in the 1970s, become part of the conventional wisdom before other scientists have a chance to debunk them (Zimring 2006).

The Fair Debate Meme

Dissenters from mainstream science often invoke the meme that there are two sides to every question and each side is entitled to equal time to present its case. George W. Bush famously suggested that students be taught both evolution and creationism so that they could judge which has the more convincing argument. Similarly, Holocaust deniers demand equal time for their side of the argument, and they might travel to Tehran or wherever else they can find a receptive audience. If these dissenters, or “revisionists,” succeed in getting an opportunity to present their case, they will hammer away at any gaps or contradictions in the evidence presented by mainstream researchers, using rhetoric that questions their opponents' motivations while avoiding any hint of weakness or bias in their own case.

This advocacy meme is widely used in law courts and political debates, and it can work well when the question at hand is one of taste or morality. It doesn't work well for scientists because there are objectively right and wrong answers to most scientific questions; they can't be resolved by votes of schoolchildren. Schoolchildren in 1945 might have agreed with U.S. Admiral William Leahy's famous statement that “the [atomic] bomb will never go off, and I speak as an expert on explosives.” But once the bomb went off, there were no longer two sides to the question.

The Scientific Expertise Meme

In deciding to pursue the atomic bomb project, President Harry Truman relied on another meme that is very powerful in western societies, that of reliance on scientific expertise. Decision makers and the general public are most likely to be persuaded by this meme when scientists are in agreement and when their advice and policy prescriptions have a good track record. There is an inherent tension between the policy makers' desire for consensus and the scientists' need to remain open to alternative theories and evidence. Scientists who wish to influence policy may be tempted to claim a scientific consensus when the facts do not yet warrant one.

We social scientists have forfeited much of our potential influence because we are too often perceived as advocates for a cause rather than as objective researchers. Our ability to predict policy outcomes is very limited, yet we sometimes fall into the trap of claiming to know more than we do. Econometricians have been publishing conflicting analyses of the relationship between capital punishment and homicide rates for decades without making any real progress, yet they continue to use their findings to advocate for or against capital punishment (Goertzel and Goertzel 2008). When President Bill Clinton proposed welfare reform in the United States, social scientists specializing in the topic almost universally predicted that a disastrous increase in poverty and hunger would result. In some cases they defended their predictions with elaborate statistical models, despite the fact that these models had no demonstrated track record for predicting trends in poverty (Goertzel 1998). President Clinton deferred to politicians and conservative activists who predicted that poverty and dependency would decline as, in fact, they did.

Memes Collide: HIV/AIDS Deniers

The conflict between the fair debate meme and the scientific expertise meme was pronounced in the dispute between the late Nature editor John Maddox and biologist Peter Duesberg, who opposes the theory that HIV causes AIDS. Relying on the norms of fairness in debate, Duesberg (1995) sought the right to reply to scientific papers that defend mainstream views about the HIV-AIDS connection. At a certain point in the debate, Maddox refused to continue to give Duesberg “the right of reply,” arguing that Duesberg had “forfeited the right to expect answers by his rhetorical technique. Questions left unanswered for more than about ten minutes he takes as further proof that HIV is not the cause of AIDS. Evidence that contradicts his alternative drug hypothesis on the other hand is brushed aside.” Maddox argued that Duesberg was not asking legitimate scientific questions but rather making demands and implying, “Unless you can answer this, and right now, your belief that HIV causes AIDS is wrong” (Maddox 1993).

Maddox observed that “Duesberg will not be alone in protesting that this is merely a recipe for suppressing challenges to received wisdom. So it can be. But Nature will not so use it. Instead, what Duesberg continues to say about the causation of AIDS will be reported in the general interest. When he offers a text for publication that can be authenticated, it will if possible be published.” As an editor of a scientific journal, Maddox was justified in saying that he would publish papers that offered new findings, not ones that just picked at unanswered questions in other people's work. But Maddox was realistic in realizing that his refusal to publish additional comments by Duesberg would be portrayed as censorship by believers in the AIDS conspiracy theory.

The Resistance to Orthodoxy Meme

Duesberg and other dissenters also rely on another well-established rhetorical meme, that of the courageous independent scientist resisting orthodoxy. This meme is frequently introduced with the example of Galileo's defense of the heliocentric model of the solar system against the orthodoxy of the Catholic Church. And there are other cases of dissenting scientists who have later been proven right. Thomas Gold (1989) reports confronting the “herd mentality” of science when advancing his theories on the mechanisms of the inner ear and the nature of pulsars as rotating neutron stars, both of which later came to be accepted. This “herd mentality” is not the product of a deliberate conspiracy, although it may be perceived as one. It is a collective behavior phenomenon: a belief is reinforced and becomes part of the conventional wisdom because it is repeated so often. This is why those who offer differing views are important. Being a dissenter from orthodoxy isn't so difficult; the hard part is actually having a better theory than the conventional one. Dissenting theories should be published if they are backed by plausible evidence, but this does not mean giving critics “equal time” to dissent from every finding by a mainstream scientist.

In his response to Duesberg, Maddox refers to the philosophical argument, associated with Karl Popper, that science progresses through falsification of hypotheses. Maddox says, “True, good theories (pace Popper) are falsifiable theories, and a single falsification will bring a good theory crashing down.” But he goes on in the next sentence to implicitly rely on a different philosophy of science, often associated with the work of Imre Lakatos, which is that science normally progresses by correcting and adding to ongoing research programs, not by abandoning them every time a hypothesis fails. Maddox says, “Unanswered questions are not falsifications; rather, they should be the stimulants of further research.”

Scientists do change their ideas in response to new evidence, perhaps more often than people in most walks of life. Linus Pauling abandoned his triple-helix model of DNA as soon as he saw the evidence for the double-helix model. But he never abandoned his advocacy of vitamin C as a treatment for the common cold and cancer, no matter how many studies failed to show a significant difference between experimental and control groups. Pauling found flaws in each study's research design and insisted that the results would be different if only the study were done differently. He never did any empirical research on vitamin C himself, research that would have risked failing to confirm his hypotheses. He instead limited himself to debunking published scientific studies. Unfortunately, Pauling is probably better known by the general public for this work than for his undisputed and fundamental contributions to chemistry. Pauling did not claim that he was the victim of a conspiracy; he saw himself as challenging the herd mentality of science. But his scientific prestige lent credibility to those who sought to discredit scientific medicine as a conspiracy of doctors and drug companies (Goertzel and Goertzel 1995). Scientific expertise is usually quite specialized, and scientists who advocate for political causes only tangentially related to their area of specialization have no special claim on the truth.

Conspiracy theorists often seem to believe that they can prove a scientific theory wrong by finding even a minor flaw or gap in the evidence for it. Then they claim conspiracy when scientists endeavor to fix the flaw or fill the gap. If the scientists persist in their work, on the assumption that a solution will be found, they are again charged with conspiracy. In fact, the occasions when an entire scientific theory is overthrown by a negative finding are few and far between. This is especially true in fields depending on statistical modeling of complex phenomena for which there are often multiple models that are roughly equally good (or bad), and the choice of a data set and decisions about data-set filtering are often critical. The more important test of a research program is whether progress is being made over a period of time and whether better progress could be made with an alternative approach. Progress can be measured by the accumulation of a solid, verifiable body of knowledge with a very high probability of being correct (Franklin 2009).

Climate Change Conspiracy

The conspiracy meme has been especially prominent in the debate about global warming. When the Intergovernmental Panel on Climate Change published its report in 1996, an eminent retired physicist, Frederick Seitz (1996), accused it of a “major deception on global warming” on the op-ed pages of the Wall Street Journal. Seitz did not offer a scientific argument that the report's conclusions were wrong. Instead, he attacked the committee's procedure in editing its document, accusing the editors of violating their own rules by rewording and rearranging parts of the text to obscure the views of skeptical scientists. This seemingly obscure point about the editing of a UN technical document proved remarkably effective in providing a rallying point for opponents of the report's conclusions.

A careful review of the incident (Lahsen 1999) concluded that the editors did not violate any of their own rules and that the editorial changes were reasonable. Editors, after all, do edit texts, all the more so when the texts are written by a committee. The skeptical arguments were not deleted from the report, but they were repositioned and rephrased, perhaps giving them less emphasis than Seitz thought they deserved. But the conspiracy meme was successful in shifting much of the public debate from the substance of the issue to criticism of personalities, procedures, and motivations. The climate scientists felt attacked and apparently began to think of themselves more as activists under siege than as neutral scientists. In 2009, computer hackers released private e-mails apparently showing that some climate scientists had pressured editors not to publish papers by skeptics and that the climate scientists had looked for ways to present their data to reinforce their advocacy views (Revkin 2009; Hayward 2009; Broder 2010).

Climate science is heavily dependent on complex statistical models based on limited data, so it is not surprising that models based on different assumptions give differing results (Schmidt and Ammann 2005). In presenting their data, some scientists were too quick to smooth trends into a “hockey stick” model that fit with their advocacy concerns. Several different groups of well-qualified specialists have now gone over the data carefully, and the result is a less linear “hockey stick,” with a rise in temperature during a medieval warm period and a drop during a little ice age. But the sharp increase in warming in the twentieth century, which is the main point of the analysis, is still there (“Hockey stick controversy” 2010; Brumfiel 2006).

This is not the place to review the substance of the issue, which has already been debated extensively in this journal. It is encouraging, however, that despite the bitterness of the debate about scientists' behavior, there is considerable consensus on the issue of global warming itself. One of the responsible critics, for example, frankly states that “climate change is a genuine phenomenon, and there is a nontrivial risk of major consequences in the future” (Hayward 2009). But there is no consensus on how high the risk is, how quickly it is likely to materialize, or the costs and benefits of strategies needed to counter it.

The less responsible critics simply dismiss the issue as a hoax and focus exclusively on the peccadilloes of the other side. The climate scientists gave the conspiracy theorists an opening by letting their advocacy color their science, which compromised the legitimacy of their enterprise and, ironically, weakened the political movement itself. This is especially unfortunate because the underlying science is fundamentally correct.

Conspiracy Consequences

Faced with assaults on their professional credibility, scientists may be tempted to retreat from the world of public policy. But allowing the conspiracy theorists to dominate the public debate can have tragic consequences. Fear of science and belief in conspiracies have led British parents to expose their children to life-threatening diseases, the South African health department to reject antiretroviral treatment for AIDS, and the Zambian government to refuse GM food from the United States in the midst of a famine. Fear of science is not new. Benjamin Franklin was afraid to inoculate his family against smallpox and regretted it deeply when a son died of the disease in 1736. Parents are making the same mistake today.

Advocacy groups sometimes find it easier to arouse fears of science than to advocate for other goals that may actually be more fundamental to their concerns. The movement against GM foodstuffs in Europe was mobilized largely by anti-capitalist, anti-corporate, and anti-American activists who found it more effective than attacking corporate capitalism directly (Purdue 2000; Schurman 2004). These ideologies have much less support in North America, and efforts to organize against GM food here were much weaker. North Americans have suffered no significant ill effects from the integration of these foods into their diet, a fact that Greenpeace and other advocacy groups studiously ignore. One suspects that if GM seeds had been invented by a socialist government, these advocacy groups would have heralded them as a great victory in the war against hunger.

Public policy requires reaching consensus to make decisions, even though some uncertainty usually remains. If scientists cannot do this, surely it is too much to expect politicians or journalists to do it. Efforts to define a consensus are vulnerable to attacks by conspiracy theorists who portray a consensus as a mechanism for suppressing dissent and debate. But there will always be dissenters, and at a certain point arguing with them becomes unproductive. In 1870, Alfred Russel Wallace allowed himself to be drawn into an extended conflict with Flat Earth theorist John Hampden, editor of the Truth-Seeker's Oracle and Scriptural Science Review. Their dispute over whether the Earth is round involved measuring the curvature of the water on the Old Bedford Canal in England. There was a public wager, which Wallace won, followed by a lawsuit when Hampden refused to pay, a threat against Wallace's life, and a prison term for Hampden. Hampden and his followers were never convinced the Earth is not flat, and belief in the “round Earth conspiracy” apparently persists to this day (Garwood 2008; O'Neill 2008).

Scientists will never reach a consensus with Flat Earthers or with those who believe the Earth was created in 4004 BCE. Nor do they need to. The best that science can provide is a clearly specified degree of consensus among scientists who base their conclusions on empirical data. Efforts to reach consensus on important questions have been discouraged by the influence of philosophers of science who emphasize conflicting research programs, paradigm shifts, and scientific revolutions (Franklin 2009; Stove 1982). Although these events do occur in the history of science, they are exceptional. Most sciences, most of the time, progress with an orderly, gradual accumulation of knowledge that is recognized and accepted by specialists in the field. Opposition rooted in religious or ideological concerns is acceptable as part of the democratic political process, but it need not prevent scientists from reaching a consensus when one is justified.

Peer Review

The peer review process in scientific journals plays a central role in determining which research findings deserve to be incorporated in the scientific consensus on an issue. As such, this process is a target for conspiracy theorists. Peer reviewers are usually anonymous, which to conspiracy theorists suggests they have something to hide. Although authors' names are usually removed from studies to be reviewed, reviewers are specialists in the same field and can often guess who the authors are. Reviewers are not in a good position to detect actual fraud; they can't redo the experiments or the data analysis. And they may reject papers that go against the conventional wisdom or political consensus in their field (Franklin 2009, 205–11). No adequate alternative to peer review has been proposed, but initiatives to make the review process more transparent may help, including making reviewers' comments and the original data sets available on the Internet.

The credibility of peer review has been undermined in the recent dispute over global warming because the reviewers are drawn from a fairly small pool of specialists who are known to have a policy agenda. The appointment of panels of distinguished scientists to review the body of research in the field is an excellent step toward rebuilding credibility (Broder 2010). The review panels must have full access to all the data sets, as well as the time and expertise to conduct their own analyses if necessary, something that cannot normally be expected of volunteer reviewers for a journal. It is important that these panels give qualified specialists an opportunity to present alternative views, as long as those views are based on scientific analysis of appropriate data and not just polemical criticism. No matter how well they do their work, however, these panels are likely to be attacked by conspiracy theorists.

If the blue-ribbon scientific commissions confirm the original research findings, perhaps with only modest caveats, many people will be convinced. But individuals with strong feelings about the issue may resort to cascade logic, suspecting that the review panel is also part of the conspiracy. Cascade logic can easily develop into a generalized distrust of anything that comes from a mainstream or elite source. In the past, social psychological studies found that this kind of generalized belief in conspiracies was most common among people who were discontented with the established institutions and elite groups in their society, believed that conditions were worsening for people like themselves, and believed that authorities did not care about them (Goertzel 1994; Kramer 1998).

The conspiracy meme can convert a dry scientific issue into a human drama in which malefactors can be exposed and denounced. Scientists are not trained for this kind of debate, and there is no reason to expect them to be especially good at it. If they also have strong feelings about the issues, they may fall into the conspiracy meme themselves. But when scientists succumb to the temptation to “fight fire with fire,” they risk losing their credibility as experts. It may be tempting to exaggerate findings in mass media outlets by using graphics that highlight the most extreme possibilities. This may be effective in the short run, but the public feels deceived when today's newest scare is refuted by tomorrow's press release, and its trust in science is diminished. In today's political climate, scientists need to be careful about releasing their findings on controversial issues; they must make sure the findings have been thoroughly reviewed and that the data sets are available for others to analyze.

Political decisions will inevitably reflect economic interests and emotional concerns that conflict with what scientists believe is best. But scientists can be more effective if they avoid using the conspiracy meme and other rhetorical devices and instead clearly separate their scientific work from their political advocacy as citizens.

References

Avery, Dylan. 2009. Loose Change 9/11: An American Coup. Distributed by Microcinema International. Released September 22.

Broder, John. 2010. Scientists taking steps to defend work on climate. New York Times (March 2). Available online at http://www.nytimes.com/2010/03/03/science/earth/03climate.html.

Brumfiel, Geoff. 2006. Academy affirms hockey-stick graph. Nature 441(7097) (June): 1032–33.

Burgess, David, Margaret Burgess, and Julie Leask. 2006. The MMR vaccination and autism controversy in United Kingdom 1998–2005: Inevitable community outrage or a failure of risk communication? Vaccine 24(18): 3921–28.

Coady, David. 2006. Conspiracy theories and official stories. In Conspiracy Theories: The Philosophical Debate, edited by David Coady. Hampshire, UK: Ashgate.

Dawkins, Richard. 1976. The Selfish Gene. Oxford: Oxford University Press.

Duesberg, Peter. 1995. Infectious AIDS: Have We Been Misled? Berkeley, CA: North Atlantic.

Enserink, Martin. 1999. The Lancet scolded over Pusztai paper. Science 286 (October 22): 656.

Franklin, James. 2009. What Science Knows and How it Knows It. New York: Encounter.

Garwood, Christine. 2008. Flat Earth: History of an Infamous Idea. New York: Thomas Dunne.

“Genetically modified food controversies.” Wikipedia. Accessed March 15, 2010. Available online at http://en.wikipedia.org/wiki/Genetically_modified_food_controversies#cite_note-Enserink1999B-64.

Goertzel, Ted. 1994. Belief in conspiracy theories. Political Psychology 15: 731–42. Available online at http://crab.rutgers.edu/~goertzel/conspire.doc.

---. 1998. Why welfare research fails. Available online at http://crab.rutgers.edu/%7Egoertzel/fail2.html.

Goertzel, Ted, and Benjamin Goertzel. 1995. Linus Pauling: A Life in Science and Medicine. New York: Basic Books.

---. 2008. Capital punishment and homicide rates: Sociological realities and econometric distortions. Critical Sociology 34(2): 239–54.

Gold, Thomas. 1989. The inertia of scientific thought. Speculations in Science and Technology 12(4): 245–53. Available online at www.suppressedscience.net/inertiaofscientificthought.html.

Hayward, Steven. 2009. Scientists behaving badly. The Weekly Standard 15(13) (December 14). Available online at www.weeklystandard.com/Content/Public/Articles/000/000/017/300ubchn.asp.

“Hockey stick controversy.” Wikipedia. Accessed February 27, 2010. Available online at en.wikipedia.org/wiki/Hockey_stick_controversy.

Kaufman, Leslie. 2010. Darwin foes add warming to targets. New York Times (March 3). Available online at www.nytimes.com/2010/03/04/science/earth/04climate.html.

Keeley, Brian. 2006. Nobody expects the Spanish inquisition! More thoughts on conspiracy theory. In Conspiracy Theories: The Philosophical Debate, edited by David Coady. Hampshire, UK: Ashgate.

Kennedy, Robert F. 2010. Central figure in CDC vaccine cover-up absconds with $2m. Huffington Post (March 11). Available online at www.huffingtonpost.com/robert-f-kennedy-jr/central-figure-in-cdc-vac_b_494303.html.

Kramer, Roderick. 1998. Paranoid cognition in social systems. Personality and Social Psychology Review 2(4): 251–75.

Kramer, Roderick, and Dana Gavrieli. 2005. The perception of conspiracy: Leader paranoia as adaptive cognition. In The Psychology of Leadership, edited by D. Messick and R. Kramer, 251–61. Mahwah, NJ: Lawrence Erlbaum Associates.

Lahsen, Myanna. 1999. The detection and attribution of conspiracies: The controversy over Chapter 8. In Paranoias within Reason: A Casebook on Conspiracy as Explanation, edited by G. Marcus. Chicago: University of Chicago Press, 111–36.

Maddox, John. 1993. Has Duesberg a right of reply? Nature 363: 109. Available online at www.virusmyth.com/aids/hiv/jmrightreply.htm.

McConnachie, James, and Robin Tudge. 2008. The Rough Guide to Conspiracy Theories. London: Penguin.

Nussbaum, Bruce. 1990. Good Intentions: How Big Business and the Medical Establishment are Corrupting the Fight Against AIDS. New York: Atlantic Monthly Press.

O'Neill, Brendan. 2008. Do they really think the Earth is flat? BBC News Magazine (August 4). Available online at http://news.bbc.co.uk/2/hi/uk_news/magazine/7540427.stm.

Pigden, Charles. 2006. Popper revisited, or what is wrong with conspiracy theories? In Conspiracy Theories: The Philosophical Debate, edited by David Coady. Hampshire, UK: Ashgate.

Purdue, Derrick. 2000. Anti-GenetiX: The Emergence of the Anti-GM Movement. Surrey, UK: Ashgate.

Revkin, Andrew. 2009. Hacked e-mail data prompts calls for changes in climate research. New York Times (November 27).

Schmidt, Gavin, and Caspar Ammann. 2005. Dummies guide to the latest “hockey stick” controversy. Available online at http://www.realclimate.org/index.php/archives/2005/02/dummies-guide-to-the-latest-hockey-stick-controversy.

Schurman, Rachel. 2004. Fighting “Frankenfoods”: Industry opportunity structures and the efficacy of the anti-biotech movement in western Europe. Social Problems 51(2): 243–68.

Seitz, Frederick. 1996. Major deception on global warming. Wall Street Journal (June 12). Available online at www.sepp.org/Archive/controv/ipcccont/Item05.htm.

Slensky, Michael. 2006. The loose cannon of 9/11. AlterNet.org (August 21). Available online at www.alternet.org/story/40476.

Specter, Michael. 2009. Denialism: How Irrational Thinking Hinders Scientific Progress, Harms the Planet, and Threatens Our Lives. New York: Penguin.

Stove, David. 1982. Popper and After: Four Modern Irrationalists. Oxford: Oxford University Press.

Sunstein, Cass, and Adrian Vermeule. 2008. Conspiracy theories. Available online at papers.ssrn.com/sol3/papers.cfm?abstract_id=1084585.

Wallace, Amy. 2009. An epidemic of fear. Wired 17(10): 128–36.

Zimring, Franklin. 2006. The Great American Crime Decline. New York: Oxford University Press.

Ted Goertzel

Ted Goertzel is a professor of sociology, Sociology Department, Rutgers University, Camden, NJ 08102. E-mail: goertzel@camden.rutgers.edu.