Science Con-Artists

Author: Allchin, Douglas

Publication: The American Biology Teacher, 74(9), Nov-Dec 2012

Deception abounds in nature. Some species are first-rate con-artists. Anglerfish with fins that mimic squiggling morsels to lure unsuspecting prey. Carnivorous pitcher plants that emit the aroma of rotting flesh and attract flies to their doom. Orchids that resemble female wasps, decoys for male wasp pollinators. Cuttlefish whose color and pattern morph with the substrate as a disguise against predators. Deceptive patterns, smells, or sounds in organisms wonderfully reflect the adaptive response to opportunity.

So, too, in human culture? Humans can take advantage of cultural conditions, deceiving others to promote their own interests. So, if science receives cultural authority, it should surprise no one that those seeking power or profit might try to mimic it. Indeed, the more authority we give to science, the greater the likelihood of science imposters--and the more sophisticated their deceptive tactics. Cultural anthropologist Chris Toumey likens the process to a magician's illusions. Imitators "conjure" science, he says, "from cheap symbols and ersatz images" (1997, p. 6). It is an apt and vivid label. We could just as easily call them science con-artists. Liars. Cheats. Seeking our confidence using a semblance of science. Predictable opportunists, perhaps.

Science educators generally want to inform students so that, as citizens and consumers of science, they do not succumb to such wiles. The posture--too easily taken for granted, I think--is that simple knowledge of the scientific method, or of how to evaluate scientific evidence, will suffice. By learning what defines science and what defines pseudoscience, students supposedly learn to debunk the charades. Here, I invite you to question this sacred bovine. A sampling of recent historical cases will indicate, I hope, that the science con-artists in modern society pose a more significant threat than commonly assumed. Science and what counts as science can diverge sharply (Sacred Bovines, April 2012). Student skills in sorting experts from non-experts may help (Sacred Bovines, May 2012). But con-artists use various psychological stratagems to gain trust without expertise. The informed citizen and consumer needs to understand them. Call it an escalation in the evolutionary response of prey to predator. Ultimately, appreciating these practical challenges may highlight deficits in current curricula and prompt us to include more lessons in science communication, in addition to those for understanding the nature of science itself.

Consider, for example, the case of German entrepreneur Matthias Rath (Goldacre, 2010, pp. 131-146). Rath's business was selling vitamin pills. In the United Kingdom, he promoted them as a cure for cancer, running newspaper ads criticizing chemotherapy and other treatments. Then he went to a more welcoming environment in South Africa. In 1999, the government there officially denied that HIV caused AIDS and denounced antiretroviral drugs as harmful. Rath ran ads there, too, this time promoting vitamins as a cure for AIDS. The ads described genuine research showing that vitamins mildly benefited HIV-positive patients. But then they grossly overstated the conclusions, claiming also that other treatments were ineffective and that vitamins could remedy AIDS outright. Rath also paid for ads in the New York Times and the Herald Tribune, which he later cited as favorable news coverage. Credible voices criticized Rath. But his campaign survived them for almost a decade. It is hard to know how many thousands of people died or suffered as a result. We can only hope to learn how he was able to publicly trump scientific consensus, so that we might counteract future such deceptions.

Similar problems plague public understanding of global warming and climate change. In 1989, the George Marshall Institute, a conservative think tank, began to muddy public perceptions of the emerging consensus reported by the newly formed Intergovernmental Panel on Climate Change. Its authors cherry-picked evidence and presented their findings outside the scientific literature, yet their pronouncements were treated as sound science (Oreskes & Conway, 2010). Nor should anyone be surprised that oil giant ExxonMobil distributed over $16 million in the early 2000s to 40 different organizations that challenged global warming (Mooney, 2005; Union of Concerned Scientists, 2007). One begins to see global warming "dissent" as a well-financed advocacy campaign, unrelated to the science or to any genuine uncertainty in the evidence.

Not everyone who claims to be a scientist is a qualified expert. Not everyone who presents "scientific" evidence is honest about that evidence. How, then, does the consumer-citizen separate the wheat from the chaff? As these two initial cases indicate, the first challenge is knowing a source's motives (Goldman, 2001; McGarity & Wagner, 2008; Michaels, 2008; Oreskes & Conway, 2010). What are someone's interests in advancing a claim? Will a lie profit them financially? Will it win them more political power? That is, students need to understand first how science communication may be shaped by persuasive interests.

The concept of social deception is not foreign to most people. We all have fairly good "BS alarms," I think, in familiar social settings. And we know to be on guard when the speaker is suspect. Much hinges on that initial judgment of trust.

Of course, science con-artists know this. They are thus ready to conceal their interests. They will hide contexts, such as sources of funding or political affiliations, that may threaten their credibility. Indeed, they will aim to actively dampen the sensitivity of our skeptical antennae, and our corresponding debunking abilities. But forewarned is forearmed. So one can be prepared. Several common techniques, described below, are especially worth knowing.

* Tactic 1: Style

The first aim of confidence artists is, of course, to develop confidence. They do this in part by embodying a confident aura. They look the part. They're smooth talkers. You feel comfortable. This judgment is made immediately and emotionally, without any conscious intent. Indeed, it requires effort to monitor this first impression and duly check the presenter's credentials. Charisma, smiles, deep assured voices, colorful prose, snappy sound bites, and "glittering generalities": all may set us at ease and prime us to extend trust. Andrew Wakefield, the doctor whose flawed research ignited the recent vaccination-autism scare, just comes across as such a nice guy. The ideally informed consumer acknowledges the invisible power of these psychological pitfalls, and remembers to methodically "cross-examine" such emotions later (Rampton & Stauber, 2001, pp. 291-294; Freedman, 2010; Kahneman, 2011).

Style comes in different forms. One subtle feature can be the "professional" quality of publications and media presentations. For example, the enduring battle between biologists and creationists entered a new phase in 2000 with the publication of the book Icons of Evolution. The author's education had been funded by the neo-conservative Discovery Institute, home to the political campaign known as "Intelligent Design." With a higher degree in biology, he at least presented the superficial semblance of a scientific credential. Far more important, the volume was slick. Excellent production values. It looked like a professional academic book. And people do, alas, judge books by their covers. So readers could assume that the content must be equally credible. But it was just plain old creationist rhetoric, with all the usual complaints, innuendo, and omissions. Then came The Atlas of Creation (Yahya, 2006). Filled with gorgeous large-format glossy photos of fossils. Fine color printing on high-quality paper to enhance the vivid colors and clarity. Designed to impress. And it did. But it was also filled with creationist tripe. Just like the Creation Museum in Petersburg, Kentucky, which strives to look like a professional natural history museum. Only with creationist exhibit captions. These projects presaged the 2009 film Darwin: The Voyage That Shook the World. It had all the appearances of a Public Broadcasting System or History Channel documentary. Yet the prominent historians interviewed for it were deceived, and their views were not presented honestly in the final edited segments. Creationists were falsely presented as scientific experts, filmed in the same style. But who would have the time or resources to check all that? It looks good, so one assumes that the filmmakers must have been professional in their research, too. They were not. The film was a Creation Ministries International scam, borrowing on Darwin's fame to try to erode it. Now there is a new series of videos flogging "Intelligent Design" from Illustra Media: The Privileged Planet, Metamorphosis, and Darwin's Dilemma. More subterfuge from the well-financed anti-evolution Discovery Institute.

The same applies now to many websites. For instance, EnergyAnswered.org describes itself as "intended to promote fact-based discussion about energy" (American Petroleum Institute, 2012, "About"). It looks very professional. Well organized. Clean graphics. And maps. And video clips. No confusing "ads" in the headers or margins. Yet it is funded by the petroleum industry. A bit of careful review will reveal its selective bias. Likewise for co2science.com, a front for industry propaganda on global warming. And "CleanAirProgress.org", funded by the petroleum and trucking industries, which has since closed down after being exposed as a front group (Rampton & Stauber, 2001; Center for Media and Democracy, 2012). Commercial interests permeate these websites and others. But their persuasive power requires that the visitor not know or suspect this. More well-funded disinformation--here, from anti-environmental science con-artists.

Students are generally already well aware of the powerful social role of style. They just tend to see it function more through jeans, sunglasses, hairstyles, brand-name fashions, cell-phone apps, etc. They often have no stake (yet) in science communication. Still, their everyday experience offers a fruitful platform for analogy. Style encodes persuasive psychological messages, particularly about who is "in" and can be trusted.

* Tactic 2: Disguise

Because most people understand, at least informally, the dangers of biased messages, science con-artists try to hide their interests or associations. Validation in science has typically been marked by publication in a peer-reviewed journal. So that is their aim. But every symbol of science, it seems, can be corrupted. So industries have fashioned ways to publish their views--without exposing themselves to the very scrutiny that makes such publication meaningful.

Some industry associations create their own journals. They have impressive names, such as Regulatory Toxicology and Pharmacology, Science Fortnightly, Journal of Physicians and Surgeons, Indoor and Built Environment, and Tobacco and Health. The list goes on (Michaels, 2008, pp. 53-55; Oreskes & Conway, 2010, pp. 244-245). But these ersatz journals lack rigorous peer review. They provide only the appearance of scientific rigor.

At other times, industries seek "credible" publications through credible authors. Having completed a biased study themselves, they enlist--for "due" compensation, of course--a medical researcher or university academic to serve as the author. It's called ghostwriting. It's like plagiarism, in reverse. And it is an industry unto itself now. You can hire a ghostwriting company to serve your needs (McGarity & Wagner, 2008, pp. 76-79; Rampton & Stauber, 2001, pp. 200-201). Many journals are responding by requiring authors to disclose conflicts of interest. But authors can lie, and there is little way to enforce honesty. According to a 2003 study, perhaps only 1 in 15 medical researchers disclosed potential conflicts of interest (Freedman, 2010, p. 66). So: neither publication itself, nor the credentials of the lead author, by themselves, can guarantee trustworthy science. Nor are conflicts of interest typically reported in the press (Cook et al., 2007). The savvy consumer must always mindfully monitor the potential for conflict of interest.

One of the greatest ironies in recent science "con-artistry" is the emergence of individuals who purport to debunk "junk science" even as they promote commercial and political interests. They pretend to champion good science. Here, their primary goal is typically not to gain scientific status for some ill-founded claim, but to erode confidence in sound science. They challenge findings about harmful chemicals or environmental dangers. All under the rubric of defending scientific rigor. Since no proof is absolute, it's easy to find and exaggerate holes in any study.

Steve Milloy's Junk Science Judo (2001), for example, has all the trappings of fun lessons for high school teachers to use in the classroom. But his targets are selective, reflecting an anti-regulation agenda (embodied in the ultra-libertarian Cato Institute, where he works). He defends DDT and junk food in schools, and tries to discredit the EPA and climate change research. All in the name of "good" science--but note: not balanced or fully informed. It is a fascinating, albeit disturbing, masquerade.

The authors of It Ain't Necessarily So also pretend to embody a classic skeptical attitude, emblematic of science. But they target only studies that support prudent caution on workplace safety or environmental impact. "It's wise to be somewhat skeptical," they say. But note what follows: "both about fairy tales and about risk narratives" (Murray et al., 2001, p. 131). Hm, why just risk narratives? Irony upon ironies, they deny the significance of conflict of interest:

   It makes much more sense to look at what the researcher's methodology is, not where the money is coming from. The message, not the messenger, is what demands analysis. (p. 159)

It certainly sounds simple and appealing. Ideally, yes, conflicts of interest and science cons would not exist. Perhaps these authors fear that you will learn their own political affiliations. They certainly do not bother to examine how selectively they apply their own principles for exposing "the failings of journalism, the perversions of policy, and the weaknesses of science" (p. 193).

Knowing who is an expert and who is a bogus "expert" with a conflict of interest matters very much indeed (Sacred Bovines, May 2012). Disguise is a form of lying. Where trustworthy information is important, dishonesty always matters.

* Tactic 3: Exploiting Social Emotions

The manipulative techniques of advertising and public relations are well known. Many people are thus familiar with the role of emotions in persuasion. Products (or ideas) can be associated with pleasant experiences or paired with clandestine sexual imagery. But some of the most dramatic cases of public dismissal of scientific consensus involve another set of emotions: sympathy, social cohesion, and loyalty to the "in-group." Here, judgments of trust and credibility are shaped by social relations and emotions related to a sense of "belonging."

For example, Rath's campaign against AIDS treatments in South Africa (above) was surely more effective due to lingering anti-colonialist sentiments. Modern anti-HIV drugs seemed to represent the efforts of outsiders to dominate (or destroy). The health minister, Dr. Tshabalala-Msimang, could easily convince others that local customs in nutrition, embedded in African culture and history, could effectively combat an "alien" disease (Goldacre, 2010). Somehow, the science had been upstaged by social identity and cohesion.

Similarly, opposition to the fluoridation of public water in the 1950s and '60s was shaped less by the scientific evidence than by fears of government intrusion. Personal autonomy (or, in the midst of the Cold War, fear of totalitarianism) seemed to take priority over any health effects of the fluoride. People rallied together to defend themselves. Identity with that group often seemed to dictate which scientific evidence individuals would subsequently choose to trust (Martin, 1984; Toumey, 1997, pp. 63-80). Social alliance was the basis for judging scientific reports.

In 1986, charismatic leader Lyndon LaRouche managed to tap into fears of AIDS and persuade many people that it was contagious through proximity and casual contact. Uniting people xenophobically under a perceived shared threat (disease, homosexuality), he persuaded over 2 million Californians to vote for mandatory HIV testing and quarantine. Again, the social dimension trumped the trustworthy science (Toumey, 1997, pp. 81-95).

More recently, we have seen concern about autism misdirected at vaccines as a probable cause. Only one, quite insubstantial, scientific study ever supported this claim, and it has since been retracted. Yet the anxiety of a consolidated core group of parents triggered widespread cultural distrust of the measles vaccine (and others). Vaccination rates in Britain dropped so low that public health officials worried about a significant measles epidemic. One 13-year-old boy died--the first death there from measles in 14 years. Even now, shared antivaccine sentiment unites some social networks so strongly that anyone who presents contrary scientific information is thereby prejudged as a likely apologist for the pharmaceutical industry. Again, social connections have proved a basis for trust in science.

Biology teachers may well appreciate the neurohormonal dimension of this behavior. Recent research has shown, for example, that oxytocin released from the hypothalamus promotes empathy, trust, and generosity (Miller, 2010). At least within a group. At the very same time, it also promotes out-group antagonism and distrust (De Dreu et al., 2010). Trust and in-group sociality seem to be closely related neurophysiologically.

Science con-artists can exploit and benefit from these sociopsychological tendencies. First, they often present themselves as "just plain folk"--a strategy to become accepted socially as one of the group. The sense of a shared group can be further generated or amplified by eliciting external fears, name-calling, or rhetorical venom (Rampton & Stauber, 2001, pp. 251, 291-294). Alleged conspiracies tend to evoke social consolidation and, with it, trust. A cautious, deliberative response by the consumer-citizen, on the other hand, will be met with the con-artist's insistence on the urgency of the situation. There will be talk of government cover-ups and efforts to suppress the "real" scientific evidence. The allegations effectively divert attention from the scientific literature or the discourse of experts.

The social dimension of trust, then, proves relevant in the cultural transfer of scientific information. It reminds us, perhaps, that chains of trust are just as important as skepticism. One needs to know not merely how to doubt, but precisely where to place trust--or in whom.

* Tactic 4: Conjuring Doubt

One especially notable con-artist strategy has been adopted with increasing frequency in the past several decades: conjuring public doubt amid scientific consensus (McGarity & Wagner, 2008, pp. 146-149; Michaels, 2008; Oreskes & Conway, 2010). This tactic has proved quite effective for mitigating or delaying policy or regulatory action where scientists have documented harmful practices or potential harms. The con exploits the popular notion that science is, or should be, certain. Thus, if there is any notable dissent whatsoever, the science will seem uncertain. Until science is certain, or "truly scientific," so the rhetoric goes, we should not rush to judgment. A cautious person waits "prudently" for all the evidence, "just in case." It sounds so reasonable. Unless one knows that the science is, rather, relatively definitive--and someone is profiting while others are knowingly harmed.

The con tactic, then, is to foster a public image of uncertainty, even where most experts agree or the preponderance of evidence is clear. Psychologically, it seems, "an ounce of uncertainty is worth a pound of doubt." Doubt, in turn, can further be reframed rhetorically as "probably wrong." This tactic was pioneered in the late 1960s as the tobacco industry fostered doubt about research on the adverse health effects of smoking. One 1969 industry document referred explicitly to the campaign's intent:

Doubt is our product since it is the best means of competing with the "body of fact" that exists in the minds of the general public. It is also the means of establishing a controversy. (Michaels, 2008, pp. x, 11)

Since that time, the strategy has been deployed repeatedly, in many other cases (Table 1).

In all these cases, researchers had determined a measurable harm. At the same time, industries tried to persuade others that the evidence was not definitive. Sometimes they used public relations firms to help shape what counted as science in the public realm: for example, Hill & Knowlton, Exponent, Inc., the Weinberg Group, or ChemRisk. I sometimes puzzle over the people who work on these projects: what science were they taught in school? In what context?

Conveying to the general public the expert consensus on human-caused global warming and climate change is, of course, another major case, one that still seems to haunt us, at least judging from official statements by some major political candidates this year. It is a prime example of the ongoing significance of conjuring doubt.

How do the con-artists conjure doubt? They may question whether animal models are representative of human harm. One can always question this, whether justified or not. They challenge, too, the representativeness of human samples. They may flat-out challenge data as unreliable. They may enlist statisticians to reanalyze published data using modified parameters, in order to reduce the statistical significance. They often find and highlight single exceptions, discounting the overall balance of evidence. They emphasize extraneous causes and possible confounders--vague sources of error that can be imagined as possible without having to document their actual relevance. If the aim is an image of uncertainty, one does not need to win an argument. Or even justify it fully. One just needs to provoke sufficient skepticism.

In general, the conjurers of doubt try to prompt others to second-guess the experts. They portray flaws in the consensus as so simple and obvious that even unschooled non-scientists could easily detect them. Con-artists capitalize on an individual's sense of autonomy, that they can evaluate all the evidence on their own. If an ordinary person can understand a shred of counter-evidence, and the experts have not heeded it, then the experts must apparently not be so expert. This is how one begins, as David Michaels noted, to obscure sound science and replace it with what merely "sounds like science" (Michaels, 2008, pp. ix, xi).

* Tactic 5: Flooding the Media

When all else fails, one can merely generate a public impression of science through "advertising" (Rampton & Stauber, 2001, p. 294; McGarity & Wagner, 2008, pp. 204-228). The information need not be complete. Or responsible. For example, prominent global-warming critic Fred Singer arranged to co-author a paper with a climate-change expert, a rather reluctant Roger Revelle. Some key phrases misrepresented Revelle's views, and the paper appeared after he died. As one measure of Singer's intent, it appeared in a "showcase" journal of an elite social club in Washington, D.C. The misleading claims then reappeared in numerous conservative editorials and public remarks (and even in a nationally televised vice-presidential debate), all positioned to diminish the impact of Revelle's cautionary claims elsewhere (Oreskes & Conway, 2010, pp. 190-197). Science does not have its own centralized institutional voice. One thus needs to be concerned about who presents the scientific "message," where, and how.

Psychologist Daniel Kahneman (2011) notes that our minds have certain blind spots. One is that we typically base our judgments on what we have heard and seen, with little regard for possibly relevant information we have not yet encountered. "What you see is all there is," he says. Accordingly, we tend to endorse whatever is familiar, whether fully informed or not. Without active reflective analysis, we may easily believe the preponderance of public claims.

Those who can underwrite "public awareness" can thereby influence what counts as science in the public realm. It may be through mass mailings. Or booths at state fairs. Or radio talk shows. Or websites and blogs and tweets. Or comments by political candidates. World Climate Review, a contrarian newsletter funded by the fossil fuel industry, is distributed free to members of the Society of Environmental Journalists (Oreskes & Conway, 2010, p. 203). The Heartland Institute, a well-financed advocacy center, sponsors a network of individuals to write letters to the editors of major U.S. newspapers, challenging the science that does not match their conservative agenda (http://heartland.org/publications). More ghostwriting. There is no formal system of accountability or checks and balances in public science communication, as there is within science. So Fred Singer could make further misrepresentations about the 1995 IPCC report in an opinion piece in the Wall Street Journal. The editors were entitled to disregard most of the letters from outraged scientists. And others were free to (and did) cite Singer's essay (Oreskes & Conway, 2010, pp. 208-211). For many citizens, what you read is all there is. Flooding the media is just another science-con tactic.

* Counteradaptation

These five tactics for advocacy and fostering trust (and one might surely enumerate more) are hardly limited to science communication. But the context of science is special. By eclipsing the relevance of credible evidence, these various forms of persuasion can misinform public discourse and decision making.

Dealing with the tactics falls outside the standard textbook "scientific method." They thus may seem irrelevant to science proper. But knowing about them seems essential for anyone trying to assess scientific claims in a practical, cultural setting. The average person cannot find and judge every bit of evidence on their own. One depends on others with expertise. But that leaves one vulnerable to deception. The best protection is to know how to diagnose, detect, and thus also deflect the deception. And then find some real experts (Sacred Bovines, May 2012).

Some people worry about fraud or dishonesty in science. I worry instead about fraud, deception, and misplaced trust in science communication, beyond the scientific community. As documented above, efforts by non-scientists to mislead others about scientific consensus are widespread and have concrete consequences for the environment and human health.

Some people worry about a conflict between science and religion. I worry instead about the conflicts between science and power and between science and the blind drive for profit. These are the forces in modern society most likely to corrupt good science and science communication.

Some people worry about pseudoscience and ill-informed views about the nature of scientific evidence. I worry instead about how what generally counts as science in our culture diverges from the actual science. And about the wide range of tactics for shaping science communication and fostering trust in hollow "scientific" claims.

In short, it seems, our modern cultural condition warrants substantially more attention to science con-artists. And more reflection by science teachers on the corresponding educational challenges. Recall again the claim by conservative science "critics" Murray et al. (2001, p. 159): "It makes much more sense to look at what the researcher's methodology is, not where the money is coming from. The message, not the messenger, is what demands analysis." That is precisely what the science con-artists hope everyone will believe. Messengers can deceive. They can be confidence artists. One needs to assess the messenger before heeding any message. Will science teachers help thwart their deceptions by guiding students in developing appropriate counter-tactical skills?

References

Allchin, D. (2012a). What counts as science. American Biology Teacher, 74, 291-294.

Allchin, D. (2012b). Skepticism and the architecture of trust. American Biology Teacher, 74, 358-362.

American Petroleum Institute. (2012). EnergyAnswered. [Online.] Available at http://energyanswered.org.

Center for Media & Democracy. (2012). Foundation for Clean Air Progress. [Online.] Available at http://www.sourcewatch.org/index.php?title=Foundation_for_Clean_Air_Progress.

Cook, D.M., Boyd, E.A., Grossmann, C. & Bero, L.A. (2007). Reporting science and conflicts of interest in the lay press. PLoS ONE, 2(12), e1266.

De Dreu, C.K.W., Greer, L.L., Handgraaf, M.J.J., Shalvi, S., Van Kleef, G.A., Baas, M. & others. (2010). The neuropeptide oxytocin regulates parochial altruism in intergroup conflict among humans. Science, 328, 1408-1411.

Freedman, D.H. (2010). Wrong: Why Experts* Keep Failing Us--And How to Know When Not to Trust Them. New York, NY: Little, Brown.

Goldacre, B. (2010). Bad Science: Quacks, Hacks, and Big Pharma Flacks. New York, NY: Faber and Faber.

Goldman, A.I. (2001). Experts: which ones should you trust? Philosophy and Phenomenological Research, 63, 85-110.

Kahneman, D. (2011). Thinking, Fast and Slow. New York, NY: Farrar, Straus & Giroux.

McGarity, T.O. & Wagner, W.E. (2008). Bending Science: How Special Interests Corrupt Public Health Research. Cambridge, MA: Harvard University Press.

Michaels, D. (2008). Doubt Is Their Product: How Industry's Assault on Science Threatens Your Health. Oxford, U.K.: Oxford University Press.

Miller, G. (2010). The prickly side of oxytocin. Science, 328, 1343.

Milloy, S.J. (2001). Junk Science Judo: Self-Defense against Health Scares and Scams. Washington, D.C.: Cato Institute.

Mooney, C. (2005). Some like it hot. Mother Jones, 30(3), 36-94.

Murray, D., Schwartz, J. & Lichter, S.R. (2001). It Ain't Necessarily So: How Media Make and Unmake the Scientific Picture of Reality. Lanham, MD: Rowman & Littlefield.

Oreskes, N. & Conway, E.M. (2010). Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. New York, NY: Bloomsbury Press.

Rampton, S. & Stauber, J. (2001). Trust Us, We're Experts: How Industry Manipulates Science and Gambles with Your Future. New York, NY: Tarcher/Putnam.

Toumey, C.P. (1997). Conjuring Science. New Brunswick, NJ: Rutgers University Press.

Union of Concerned Scientists. (2007). Smoke, Mirrors & Hot Air: How ExxonMobil Uses Big Tobacco's Tactics to Manufacture Uncertainty on Climate Science. Cambridge, MA: Union of Concerned Scientists.

Yahya, H. [Oktar, A.] (2006). The Atlas of Creation, Vol. 1. Istanbul: Global Publishing. Also available online at http://harunyahya.com/ajax/downloadLinks/work/4066.

DOUGLAS ALLCHIN, DEPARTMENT EDITOR

DOUGLAS ALLCHIN has taught both high school and college biology and now teaches history and philosophy of science at the University of Minnesota, Minneapolis, MN 55455; e-mail: allchin@sacredbovines.net. He is a Fellow at the Minnesota Center for the Philosophy of Science and edits the SHIPS Resource Center (ships.umn.edu). He hikes, photographs lichen, and enjoys tea.


Table 1. Some cases of conjuring doubt about scientific consensus (McGarity & Wagner, 2008; Michaels, 2008; Oreskes & Conway, 2010).

Second-hand smoke
Acid rain
Chlorinated fluorocarbons (CFCs) and the ozone layer
DDT use outside the U.S.
Formaldehyde
Hexavalent chromium
Vinyl chloride
Lead
Ephedra
Global warming and climate change