First among errors: if we could teach only one science lesson, what would it be?
Author: Allchin, Douglas
Pub Date: August 2010
Publication: The American Biology Teacher, Vol. 72, Issue 6. National Association of Biology Teachers. ISSN: 0002-7685.
Imagine you are stranded on a deserted isle and you can take only one science lesson with you: what would it be? It's a variant of a familiar game. Pointlessly unrealistic, of course. No matter. As with many thought experiments, the purpose is more deeply philosophical. Namely, the question invites reflection--not about favorite books or music or interesting people--but about what, ultimately, is the most important element in science education.

Yes, really. Take a moment to reflect.

OK: evolution, hands down. That would be the answer--if what mattered was content. As Dobzhansky famously put it, "Nothing in biology makes sense except in the light of evolution."

Yet others will surely contend that the core of science is not the content, but rather the process. "Give a student a concept, and they can learn for a day; teach a student how to investigate, and they can learn for a lifetime." Teaching the process of science seems so much more fundamental and enabling.

What a potent conclusion. Imagine what this priority would imply about state-wide multiple-choice exams! What havoc! Yet I would wager that most science teachers would feel quite liberated if teaching the process of science were their primary charge from the public. One could stop rushing through the textbook and cramming lectures with facts that students could find equally well on the Internet, given a bit of savvy "how-to" and the critical thinking so fundamental to effective research itself. One could focus on scientists themselves, their compelling stories, the route to discovery, the celebration of creative insight, the processes of reasoning: that is a science lesson that is both satisfyingly human and concretely useful.

So: process of science? Hardly an original answer, but surely provocative enough to start us pondering why this is not more central or dominant in state standards or the tests derived from them. Perhaps teachers and educational researchers need to reflect more thoroughly on how one demonstrates this form of understanding, so that it is not so easily shunted to the periphery when administrators and political demagogues scream "Accountability."

But with only a single lesson, one should choose carefully. Ultimately--call me an optimist, perhaps--I have faith that if reliable information is important, someone will seek it. Eventually, they will learn how to sort the reliable information from the rubbish. If they care. That is, they will figure out all the scientific methods that have emerged from centuries of meta-scientific learning: the role of empirical evidence, the virtue of accurate measurements, the need for controlled experiments, double-blind studies, statistical analysis of error, honest reporting, et cetera. Science will be able to reassemble itself on a deserted isle, if knowledge is important at all.

That might mean that the primary lesson should be an appreciation of science, respect for truth, and enthusiasm for seeking knowledge: more affective than cognitive. Indeed, I regard this goal as high among many teachers' reasons for teaching--although reward may be scarce for acknowledging so publicly. Parents, however, often seem mindful of the value of this lesson; it is possibly the occasion for the most frequent unsolicited (and welcome) expressions of gratitude that teachers receive. Honoring this third option as the number one science lesson may be as revolutionary as the previous ones. Imagine the core of science being more about emotion than reason or intellect. Wow, that would step on some sacred bovine toes.

My own candidate for "Most Important Science Lesson" (MISL), however, departs from all these fine proposals. Foremost, it shifts focus from the process of science one layer deeper, to the "nature of science": that sometimes vague set of concepts about science and how science works--or, in this case, how science does not work.

The nature of science was (re)established as an important benchmark in science education by several major reform documents in the 1990s, from the National Research Council (1996) and the American Association for the Advancement of Science (1993) to BSCS (1993). But declaring its importance did not mean that characterizing it was easy, nor that we knew well how to teach it, let alone assess student understanding of it. Accordingly, the recommendations are still winding their way through the system, surfacing in many state standards but leaving many, including classroom teachers, uncertain about how best to proceed.

What do students need to know about the nature of science? Characterizations of the nature of science have varied over the past century, leaving one to wonder whether it is all subject to cultural whims and shifting popular attitudes about science. Yet one element has persisted as central throughout, typically expressed as "the tentativeness of science" (Lederman et al., 1998; Osborne et al., 2003). Namely, scientists can be wrong. Even Nobel Prize winners. Yes, even Darwin (Allchin, 2008, 2009).

Declaring that "science is tentative" alone, however, is vacuous. Critics of evolution, for example, frivolously dismiss the robust evidence and denounce Darwinians as "dogmatic" (Allchin, 2001) while appealing to a principle of tentativeness. Likewise, naysayers believe that tentativeness justifies denying global climate change: purportedly an unreliable overstatement of uncertain data amplified by uncertain models (Oreskes & Conway, 2010). Such cases indicate that merely asserting that "science is tentative" does not help. One needs to understand how or why science, or scientists--or any individual, for that matter--can err in thinking.

The MISL I propose, then? Recognizing how we can each err in our thinking and, more importantly, developing skills to counterbalance or remedy the tendency to err. In scientific practice, this is embodied in a habit of searching for, and addressing, sources of error.

The potential sources of error in science are many. Some are experimental: an uncalibrated instrument, a missed control, inadequate sample size. Some are observational: when our senses play tricks on us, or when our expectations bias our perceptions. Some errors occur in reasoning: jumping to a conclusion before sufficient evidence warrants it, or interpreting correlation as causation. Some are social: the reputation of a famous researcher overshadowing counterclaims, or a fraudulent study undetected in peer review. Some are cultural: gender or race shaping how one interprets what are ultimately indefinite results, or sources of funding supporting some research that eclipses work on alternative theories. The methods of science are, in many ways, our hard-won historical heritage of ways to prevent, mitigate, or accommodate such errors.

Yet among all possible errors, one seems more severe--and diabolical--than the rest. The error is cognitive. That is, it seems to reflect how our brains work, unmonitored. The error is widely documented by psychologists, who generally frame it as one of our basic cognitive flaws, fundamental to a wide range of other cognitive missteps. The MISL error is this: adopting the first available idea, which then subtly governs subsequent perceptions, analyses, and conclusions. This error is typically called 'confirmation bias', sometimes also 'the availability error', 'the primacy effect', 'belief persistence', 'positivity bias', or the 'congruence heuristic' (Gilovich, 1991; Sutherland, 1992; Nickerson, 1998). First impressions matter immensely.

The error's effect is far-reaching. Prior beliefs and information are powerful filters. For example, using earlier mental patterns, we select or highlight confirming examples, and discount or peripheralize counterexamples. The very evidence we collect in an effort to be objective may be inherently biased. Also, we tend to draw conclusions before all the relevant evidence is available. Indeed, we will not even notice that the evidence is incomplete. Typically, we entertain or pursue only one hypothesis at a time, shuttering off awareness of alternative interpretations of the same information. In all these ways we tend to mislead ourselves--and all these lapses appear in the history of science.

None of this is conscious, of course. The whole process is quite insidious. The functioning brain hides one of its greatest weaknesses. It is a cognitive blindspot. We may not be thinking straight, even when we think we are. Individual scientists, too. As champion skeptic Michael Shermer (2002) notes, smart people, in particular, are very good at rationalizing their beliefs--ironically, more effectively than others--and so their ill-informed beliefs can become exceptionally well entrenched (pp. 296-302). One is unlikely to discover this handicap on one's own, and for this very reason the error is a prime lesson for science education.

Philosopher Karl Popper did not seem to have confirmation bias in mind when he profiled a role for falsification in science, yet his ideas resonate with the problem. Confirmation, by itself, is susceptible to error, both logically and psychologically. We need to search mindfully for exceptions and counterevidence: potentially "falsifying" examples. Popper thus advocated severe tests: rigorously designed to help expose one's own errors, if they were present (Mayo, 1996). That was how to achieve confidence in scientific findings. Engaging criticism and alternatives is essential--and requires deliberate action.

The negative effect of prior beliefs permeates all types of thinking. Consider one psychologist's assessment in introducing a comprehensive review of the research literature:

If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. Many have written about this bias, and it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations. (Nickerson, 1998, p. 175)

Wow. MISL, indeed.

So, how does one cope with this awesome cognitive challenge? First--and this is the foremost reason for placing it squarely at the heart of a biology curriculum--one needs to be aware of how one's own brain works and of its potential for mistakes. Even at the very point where we think the evidence is most secure, we may be mistaken. Too often, imagined justification is merely superficial rationalization. In addition, we tend to attribute bias to others, not ourselves. We need to instill a deep appreciation of the fallibility of our minds. Our minds--not other people's minds. That opens the way to critical self-analysis and self-regulation.

Second, one needs to "test" conclusions not against the evidence alone, but against the evidence presented by others. Alternative perspectives matter. Sound conclusions may involve some hefty listening. (And that, in turn, may involve tolerance, valuing heterogeneous perspectives, and even a habit of seeking contrasting views.) Responsible claims include engaging critics. "Critical thinking" relies less on being "critical" than on listening well to criticism. Yes, error can be exposed and weeded out: most likely socially, through a system of checks and balances. Scientific knowledge edges forward.

Teaching about error may seem counterintuitive. Isn't a central goal of most education to teach how to think well, how to analyze and trust evidence? Why waste precious time dwelling on the opposite? But imagine practicing medicine without understanding disease, or enforcing law without understanding crime. This is the sacred bovine: the unquestioned faith that we can learn to think well... well, without thinking. We assume that our brains function normally without error. And that science is thus inherently and spontaneously self-correcting.

Becoming aware of unconscious cognitive biases seems essential to effective, fully informed analytical thinking. Indeed, learning how preconceptions shape all our thinking seems a critical tool for leveraging effective learning of everything else. Still, the tool is worthless if you do not know about it or know how to use it.

To my mind, every prospective thinker deserves a user's guide: Your Brain & How to Use It. Of course, every owner's manual needs a section on troubleshooting. That's where the lessons on error fit. Confirmation bias seems to merit a whole chapter of its own. Fixing mistakes takes work. Only through methods-beyond-the-methods can science effectively correct itself.

Ultimately, the simple MISL game helps underscore the poverty of current content-based mass examinations. It may also help invigorate efforts to displace them with concrete skills in "personal and social decision making" that involves science. Learning to think is a valuable first step. But it is not enough. (Do the math?) Students also need to learn how to "think twice."

DOI: 10.1525/abt.2010.72.6.17

References

Allchin, D. (2001). The emperor's old clothes. American Biology Teacher, 63, 635-636.

Allchin, D. (2008). Nobel ideals & noble errors. American Biology Teacher, 70, 389-392.

Allchin, D. (2009). Celebrating Darwin's errors. American Biology Teacher, 71, 116-119.

American Association for the Advancement of Science. (1993). Benchmarks for Scientific Literacy. New York, NY: Oxford University Press.

BSCS. (1993). Developing Biological Literacy. Dubuque, IA: Kendall Hunt.

Gilovich, T. (1991). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. New York, NY: Free Press.

Lederman, N.G., Wade, P.D. & Bell, R.L. (1998). Assessing the nature of science: what is the nature of our assessments? Science & Education, 7, 595-615.

Mayo, D. (1996). Error and the Growth of Experimental Knowledge. Chicago, IL: University of Chicago Press.

National Research Council. (1996). National Science Education Standards. Washington, DC: National Academy Press.

Nickerson, R.S. (1998). Confirmation bias: a ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175-220.

Oreskes, N. & Conway, E.M. (2010). Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. New York, NY: Bloomsbury.

Osborne, J., Collins, S., Ratcliffe, M., Millar, R. & Duschl, R. (2003). What "ideas-about-science" should be taught in school science? A Delphi study of the expert community. Journal of Research in Science Teaching, 40, 692-720.

Shermer, M. (2002). Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time, 2nd Ed. New York, NY: W.H. Freeman/Henry Holt.

Sutherland, S. (1992). Irrationality: Why We Don't Think Straight! New Brunswick, NJ: Rutgers University Press.

DOUGLAS ALLCHIN, DEPARTMENT EDITOR

DOUGLAS ALLCHIN has taught both high school and college biology and now teaches History of Science at the University of Minnesota, Minneapolis, MN 55455; e-mail: allch001@umn.edu. He is a Fellow at the Minnesota Center for the Philosophy of Science and edits the SHIPS Resource Center (ships.umn.edu). He hikes, photographs lichen, and enjoys tea.