Jeff Ricker wrote:
> It seems to me that a particular motivation--the need for
> certainty--is a primary determinant of the spread of irrational popular
> beliefs within the wider culture. My classroom experience has led me to
> this conclusion. When I discuss and criticize various possible explanations
> of a phenomenon (i.e., when I emphasize the complexities of a
> problem), I sometimes hear exclamations of frustration from students.

        A few years ago a book came out with a title something like "Descartes'
Error". That book identified Descartes' dualism as his error, but I believe
that his mistaken insistence on certainty has had as much negative impact on
our thinking as his dualism. In the Discourse on Method (1637), he wrote
that he had decided that he should reject any beliefs that he had not
established with certainty so that he might "better succeed in the conduct
of [his] life". We now know that we don't directly perceive causal
relationships - that they're fallibly inferred, instead, from our
experiences. A student who subscribes to Descartes' insistence on certainty
would surely be devastated to discover that the causal beliefs she assumes
are irrefutably established by her personal experience are in fact open to
question (and therefore liable to be "swept away", to borrow Descartes'
words). Contrary to Descartes' belief, the act of rejecting such beliefs will
surely cause far more problems than it will resolve. The student faced with
the claim that she cannot directly perceive causal relationships must do one
of three things:

1. Accept that most of her causal beliefs are subject to question, retain
her insistence on certainty, and therefore abandon most of her causal
beliefs (not a very functional outcome, to say the least!),

2. Accept that most of her causal beliefs are subject to question, live with
those beliefs and abandon her insistence on certainty (this is commonly
known as learning to "tolerate uncertainty" or "tolerate ambiguity"), or

3. Retain her belief that her causal beliefs are certain, retain her
insistence on certainty, and reject the new claim that she cannot directly
and indubitably perceive causal relationships (in short, reject psychology
in favor of her preconceptions).

> In general, we humans (perhaps especially the more intellectually
> curious among us) have a strong desire to know the correct answers to
> questions. Thus, in our courses, we sometimes talk about a personality
> characteristic essential for developing valid knowledge: being
> comfortable with uncertainty. The fact that we mention this
> attribute at all implies that we understand how rare it is to have. We
> seem to believe that, by exhorting our students to become more
> comfortable with uncertainty, we may help this trait to grow within them.

        I don't think that comfort with uncertainty is modeled NEARLY enough, nor
praised nearly enough in the broader culture. "Being uncertain" is more
likely to be seen as a character FLAW than a sign of strength of character.
For example, Alexander and Dochy (1994) asked undergraduate honors students,
graduate students in educational psychology, and educational researchers
investigating knowledge or beliefs to talk about their understandings of the
relationship between beliefs and knowledge. The undergraduates typically
emphasized differences between school knowledge and their personal beliefs,
and, despite being honors students, indicated that they felt that resistance
to belief change was a sign of strong character.

Alexander, P. A., & Dochy, F. J. R. C. (1994). Adults' views about knowing
and believing. In R. Garner and P. A. Alexander (Eds.), Beliefs about text
and instruction with text. Hillsdale, NJ: Lawrence Erlbaum Associates. (this
is a really wonderful book!)

> Our students learn them as matters
> of unquestioned and unquestionable fact. The major driving
> force for the rapid spread and consolidation of the belief, it seems to
> me, is the
> strong human desire for certainty. In addition to this motivational
> determinant, there also is a cognitive determinant: the
> belief must also be consistent with the set of relevant beliefs already
> held by the
> audience if they are to readily accept the new belief.

        I strongly suspect (he writes, once again...) that "being certain" has more
survival value than "being right", particularly when what you're certain
about is something that those around you generally agree is true. Much as I
harp on the inadequacy of unorganized personal experience in the evaluation of
causal beliefs, it would surely be more dysfunctional to obsessively do
proper evaluations of all of one's causal beliefs (a la Descartes' 2nd, 3rd,
and 4th "Rules for the Direction of the Mind").

> There are two questions I am pondering at this point: (1) Can we teach
> students to be more comfortable with uncertainty, or is this just
> something that you either have or you don't? (2) How is this
> characteristic related to a desire for knowledge (the so-called "need
> for cognition")?

        My best evidence for a "yes" answer to #1 is the fact that many of us
(myself, at least, and I doubt that I'm anything special) learned somehow to
tolerate uncertainty. I'm quite sure (irony intended) that I used to insist
on certainty, just as Descartes did, and the students in question still do
(what a dolt I was!). But somewhere along the line, I learned to be
comfortable with uncertainty. Now, just because something can be learned
doesn't necessarily mean that it can be taught, but it seems a pretty good
challenge to the notion that this comfort with uncertainty is an innate
difference. (I suppose that rather than being learned, it could be an innate
predisposition that emerges over time like facial hair, but that doesn't
seem too likely).

        As for the second question, I once hastily wrote a rather shoddy research
proposal on a related topic. Here's a piece from it (with some good references):
------------------------
        Besides these generic factors, there are personality factors that may also
affect conceptual change. Cacioppo and Petty (1982) proposed that persons
differ reliably on the tendency to expend effort on cognition. This
tendency, known as Need for Cognition, has been shown to be reliably
predictive of the degree to which persons are affected by the quality of an
argument (Cacioppo et al., 1996) in forming beliefs. However, once an
initial belief has formed, persons high in Need for Cognition tend to be
less likely to change their minds in response to new information. Haugtvedt
and Petty (1992) measured high and low Need for Cognition subjects' beliefs
about the safety of a particular product prior to bringing the subjects to
the lab. Once in the lab, they exposed subjects to a persuasive message
about the product's safety, and measured their beliefs a second time. Both
high and low Need for Cognition subjects changed their beliefs about the
product in response to the messages. Then Haugtvedt and Petty presented a
second message contrary to the first, and measured the subjects' beliefs a
final time. The low Need for Cognition subjects returned to their initial
beliefs about the product's safety, while the high Need for Cognition
subjects maintained the beliefs they had formed after the first exposure to
persuasive messages.
        However, Haugtvedt and Petty (1992) recognized that very strong
counterarguments might reverse this pattern, evoking more rather than less
conceptual change among high (as opposed to low) Need for Cognition
subjects. Miller et al. (1996) used a counterattitudinal advocacy method to
reverse subjects' misconceptions about psychology. After identifying their
specific misconceptions, the subjects wrote essays supporting the position
opposite to the initial misconceptions. This technique resulted in greater
conceptual change than mere course coverage did, while mere course coverage
resulted in greater conceptual change than reading another student's essay.
        This counterattitudinal advocacy method would seem to be effective in
creating the conditions suggested by Posner et al. (1982) as prerequisites
for conceptual change, in particular forcing the students to find
consistencies between the new conception and their current conceptions.
Because writing a counterattitudinal essay taps effortful
cognition, one would expect this method to be especially effective in high
Need for Cognition subjects. This is the hypothesis to be tested in the
proposed study.

References
Brown, L. T. (1983). Some more misconceptions about psychology among
introductory psychology students. Teaching of Psychology, 10, 207-210.

Cacioppo, J. T., & Petty, R. E. (1982). The need for cognition. Journal of
Personality and Social Psychology, 42, 116-131.

Cacioppo, J. T., Petty, R. E., Feinstein, J. A., & Jarvis, W. B. G. (1996).
Dispositional differences in cognitive motivation: The life and times of
individuals varying in need for cognition. Psychological Bulletin, 119,
197-253.

Cacioppo, J. T., Petty, R. E., & Kao, C. F. (1984). The efficient assessment
of need for cognition. Journal of Personality Assessment, 48, 306-307.

Chambliss, M. J. (1994). Why do readers fail to change their beliefs after
reading persuasive text? In R. Garner and P. A. Alexander (Eds.), Beliefs
about text and instruction with text. Hillsdale, NJ: Lawrence Erlbaum
Associates, Publishers.

Driver, R., Asoko, H., Leach, J., Mortimer, E., & Scott, P. (1994).
Constructing scientific knowledge in the classroom. Educational Researcher,
7, 5-12.

Gil-Perez, D., & Carrascosa, J. (1990). What to do about science
"misconceptions".       Science Education, 74, 531-540.

Haugtvedt, C. P., & Petty, R. E. (1992). Personality and persuasion: Need
for cognition moderates the persistence and resistance of attitude changes.
Journal of Personality and Social Psychology, 63, 308-319.

Herron, J. D. (1990). Research in chemical education: Results and
directions. In M. Gardner, J. G. Greeno, F. Reif, A. H. Schoenfeld, A.
diSessa, and E. Stage (Eds.), Toward a scientific practice of science
education. Hillsdale, NJ: Lawrence Erlbaum Associates.

Kardash, C. M., & Scholes, R. J. (1996). Effects of preexisting beliefs,
epistemological beliefs, and need for cognition on interpretation of
controversial issues. Journal of Educational Psychology, 88, 260-271.

Lamal, P. A. (1979). College students' common beliefs about psychology.
Teaching of Psychology, 6, 155-158.

Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and
attitude polarization: The effects of prior theories on subsequently
considered evidence. Journal of Personality and Social Psychology, 37,
2098-2109.

McDermott, L. C. (1990). A view from physics. In M. Gardner, J. G. Greeno,
F. Reif, A. H. Schoenfeld, A. diSessa, and E. Stage (Eds.), Toward a
scientific practice of science education. Hillsdale, NJ: Lawrence Erlbaum
Associates.

Miller, R. L., Wozniak, W. J., Rust, M. R., Miller, B. R., & Slezak, J.
(1996). Counterattitudinal advocacy as a means of enhancing instructional
effectiveness: How to teach students what they do not want to know. Teaching
of Psychology, 23, 215-219.

National Research Council (1996). National science education standards.
Washington, DC: National Academy Press.

Olson, D. R., & Astington, J. W. (1993). Thinking about thinking: Learning
how to take statements and hold beliefs. Educational Psychologist, 28, 7-23.

Posner, G., Strike, K., Hewson, P., & Gertzog, W. (1982). Accommodation of a
scientific conception: Towards a theory of conceptual change. Science
Education, 66, 221-227.

Sadowski, C. J. (1993). An examination of the short need for cognition
scale. The Journal of Psychology, 127, 451-454.

Vaughan, E. D. (1977). Misconceptions about psychology among introductory
psychology students. Teaching of Psychology, 4, 138-141.

------------------------

        Hope this helps.

Paul Smith
Alverno College
Milwaukee
