Thus spake Ian P. Cook circa 27/01/09 06:59 AM:
> At the risk of pointing in yet another bad direction, it seems to me that a
> cognate to this problem could be Rosenthal's "file drawer" issue in
> research; i.e., the work that goes unsubmitted out of a (correct or
> incorrect) assumption that it won't get a fair hearing because it runs
> counter to other, published literature. Though the idea is from '79, it's
> still of great concern.

Yes, that would be relevant.  In fact, if I could learn more about that,
it would help me argue my point, because it would help separate
personal, psychological bias (which might keep an individual or small
group from even trying to publish) from collective, objective bias
(which may not be recognizable or perceivable by any one individual in
the collective).

My guess is that my friend and many of the anthropogenic global climate
change skeptics perceive a bias where none exists.  Likewise, I guess
that the global climate change believers have jumped to a premature
conclusion.  So, personal, psychological bias would play a huge role in
the irrationality of both positions, regardless of any objective bias
that may exist.  [Disclosure:  I'm a believer, but my premature
conclusion is definitely based on intuition, not fact.  I'm a believer
only in climate change, however, not in anthropogenic global warming.]

> I once had a long discussion with an official from
> NIH who was seeking funding for research into the publication bias issue
> (see also the area worked on by Ioannidis -- http://xrl.in/1h2j, which I
> offer here not as endorsement or condemnation, but simply as an example).

That link resulted in an error page, but a search on PLoS using
"Ioannidis" returns results.  Thanks!

> And, as remote from my real knowledge as anything could be, as a mildly
> interested observer it seems that there might be something of this question
> of inertia in dominant ideas that appears in theoretical physics hiring.
> Again, I warn that this is from a simplistic reading of Lee Smolin's
> discussion of the issue, and absolutely no personal experience. Simply
> things that occurred to me in reading your note.

Yep.  I'm sure it appears everywhere.  But it's most important in
science, where it's our stated purpose to arrive at conclusions based on
repeatable falsification (and, to some extent, the shape and color of
justification) rather than intuition.  Note that I'm not disparaging
intuition for the formation of [hypo]theses or of ways to test them,
only for the conclusions drawn from them.

In fact, I'd probably argue that bias (both personal and collective) in
hiring is a good thing.  It's probably good in all human activity,
perhaps even science.  But it's in science where we most often claim to
be rational.

In any case, I'd like to be able to discuss these topics in the concrete
context of the anthropogenic global climate change hypothesis,
especially before we falsify it, if we ever do. [grin]

-- 
glen e. p. ropella, 971-222-9095, http://agent-based-modeling.com


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org
