On 3/5/07, Stathis Papaioannou <[EMAIL PROTECTED]> wrote:


You seem to be equating intelligence with consciousness. Ned Block also
seems to do this in his original paper. I would prefer to reserve
"intelligence" for third person observable behaviour, which would make the
Blockhead intelligent, and "consciousness" for the internal state: it is
possible that the Blockhead is unconscious or at least differently conscious
compared to the human.


I think the argument also works for consciousness, but I don't think you're
right if you are suggesting that our ordinary notion of intelligence is
merely third-person observable behavior. (If you really were just voicing
your own idiosyncratic preference for how you like to use the term
"intelligence", then I guess I don't really have a problem with that, so
long as you are clear about it.)

Suppose that when many people administer a Turing test, they end up asking
pretty much the same basic set of questions. Given that these people
predictably ask the same types of questions, we can fairly easily imagine a
feasible program that operates much like Blockhead: there's not much more
going on than spitting back canned answers. By contrast, when a human gets
those questions and comes up with a response, there will be things like a
simulated model of parts of the world held in imagination, built out of
concepts that are ultimately grounded in useful ways of categorizing and
differentiating various perceptual inputs.
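To make the canned-answer picture concrete, here is a toy sketch of the kind
of program I have in mind (the specific questions, answers, and function name
are my own hypothetical illustration, not anything from Block's paper):

```python
# Toy "Blockhead"-style responder: a fixed table mapping anticipated
# Turing-test questions to canned replies. Nothing resembling a model
# of the world is consulted; the processing is exhausted by the lookup.
# (All questions and answers here are hypothetical illustrations.)

CANNED_REPLIES = {
    "what is your name?": "I'm Alex.",
    "do you like music?": "Yes, especially jazz.",
    "what did you have for breakfast?": "Just coffee and toast.",
}

def blockhead_respond(question: str) -> str:
    """Return a canned answer, or a vague dodge for unanticipated input."""
    key = question.strip().lower()
    return CANNED_REPLIES.get(key, "That's an interesting question.")
```

As long as the interrogators stick to the anticipated questions, the
responses are indistinguishable from a thoughtful human's, even though
internally there is only table lookup.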

Now it could be that in terms of externally observable behavior, there's no
appreciable difference in the quality of their responses. It may be that
this isn't a complex enough environment for either subject to demonstrate a
high degree of intelligence. But even so, I would argue that our ordinary
notion of intelligence would count the human way of responding as much
*more* intelligent than the computer's. (Which is not to suggest that I
think the human way is the *only* way that can count as intelligent, just
that there are some substantive criteria on the internal processing.)

And even if you don't agree that this is how our ordinary notion of
intelligence operates, I think it is clear that it would be useful to have
some term or other that marks these sorts of distinctions between e.g. what
I would call real understanding and mere memory recall. And furthermore, I
think we want a term that marks these purely computational distinctions in
internal processing while setting aside the very difficult and messy issue
of which things to call conscious or to what degree. Finally, regardless of
what the ordinary notion of intelligence is, I would argue that *this
notion*, the one that tracks internal processing, is the more interesting
one and the one worth investigating.

-Ku

http://www.umich.edu/~jsku

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=11983