On Aug 7, 7:42 am, Stathis Papaioannou <stath...@gmail.com> wrote:
> On Sun, Aug 7, 2011 at 11:17 AM, Craig Weinberg <whatsons...@gmail.com> wrote:
> > On Aug 6, 7:40 pm, Stathis Papaioannou <stath...@gmail.com> wrote:
>
> >> When you are online you don't analyse the biochemical make-up of your
> >> interlocutor, but you still come to a conclusion as to whether they
> >> are intelligent or not. If in doubt you can always ask a series of
> >> questions: I'm sure you are confident in your ability to tell the
> >> difference between a person and a bot. But there may come a time when
> >> it is impossible in general to tell the difference,
>
> > Why does that matter though? What does being able to tell the
> > difference between a bot and a person have to do with a bot feeling
> > like a person?
>
> That, as I keep saying, is the question. Assume that the bot can
> behave like a person but lacks consciousness.

No. You have it backwards from the start. There is no such thing as
'behaving like a person'. There is only a person interpreting
something's behavior as being like a person. There is no power
emanating from a thing that makes it person-like. If you understand
this you will know because you will see that the whole question is a
red herring. If you don't see that, you do not understand what I'm
saying.

>Then it would be
> possible to replace parts of your brain with non-conscious components
> that function otherwise normally, which would lead to you lacking some
> > important aspect of consciousness but being unaware of it. This
> is absurd, but it is a corollary of the claim that it is possible to
> separate consciousness from function. Therefore, the claim that it is
> possible to separate consciousness from function is shown to be false.
> If you don't accept this then you allow what you have already admitted
> is an absurdity.

It's a strawman of consciousness that is employed in circular
thinking. You assume that consciousness is a behavior from the
beginning and then use that fallacy to prove that behavior can't be
separated from consciousness. Consciousness drives behavior and vice
versa, but each extends beyond the limits of the other.

> >> and then we will
> >> have human level AI (soon after we will have superhuman AI and soon
> >> after that the human race may be supplanted, but that's a separate
> >> question).
>
> > The human race has already been supplanted by a superhuman AI. It's
> > called law and finance.
>
> They are not entities and not intelligent, let alone intelligent in
> the way humans are.

What makes you think that law and finance are any less intelligent than
a contemporary AI program?

> >> > I don't understand what all of this debate over how intelligence seems
> >> > from the outside has to do with how it is experienced from the inside.
> >> > Here's a thought experiment for the anti-zombie. If I study randomness
> >> > and learn to impersonate machine randomness perfectly, have I become a
> >> > machine? Have I lost sentience? Why not?
>
> >> Intelligence can fake non-intelligence, but non-intelligence can't
> >> fake intelligence.
>
> > But intelligence can fake intelligence using non-intelligence. A
> > computer isn't faking intelligence, it's just spinning a quantitative
> > instruction set through semiconductors. It's only us who think it's
> > intelligent. In fact it is intelligent, as a long polymer molecule is
> > intelligent, but it is not conscious as an animal is conscious.
>
> It seems that you are conflating intelligence with consciousness.
> Intelligence is what is observed, while consciousness relates to the
> internal experience. A zombie is intelligent but not conscious.

When you say that intelligence can 'fake' non-intelligence, you imply
an internal experience (faking is not an external phenomenon).
Intelligence is a broad, informal term. It can mean subjectivity,
intersubjectivity, or objective behavior, although I would say not
truly objective but intersubjectively imagined as objective. I agree
that consciousness or awareness is different from any of those
definitions of intelligence, which would actually be categories of
awareness. I would not say that a zombie is intelligent. Intelligence
implies understanding, which is internal. What a computer or a zombie
has is an intelliform mechanism.

Craig
