On 10 October 2013 13:03, Craig Weinberg <whatsons...@gmail.com> wrote:

>
> On Wednesday, October 9, 2013 5:52:46 PM UTC-4, Liz R wrote:
>
>> On 10 October 2013 09:47, Craig Weinberg <whats...@gmail.com> wrote:
>>
>>> It's not that computers can't do what humans do, *it's that they can't
>>> experience anything.* Mozart could dig a hole as well as compose music,
>>> but that doesn't mean that a backhoe with a player piano on it is Mozart.
>>> It's a much deeper problem with how machines are conceptualized that has
>>> nothing at all to do with humans.
>>>
>>
>> So you think "strong AI" is wrong. OK. But why can't computers experience
>> anything, in principle, given that people can, and assuming people are
>> complicated machines?
>>
>
> I don't think that people are machines. A machine is assembled
> intentionally from unrelated substances to perform a function which is
> alien to any of the substances. Living organisms are not assembled; they
> grow from a single cell. They have no unrelated substances and all
> functions they perform are local to the motives of the organism as a whole.
>

I believe that, at least in discussions such as this one, defining people
as machines has nothing to do with how or why they are constructed, and
everything to do with ruling out any supernatural components. Anyway, allow
me to rephrase the question.

I assume from the underlined comment that you think that strong AI is
wrong, and that we will never be able to build a conscious computer. How do
you come to that conclusion?

>
> This is an even bigger deal if I am right about the universe being
> fundamentally a subdividing capacity for experience rather than a place or
> theater of interacting objects or forces. It means that we are not our
> body; rather, a body is what someone else's lifetime looks like from inside
> of your lifetime. It's a token. The mechanisms of the brain do not produce
> awareness as a product, any more than these combinations of letters produce
> the thoughts I am communicating. What we see neurons doing is comparable to
> looking at a satellite picture of a city at night. We can learn a lot about
> what a city does, but nothing about who lives in the city. A city, like a
> human body, is a machine when you look at it from a distance, but what we
> see of a body or a city would be perfectly fine with no awareness happening
> at all.
>

Insofar as I understand it, I agree with this. I often wonder "how a load
of atoms can have experiences", so to speak. This is the so-called hard
problem of consciousness. It is (I think) addressed by comp.
