Two different responses to this type of argument.

Once you "simulate" something to the fact that we cant tell the difference 
between it in any way, then it IS that something for most all intents and 
purposes as far as the tests you have go.
If it walks like a human, talks like a human, then for all those aspects it is 
a human.

Second, to say it CANNOT be programmed, you must define IT much more closely.  
For cutaneous pain in humans, it appears to me that we have pain sensors: if 
we are pricked on the arm, the nerves there send a message to the brain, and 
the brain reacts to it.

We can recreate this fairly easily on a VNA with some robotic touch sensors, 
by saying that "past this threshold" the stimulus becomes "painful" and 
potentially damaging, and sending a message to the CPU.
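
To make that concrete, here is a rough sketch of that signal path in Python 
(the sensor name, threshold values, and message format are all made up for 
illustration; it only models the detect-and-report step, not any sensation):

import random

# Hypothetical sketch only: thresholds and message format are arbitrary.
PAIN_THRESHOLD = 0.7    # normalized pressure above this is treated as "painful"
DAMAGE_THRESHOLD = 0.9  # above this the stimulus may be damaging

def read_touch_sensor(sensor_id):
    """Stand-in for a real robotic touch-sensor driver; returns pressure 0.0-1.0."""
    return random.random()

def check_sensor(sensor_id):
    """Read one sensor and build the message that gets sent on to the CPU
    (the 'brain' in the analogy) whenever the pain threshold is crossed."""
    pressure = read_touch_sensor(sensor_id)
    return {
        "sensor": sensor_id,
        "pressure": pressure,
        "painful": pressure > PAIN_THRESHOLD,
        "damaging": pressure > DAMAGE_THRESHOLD,
    }

if __name__ == "__main__":
    message = check_sensor("left_arm_patch_3")
    print(message)  # whatever module plays the role of the brain reacts to this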

If there is nothing "magical" about the pain sensation, then there is no reason 
we can't recreate it.

James Ratcliff


Jiri Jelinek <[EMAIL PROTECTED]> wrote:

Mark,

Again, simulation - sure, why not. On VNA (von Neumann architecture) - I
don't think so - IMO not advanced enough to support qualia. Yes, I do
believe qualia exist (= I do not agree with all of Dennett's views, but
I think his views are important to consider). I have written tons of pro
software (using many languages) for a bunch of major projects, but I have
absolutely no idea how to write some kind of feelPain(intensity)
fn that could cause a real pain sensation in an AI system running on my
(VNA based) computer. BTW I often do test-driven development, so I
would probably first want to write a test procedure for real pain. If
you can write at least pseudo-code for that, then let me know. When
talking about VNA, this is IMO pure fiction. And even *IF* it
actually were somehow possible, I don't think it would be clever to
allow adding such code to our AGI. In VNA processing, there is no
room for subjective feelings. VNA = "cold" data & "cold" logic (no
matter how complex your algorithms get), because the CPU (with its set
of primitive instructions) - just like the other components - was not
designed to handle anything more.

Jiri

On 6/10/07, Mark Waser  wrote:
>
>
> > For feelings - like pain - there is a problem. But I don't feel like
> > spending much time explaining it little by little through many emails.
> > There are books and articles on this topic.
>
> Indeed there are, and they are entirely unconvincing.  Anyone who writes
> something can get it published.
>
> If you can't prove that you're not a simulation, then you certainly can't
> prove that "pain that really *hurts*" isn't possible.  I'll simply
> argue that you *are* a simulation, that you do experience "pain that really
> *hurts*", and therefore, my point is proved.  I'd say that the burden of
> proof is upon you or anyone else who makes claims like "Why you can't make
> a computer that feels pain".
>
> I've read all of Dennett's books.  I would argue that there are far more
> people with credentials who disagree with him than agree.  His arguments
> really don't boil down to anything better than "I don't see how it happens
> or how to do it, so it isn't possible."
>
> I still haven't seen you respond to the simulation argument (which I feel
> *is* the stake through Dennett's argument), but if you want to stop debating
> without doing so, that's certainly cool.
>
>     Mark




_______________________________________
James Ratcliff - http://falazar.com
Looking for something...
       
