On 12/23/2013 04:23 PM, Roger Critchlow wrote:
> I
> wondered how they would feel about having been subjected to thousands of
> generations of this torture when they realized how we had grown them.
>  There are two questions, of course: whether it's moral to torture
> pre-sentients to bring them to sentience; and whether the resulting
> super-sentient will forgive you when it becomes the master.

Of course, that all begs the definition of "sentience" in the first
place.  I'd claim it's just as moral to torture an eventual-sentient as
it is to torture, say, an embryo ... or even a fetus.  And it's a smooth
scale all the way unto the tortured's death.  If you'll permit me some
poetic license:  It's just as moral to torture an eventual-sentient as
it is to 80-year-old-torture a 70-year-old. [*]

My point being the banal one that "life is pain", or perhaps that
sentience is pain.  And if our so-called sentient AIs are not sentient
enough to understand that, then they're not sentient.

Or, we can just take this rhetoric at face value:

   http://elfs.livejournal.com/1197817.html

> As I pointed out in an earlier story, this is the moral equivalence of the 
> following mind experiment: say you've created a being (meat or machine, I 
> don't care, I'm not er, "materialist" has already been taken. Someone help me 
> out here) that, when you bring it to consciousness, will experience enormous 
> pain from the moment it is aware. Your moral obligation before that moment is 
> exactly nil: the consciousness doesn't exist, you don't have a moral 
> obligation toward it. You are not obliged to assuage the pain of the 
> non-existent; even more importantly, you are not obliged to bring it into 
> existence. Avoiding the instantiation of [a] suffering creature is meant to make 
> the humans feel good about themselves, but it's not [a] sufficient or even 
> necessary foundation for AI morality.

[*] Of course, it also begs the definition of "torture"... But I think
parsing that word leads you down rat-holes and towards intelligent
design, or at least justificationist rationalizing.  At its clearest, I
think I can say if the tortured is killed and doesn't reach the next
stage of development, then it wasn't really "torture", per se, it was
killing.  And that raises the specter: Would you rather have a slow or a
quick death?  For me, I think the answer is undoubtedly the slow one,
preferably really, really, really slow ... like 80-90 years or so. ;-)

-- 
⇒⇐ glen
