Ethics only becomes snarled when one is unwilling to decide and declare what
the goal of life is.
Extrapolated Volition comes down to a homunculus, depending upon the
definition of "wiser" or "saner."
Evolution has "decided" what the goal of life is... but most are unwilling
to accept it (in part
When transhumanists talk about indefinite life extension, they often take
care to say "it's optional" to forestall one common objection.
Yet I feel that most suicides we see should have been prevented -- that the
person should have been taken into custody and treated if possible, even
against their will.
Kaj Sotala wrote:
On 1/29/08, Richard Loosemore <[EMAIL PROTECTED]> wrote:
Summary of the difference:
1) I am not even convinced that an AI driven by a GS will ever actually
become generally intelligent, because of the self-contradictions built
into the idea of a goal stack. I am fairly sure th
On 1/29/08, Richard Loosemore <[EMAIL PROTECTED]> wrote:
> Summary of the difference:
>
> 1) I am not even convinced that an AI driven by a GS will ever actually
> become generally intelligent, because of the self-contradictions built
> into the idea of a goal stack. I am fairly sure that whenever
Richard Hollerith said:
> If I am found dead with a bag over my head attached to helium or
> natural gas, please investigate the possibility that it was a
> murder made to look like a suicide.
>
> --
> Richard Hollerith
> http://dl4.jottit.com
>
Same here, Richard. Nitrous oxide would de
If I am found dead with a bag over my head attached to helium or
natural gas, please investigate the possibility that it was a
murder made to look like a suicide.
--
Richard Hollerith
http://dl4.jottit.com
-
This list is sponsored by AGIRI: http://www.agiri.org/email