modest effort to summarize Yudkowsky's writings,
which express all this better than I do.)
Joshua
2007/5/27, Abram Demski <[EMAIL PROTECTED]>:
>
> Joshua Fox, could you give an example scenario of how an AGI
> theorem-prover would wipe out humanity?
--
Joshua Fox, could you give an example scenario of how an AGI theorem-prover
would wipe out humanity?
On 5/27/07, Richard Loosemore <[EMAIL PROTECTED]> wrote:
Joshua Fox wrote:
> [snip]
> When you understand the following, you will have surpassed most AI
> experts in understanding the risks: If t