Brian Atkins wrote:
> So if you choose for example 99%, then that would mean you want there to be at a minimum only a 1% chance the MBFAI would run amok during each
A couple of quick notes. (1) I didn't edit that sentence above well, so just mentally delete "at a minimum" from it. (2) Let's open this survey question up to include all potential forms of superintelligence, not just AGIs: superintelligent uploads, human/AGI hybrid systems, etc. Whatever your favored most-likely scenario is, give me your minimum required success percentage.
--
Brian Atkins
Singularity Institute for Artificial Intelligence
http://www.singinst.org/

This list is sponsored by AGIRI: http://www.agiri.org/email