RE: [agi] Re: AI boxing

2004-09-19 Thread Philip Sutton
Hi Ben,

> One thing I agree with Eliezer Yudkowsky on is: Worrying about how to
> increase the odds of AGI, nanotech and biotech saving rather than
> annihilating the human race, is much more worthwhile than worrying
> about who is President of the US.

It's the nature of evolution that getting

RE: [agi] Re: AI boxing

2004-09-19 Thread Arthur T. Murray
On Sun, 19 Sep 2004, Philip Sutton wrote:
> Hi Ben,
>
> > One thing I agree with Eliezer Yudkowsky on is: Worrying about
> > how to increase the odds of AGI, nanotech and biotech saving
> > rather than annihilating the human race, is much more worthwhile
> > than worrying about who is President o

RE: [agi] Re: AI boxing

2004-09-19 Thread Ben Goertzel
Arthur,

This sort of thing does not belong on the AGI mailing list, as it has no direct relationship to AGI!

thanks
Ben

p.s. As an aside, I don't think this slogan has much political potential. The US public knows Dubya used to have a drug problem, and has forgiven him for it now that he's f

RE: [agi] Re: AI boxing

2004-09-19 Thread Ben Goertzel
Philip,

But the connection between present events and the Singularitarian future isn't all that easy to calculate. Bush, in addition to carrying out his questionable acts of foreign policy & tax reform, will probably boost DARPA's budget more than Kerry would. Perhaps this will result in more mo

RE: [agi] Psychometric AI

2004-09-19 Thread Yan King Yin
> I noticed that too. Seemed like this list doesn't archive attachments
> (or has a particularly good SPAM filter :-). I don't have the paper posted
> on any site. Will send you a PDF (748 KB). If others want a copy, let me
> know via email.
>
> Thanks!
>
> J. W.

Hi

Please send me a copy too, th