How can we design AI so that it won't wipe out all DNA based life, possibly
this century?

That is the wrong question.  I was reading
http://sl4.org/wiki/SoYouWantToBeASeedAIProgrammer and realized that (1) I am
not smart enough to be on their team and (2) even if SIAI does assemble a team
of the world's smartest scientists with IQs of 200+, how are they going to
compete with a Jupiter brain with an IQ of 10^39?  Recursive self-improvement
is necessarily an evolutionary algorithm.  It doesn't matter what the starting
conditions are.  All that ultimately matters is the fitness function.

The goals of SIAI are based on the assumption that unfriendly AI is bad.  I
question that.  "Good" and "bad" are not intrinsic properties of matter.
Wiping out the human race is "bad" only because evolution selects animals for
an instinct to preserve themselves and their species.  Was the extinction of
the dinosaurs bad?  The answer depends on whether you ask a human or a
dinosaur.
If a godlike intelligence thinks that wiping out all organic life is good,
then its opinion is the only one that matters.

If you don't want to give up your position at the top of the food chain, then
don't build AI.  But that won't happen, because evolution is smarter than you
are.

I expressed my views in more detail in
http://www.mattmahoney.net/singularity.html
Comments?


-- Matt Mahoney, [EMAIL PROTECTED]

-----
This list is sponsored by AGIRI: http://www.agiri.org/email