--- Thomas McCabe <[EMAIL PROTECTED]> wrote:
> On Nov 30, 2007 11:11 AM, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> > How can we design AI so that it won't wipe out all DNA-based life,
> > possibly this century?
> >
> > That is the wrong question. I was reading
> > http://sl4.org/wiki/SoYouWantToBeASeedAIProgrammer and realized that
> > (1) I am not smart enough to be on their team and (2) even if SIAI
> > does assemble a team of the world's smartest scientists with IQs of
> > 200+, how are they going to compete with a Jupiter brain with an IQ
> > of 10^39? Recursive self-improvement is necessarily an evolutionary
> > algorithm.
>
> See http://www.overcomingbias.com/2007/11/no-evolution-fo.html.
So is it possible to have RSI without ALL of the following?
* Entities that replicate
* Substantial variation in their characteristics
* Substantial variation in their reproduction
* Persistent correlation between the characteristics and reproduction
* High-fidelity long-range heritability in characteristics
* Frequent birth of a significant fraction of the breeding population
* And all this remains true through many iterations
Proving correctness of complex systems is intractable in general (any
nontrivial semantic property of programs is undecidable, by Rice's
theorem), which seems to force us into an experimental approach: modify
an existing design and test it, because we don't know in advance whether
the changes will work as planned. Is there another approach?
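To make "modify and test" concrete, here is a minimal sketch in Python of
the kind of loop I mean. The fitness() and mutate() functions are toy
placeholders of my own, not a proposal for how a real seed AI would
evaluate itself; the point is only the structure: perturb a design, keep
the change only if an empirical test scores it better.

import random

def fitness(design):
    # Toy stand-in for an empirical test suite: higher is better.
    # A real system would have to run experiments, since we can't
    # prove in advance that a modification is an improvement.
    return -sum((x - 1.0) ** 2 for x in design)

def mutate(design):
    # The "modification" step: perturb one parameter at random.
    child = list(design)
    i = random.randrange(len(child))
    child[i] += random.gauss(0.0, 0.1)
    return child

def modify_and_test(design, generations=10000):
    best_score = fitness(design)
    for _ in range(generations):
        child = mutate(design)
        score = fitness(child)      # the "test" step
        if score > best_score:      # keep only what works
            design, best_score = child, score
    return design

print(modify_and_test([0.0, 0.0, 0.0]))  # drifts toward [1, 1, 1]

Note that this loop already has the ingredients listed above: replication
with variation, and a persistent correlation between characteristics and
survival, repeated over many iterations.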
> > It doesn't matter what the starting
> > conditions are. All that ultimately matters is the fitness function.
>
> This is precisely why evolutionary methods aren't safe. Also see
> http://www.overcomingbias.com/2007/11/conjuring-an-ev.html
But what is the alternative?
> > The goals of SIAI are based on the assumption that unfriendly AI is
> > bad. I question that. "Good" and "bad" are not intrinsic properties
> > of matter. Wiping out the human race is "bad" because evolution
> > selects animals for a survival instinct for themselves and the
> > species. Is the extinction of the dinosaurs bad? The answer depends
> > on whether you ask a human or a dinosaur. If a godlike intelligence
> > thinks that wiping out all organic life is good, then its opinion is
> > the only one that matters.
>
> Uh, yes. I see this as a bad thing; I don't want everyone to get
> killed. See http://www.overcomingbias.com/2007/11/terrible-optimi.html,
> http://www.overcomingbias.com/2007/05/one_life_agains.html,
> http://www.overcomingbias.com/2007/11/evolving-to-ext.html.
Evolution is a critically balanced system on the boundary between
stability and chaos. Stuart Kauffman studied such systems, which also
include complex software systems, gene regulation networks, and randomly
connected logic gates with an average fan-in/fan-out at a critical value
between 2 and 3. In analog systems, we say a system is critically
balanced if its Lyapunov exponent is 0. A characteristic of such systems
is that they are usually stable against perturbations of the system
state, but occasionally a small change causes catastrophic results.
Evolution is punctuated by mass extinctions at the boundaries between
geologic eras. We are in one now.
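Kauffman's random Boolean networks make this concrete. Here is a small
illustrative simulation (my own sketch in Python, not Kauffman's code):
n nodes, each driven by k randomly chosen inputs through a random truth
table, updated synchronously. Flipping one bit and tracking the Hamming
distance between the perturbed and unperturbed runs shows the three
regimes: damage tends to die out for k=1, stays roughly constant near
the critical value k=2 (the discrete analogue of a Lyapunov exponent of
0), and spreads for k>=3.

import random

def make_network(n, k, rng):
    # Each node reads k random inputs through a random Boolean function
    # (a truth table with 2**k entries).
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    # Synchronous update: each node applies its table to its inputs.
    nxt = []
    for ins, table in zip(inputs, tables):
        idx = 0
        for i in ins:
            idx = (idx << 1) | state[i]
        nxt.append(table[idx])
    return nxt

def damage(n=200, k=2, steps=50, trials=20, seed=0):
    # Average Hamming distance after a one-bit perturbation.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        inputs, tables = make_network(n, k, rng)
        a = [rng.randint(0, 1) for _ in range(n)]
        b = list(a)
        b[rng.randrange(n)] ^= 1  # flip a single bit
        for _ in range(steps):
            a = step(a, inputs, tables)
            b = step(b, inputs, tables)
        total += sum(x != y for x, y in zip(a, b))
    return total / trials

for k in (1, 2, 3):
    print(k, damage(k=k))  # ordered, critical, chaotic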
> > If you don't want to give up your position at the top of the food
> > chain, then don't build AI. But that won't happen, because evolution
> > is smarter than you are.
>
> This isn't true: see
> http://www.overcomingbias.com/2007/11/the-wonder-of-e.html,
> http://www.overcomingbias.com/2007/11/natural-selecti.html,
> http://www.overcomingbias.com/2007/11/an-alien-god.html,
> http://www.overcomingbias.com/2007/11/evolutions-are-.html.
Evolution appears stupid because it is slow. On our time scale it seems to
backtrack endlessly from pointless dead ends. But ultimately it succeeded in
creating humans. RSI will be much faster.
-- Matt Mahoney, [EMAIL PROTECTED]