Hi,

On Ben's essay: Ben is arguing that, due to incomputable complexity,
'friendliness' can only be guaranteed under unsatisfactorily narrow
circumstances. Independent of whether one agrees or not, it would
follow that if this is the case, then substituting friendliness with
one or all of the alternative goals proposed by Ben, namely
"compassion", "growth" and "choice", would not make a difference, as
the same incomputability problem applies to all goals.

This doesn't quite summarize my current views accurately.

We know the world is complex, but we don't know exactly *how* complex.

The dynamical complexity of the world means that the more finely and
exactly one wants to control the future of the world, the harder it is
going to be.

This means that the odds of success are greater with more
coarsely-defined properties.

Something as finely and fussily defined as a particular human's notion
of Friendliness is very unlikely to be predictable to any useful
degree of accuracy.

OTOH, something like "not annihilating all matter in the universe" is
a lot coarser, and one is more likely to be able to take steps to
increase the odds that it comes about.

My argument -- and it is not a fully rigorous one, though it is
founded in the known science of complex dynamical systems -- is that
the coarser the goal one posits, the more likely it is that one will
be able to meaningfully and significantly influence the extent to
which the universe achieves this goal.
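
To make the intuition concrete, here is a toy sketch (my own minimal
illustration, using the chaotic logistic map as a stand-in for a
complex dynamical system; the map, the perturbation size and the
threshold are all arbitrary choices): a tiny change in the initial
condition destroys any fine-grained prediction of the exact state,
yet a coarse-grained property of the orbit is essentially unaffected.

def logistic_orbit(x0, steps, r=4.0):
    # Iterate the chaotic logistic map x -> r*x*(1-x) and return the orbit.
    orbit = [x0]
    for _ in range(steps):
        orbit.append(r * orbit[-1] * (1.0 - orbit[-1]))
    return orbit

def coarse_property(orbit):
    # Coarse-grained observable: fraction of iterates lying above 0.5.
    return sum(1 for x in orbit if x > 0.5) / len(orbit)

a = logistic_orbit(0.400000, steps=10000)
b = logistic_orbit(0.400001, steps=10000)  # tiny perturbation of the start

# Fine-grained prediction (the exact state at step 50) is already ruined:
print("fine-grained difference: ", abs(a[50] - b[50]))
# The coarse-grained property barely moves:
print("coarse-grained difference:", abs(coarse_property(a) - coarse_property(b)))

The sketch only shows that coarse observables of a chaotic system can
be robust even when exact trajectories are hopelessly unpredictable --
the same asymmetry I am appealing to for coarse versus fine goals.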

And my suggestion is that "Friendliness to Humans", however one
specifies it, is very likely to be a sufficiently fine-grained goal
that influencing the degree to which the universe achieves it will be
very, very difficult (almost surely impossible in practice).

But it may be that more general goals like "compassionateness" are
sufficiently coarse-grained that they are within the scope of what can
be pragmatically influenced via choices about AGI design and other
aspects of Singularity launch.

-- Ben G
