The world in its pre-technological state, though not an intelligence,
is basically unFriendly. It is not out to get you, but you can die in this
world from all sorts of causes, like not getting oxygen. However, because we
evolved in the world, we are in the amazingly improbable state of usually
being able to survive for a few decades in this unFriendly environment.

Taking into account the world's technology, as you do in this email, the
world is much Friendlier, since the technology was developed to achieve
human goals, i.e., usually your goals. You live longer and get more of what
you want.

As technology improves, this will become more and more true, unless a
weapon-like technology helps one user achieve his human goals too
effectively, thus overriding your goals, or a technology stops helping us
achieve our goals because of a bug or mistake, e.g., out-of-control nanotech.

The world's technology is still not intelligent, but it does incorporate
human intelligence. It is "indifferent", as you mention, since it does not
have a personality, but a superAGI might also not have a personality as we
understand it. As long as the technology helps us achieve our goals, we'd
have to call it Friendly.

Joshua

2007/7/14, Stathis Papaioannou <[EMAIL PROTECTED]>:

Despite the fact that it seems to lack a single unified consciousness,
the world of humans and their devices behaves as if it is both vastly
more intelligent and vastly more powerful than any unassisted
individual human. If you could build a machine that ran a planet all
by itself just as well as 6.7 billion people can, doing all the things
that people do as fast as people do them, then that would have to
qualify as a superintelligent AI, even if you can envisage that with a
little tweaking it could be truly godlike.

The same considerations apply to me in relation to the world as apply
to an ant relative to a human, or to humanity relative to a vastly
greater AI (vastly greater than humanity, not just vastly greater than
a human). If the world decided to crush me, there is nothing I could do
about it, no matter how strong or fast or smart I am. As it happens,
the world is mostly indifferent to me, and some parts of it would
destroy me instantly if I got in their way: for instance, if I walked
into traffic only a few metres from where I am sitting. But even if it
wanted to help me there could be problems: if the world decided it
wanted to cater to my every command, I might request paperclips and it
might set about turning everything into paperclip factories; or if it
wanted to make me happy, it might forcibly implant electrodes in my
brain. And yet, I feel quite safe living with this very powerful, very
intelligent, potentially very dangerous entity all around me. Should I
worry more as the world's population and technological capabilities
increase further, rendering me even weaker and more insignificant in
comparison?



--
Stathis Papaioannou
