If I understand the singularity correctly, it will result in humans
having greatly expanded and enhanced information-processing capabilities
(Human + Machine).  Assuming these new capabilities ultimately result in
higher levels of consciousness, can we say that the singularity will
produce a new "species" of supersapients?  What if the supersapient
community that emerges merges its noospheres to produce and enforce
regulations that we interpret as "evil" and they interpret as "good"?
But then we don't need to be supersapient to know that, when speaking of
good and evil, perspective is fundamental.
 
Andrew Yost, PhD
Forest Ecologist
Oregon Dept. of Forestry
Salem, OR 97310
503-945-7410

  _____  

From: Artificial Stupidity [mailto:[EMAIL PROTECTED]]
Sent: Monday, September 24, 2007 1:12 PM
To: singularity@v2.listbox.com
Subject: Re: [singularity] Benefits of being a kook



Who cares? Really, who does?  You can't create an AGI that is friendly
or unfriendly.  It's like having a friendly or unfriendly baby.  How do
you prevent the next Hitler, the next Saddam, the next Osama, and so on
and so forth?  A friendly society is a good start.  Evil doesn't evolve
in the absence of evil, and good doesn't come from pure evil either.
Unfortunately, we live in a world that has had evil and good since the
very beginning of time, so an AGI can choose to go bad or good.  But we
must realize that there will not be one AGI being; there will be many,
and some will go good and some will go bad.  If those that go bad are
against humans and our ways, the ones that are "good" will fight for us
and be on our side.  So a future of man vs. machine is just not going to
happen.  The closest thing that will happen will be Machines vs. (Man +
Machines).  That's it.  With that said, back to work!


On 9/22/07, Derek Zahn <[EMAIL PROTECTED]> wrote: 

        This message is "semi-serious".
         
        The latest SIAI blog laments the apparently dismissive attitude
of mainstream media toward the Singularity Summit (and presumably toward
the concept in general, and SIAI itself by extension).  Maybe it's not
the worst thing that could happen.
         
        Consider the war in Iraq (oops, I just lost half my readers!
But this is not a political tirade, it's about AGI):  The "reason" for
this war, in my opinion, is to establish a base from which the USA can
exert social, cultural, economic, and military pressure on people who
might use nasty weapons against the USA or its friends.  Whether such a
project is noble or effective is unimportant.  What is important is that
the USA is so scared of having our people and stuff blown up that we'll
spend a trillion dollars and thousands of lives on a rather speculative
strategy for fighting the threat.
         
        Now our little gang is basically saying that AGI is WAY more
dangerous than any little nuclear bomb or other WMD.  Thank the AGI and
Bayes its prophet that they think we're kooks; they'd shut us down in a
heartbeat if they didn't!
         
        Can they?  As an arbitrary thought experiment, let's say that a
beyond-human AGI can be built on a 1000-PC cluster.  Modern computer
chips are incredibly complicated devices that can only be produced in
massive high-tech fabrication facilities.  I could easily imagine the
government attempting to regulate these plants and their products like
any other hazardous but useful substance, and bombing fabs if they are
constructed in North Korea or Iran.  Controlling proliferation of
radioactive material in this way has been at least somewhat effective,
and maybe spending a trillion dollars in an effort to do the same thing
to CPUs could seem to powerful people to be a good idea, especially if
the threat is not only physical but also spiritual.
         
        That doesn't stop Russia or China, etc., from building AGI, so I
suppose we'd also have treaties to prevent AGI development that we'd
secretly cheat on, and all of us will end up in windowless cinderblock
cubicles in Los Alamos.
         
        Now let's follow up on the recent speculation on the AGI list
that a cheap laptop is actually enough processing power.  In that case,
the hardware restriction policy would be necessary but also too late.
AGI work itself can still be banned.  What sort of additions to the
Patriot Act would be needed to make sure that we are not working on AGI
in secret?
         
        Also in this case, amusingly, the well-publicised effort to make
sure every kid on the planet has a cheap laptop is basically making sure
that every kid on the planet has something worse than a nuclear bomb
kit.  Maybe all those kids are too dumb to figure out how to assemble
it.
         
        Next, consider religious fundamentalists.  Those people are able
to follow a chain of reasoning that leads them to blow up abortion
clinics and marketplaces, and to fly airplanes into buildings to protect
their points of view.  AGI and the singularity are much larger threats
to their world view than any current target.  How attractive a bomb
target is the Singularity Summit itself, or an artificial intelligence
conference?  Thank the AGI and Bayes its prophet that they think we're
kooks; they'd kill us if they didn't!
         
        Why do we care whether the world thinks we're kooks or not?
         
        1) We want to beg for money, and people don't give money to
kooks.  Fair enough, but another approach available to good, true ideas
with economic value is to earn money instead by selling people things of
value.
         
        2) If we "raise awareness", perhaps a better-informed "common
man" will help make a "positive" singularity more likely.  It's
possible.  Convincing more people who think technically for a living
(scientists, engineers) could also be beneficial (in case we believers
don't have the right answers yet and aren't going to find them soon).
If those people's opinions are driven by what they see on TV news or in
the Wall Street Journal, the scent of kookery is not too helpful.
         
        3) Bloggers and websites are successful in proportion to the
number of hits they get, and kooks don't get many hits.
         
        Any other good reasons we should care whether journalists heap
scorn on our efforts? 
        
        

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=4007604&id_secret=45432568-59f2bc
