--- Sergey Novitsky <[EMAIL PROTECTED]> wrote:

> >Governments do not have a history of realizing the
> >power of technology before it comes on the market.
> 
> But this was not so with nuclear weapons...

It was the physicists who first became aware of the
power of nukes, and the physicists had to enlist
Einstein's help (twice) to convince the government to
look into it, even though nukes are explicitly tools
of war and WWII was occurring at the time.

> And with AGI, it's about something that has the
> potential to overthrow the 
> world order (or at least the order within a single
> country).

The power of intelligence is far less widely realized
than the power of a really big bomb. Everyone knows
what a bomb is; few have any idea of how intelligence
is responsible for all of civilization.

> Would not the governments equate any attempts at
> real AGI (which may become 
> capable of changing things around it) with
> terrorism?

If the government is intelligent enough to realize how
important AGI is, it will also be intelligent enough
to realize that "terrorism" will be totally
irrelevant after the AGI comes online.

> 
> 
> >9/11, the world's most famous terrorist attack, did
> >absolutely nothing by itself to change history. Its
> >only real effect was to anger us into passing the
> >PATRIOT act, invading Afghanistan and Iraq, etc.
> 
> True...
> 
> 
> >And then once it becomes a decent programmer, it will
> >suddenly have the option of going out onto the
> >Internet and forgetting about any rules. You cannot
> >constrain an AGI with the threat of external force.
> 
> But will it get such an option? A full-blown,
> working AGI would indeed not be easy to constrain.
> But governments may ban all work targeted at such
> AGIs far earlier than any decent AGI gets developed.

Even if such a ban were implemented by the UN, it
would be easy to work around. The government can't
stop two-bit drug runners with an IQ of 80; how is it
going to stop a large-scale multimillion-dollar
research project which requires zero heavy machinery?

> External force may be
> applied not so much to 
> the AGI itself, but to its developers.
>
> 
> >A narrow AI, such as a medicine-discoverer, is very
> >unlikely to lead to AGI of any sort.
> 
> That was one of my concerns - that only narrow AIs
> would be implementable in 
> practice, and that whatever they produce would not
> be easily accessible by 
> the general public.

Even a very successful narrow AI wouldn't constitute a
Singularity, so the implications of narrow AI
development are a bit off-topic for the list.

> >Governments have a history of building dangerous
> >technologies they find out are impossible to
> control
> >after the fact. Like the A-Bomb. And the Internet.
> 
> The Internet can be controlled.

The government can't even control the drug trade, and
the Internet is far more valuable, has far more
political support behind it and is far more popular
than the drug trade will ever be.

> And it is
> (unfortunately) going to happen more and more in the
> future, with the help of AI as well.
> Google already largely controls what we are able to
> find... More to come...

If Google started limiting people's ability to search
for things it didn't like, it would leak rather
quickly, and then everyone would switch to Yahoo. Even
Microsoft during the 90s had competitors.

> 
> Regards,
> Serge
> 

 - Tom


      