On 01/07/07, Sergey A. Novitsky <[EMAIL PROTECTED]> wrote:

If AI is going to be super-intelligent, it may be treated by governments as
some sort of super-weapon.
As has already happened with nuclear weapons, there may be treaties
constraining AI development.

Nuclear weapons need a lot of capital and resources to construct,
which is why they can be semi-regulated, at least where individuals are
concerned. AI may initially require supercomputers to run, but once home
computers catch up to those supercomputers, stopping AI development would
be like trying to stop file sharing and software piracy.



--
Stathis Papaioannou

