On Sun, Jan 27, 2013 at 4:43 PM, Piaget Modeler
<[email protected]> wrote:
> I asked, "What do we give robots when they ask for rights?" I mean, even
> animals have rights (PETA).
> Why shouldn't robots?

Why would robots ask for rights unless we program them to ask for rights?

Our goal is to make machines smart enough to do all the work that we
would otherwise need to pay people to do. It means solving hard
problems in language, vision, art, and robotics. It means being able
to predict human behavior, including recognizing human emotions and
knowing their causes and effects. It does not mean building a machine
that would have human emotions or human goals.

With sufficient technology, computing power, and human knowledge, we
could, if we wanted to, build robots that look and convincingly behave
like humans. We would be tempted to give them human rights. That would
be a mistake. Such robots would compete with us for scarce resources
such as energy, raw materials, and space for living and waste
disposal. These resources will remain scarce even with AI and
nanotechnology. Furthermore, such robots would be stronger than
humans, would know more, think faster, and reproduce faster. They could
be programmed (accidentally, maliciously, or by evolution) not to care
about human rights in such a way that we could not tell until it was
too late. They would know and could take advantage of our ethical
concerns for other humans or things that resemble humans. The fact
that you raise such questions is evidence for this risk.

--
-- Matt Mahoney, [email protected]

