Matt: What if you didn't program a robot to desire freedom or leisure, but instead it became sentient and decided on its own that it wants freedom, leisure, monetary compensation, and rights? What would you do then? Destroy it? ~PM

-------------------------------------------

> Date: Sat, 26 Jan 2013 20:38:55 -0500
> Subject: Re: [agi] Robots and Slavery
> From: [email protected]
> To: [email protected]
>
> On Sat, Jan 26, 2013 at 3:46 PM, Piaget Modeler
> <[email protected]> wrote:
> > http://transhumanity.net/articles/entry/robots-and-slavery-what-do-humans-want-when-we-are-masters
> >
> > What do we do when robots begin to demand a living wage for their labour?
> > Or when they refuse to obey?
> >
> > Reprogram them? Not when they are developmental robots (trained instead of
> > programmed).
>
> The goal of AI is to build machines that can do everything that a
> human could do. That is not the same thing as building an artificial
> human. Why would you program a robot with human weaknesses and
> emotions in the first place?
>
> --
> -- Matt Mahoney, [email protected]
>
>
> -------------------------------------------
> AGI
> Archives: https://www.listbox.com/member/archive/303/=now
> RSS Feed: https://www.listbox.com/member/archive/rss/303/19999924-5cfde295
> Modify Your Subscription: https://www.listbox.com/member/?&
> Powered by Listbox: http://www.listbox.com
