On 01/29/2013 10:03 AM, Matt Mahoney wrote:
> What is so hard about *not* programming a robot to have human
> emotions? It seems like a much easier problem to me if you don't
> program it to not want to do what you tell it.
>
> -- Matt Mahoney, [email protected]
That seems like an odd view coming from you, Matt. You often speak about
"knowing what you want..." and the extensive surveillance data behind it.
The bottom line is that we know what we want through our emotional
responses. If these know-it-all servants are going to know us, they will
need to be experts on emotion.

I cringe at the thought of a bunch of critters running around thinking
they know what I want. Nothing frustrates like a software program that
goes overboard anticipating your every move; give me a slightly dumber
program and I won't have to fight it.
How smart is the intelligence if it only does what I tell it?
< no need to get personal here :) >
I want it to do what it has determined to be best - even if it came to
that conclusion without my blessing. And, yes, this creates problems.
The big problem is that these machines will be strong, in the sense of
powerful. How do you control a kid who is stronger than you and has
poor judgment? And if it doesn't have judgment, is it intelligent?
Stan