On Thursday, June 03, 2021, at 7:25 PM, Matt Mahoney wrote:
> We already know how to engineer empathy. We do this all the time. It's called 
> user friendliness. The software anticipates what we will want and does the 
> right thing. We even invent new symbols to do it, like menus, icons, and 
> touch screen gestures.

Yes, it’s true that some empathy can be “hardcoded”. That’s not why I pursue 
this, though — I believe there are more directly relevant components for AGI at 
this layer than the empathy piece. But is it less dangerous to *not* have real 
conscious empathy? Possibly, since the empathy would then be managed indirectly 
by people/engineers like us 😊 Just being conscious doesn’t automatically 
create empathy. Animals eat other animals alive, apparently without any second 
thoughts… and what do people do to each other? The horrid things some societies 
did… or still do. Hmmm… there are many other layers to this consciousness 
thing.
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T06c11d0b87552585-M4ae8da8b20b92bd7dfbd13c2
Delivery options: https://agi.topicbox.com/groups/agi/subscription