I wonder if this is a substantive difference from Eliezer's position, though, since one might argue that 'humanity' means 'the [sufficiently intelligent and sufficiently ...] thinking being' rather than 'Homo sapiens sapiens', and the former would of course include SAIs and intelligent alien beings.

Eli is quite clear that AGIs must act in a Friendly fashion but that we can't expect humans to do so. To me, this is foolish, since the attractor you could create if humans were also Friendly would tremendously increase our survival probability.
