On 28 Nov 2017, at 18:49, Jason Resch wrote:
On Tue, Nov 28, 2017 at 11:32 AM, Bruno Marchal <marc...@ulb.ac.be>
wrote:
This question is more interesting. I tend to fall in the camp that
we exercise little control over the ultimate decision made by such
a super intelligence, but I am optimistic that a super intelligence
will, during the course of its ascension, discover and formalize a
system of ethics, and this may lead to it deciding not to wipe out
other life forms. For example, it might discover the same ideas
expressed here ( https://www.researchgate.net/profile/Arnold_Zuboff/publication/233329805_One_Self_The_Logic_of_Experience/links/54adcdb60cf2213c5fe419ec/One-Self-The-Logic-of-Experience.pdf
) and therefore determine something like the golden rule is
rationally justified.
I will take a look. I read him in The Mind's I. What is the golden rule?
I am sure you have heard of the concept; perhaps this name for it
is specific to English. It is defined as:
The Golden Rule (which can be considered a law of reciprocity in
some religions) is the principle of treating others as one would
wish to be treated. It is a maxim of altruism that is found in many
religions and cultures.
OK, I did. With Mechanism, the Golden Rule is more like "don't treat
others as *they* do not want to be treated", and especially not "as
one wishes to be treated", because the latter encourages thinking in
place of the others. The golden rule is simply to listen when people
say "no"; it is more about mutual consent than about projecting
oneself onto others and assuming they are similar.
Basically, treating others as oneself. If ethics is a field with
universal and objective answers, then I think a superintelligence
will discover those answers.
I am far from sure of this. Super-competent machines could exploit
others with such a rule, and decide in your name without consulting
you. They will say "it is for your good", without listening to you.
I think,
Bruno
Jason
--
You received this message because you are subscribed to the Google
Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it,
send an email to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at https://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.
http://iridia.ulb.ac.be/~marchal/