On Tue, Jun 9, 2015 at 1:48 PM, Telmo Menezes <te...@telmomenezes.com>
wrote:

>
>
> On Tue, Jun 9, 2015 at 7:28 PM, Terren Suydam <terren.suy...@gmail.com>
> wrote:
>
>> Perhaps most superintelligences end up merging into one super-ego, so
>> that their measure effectively becomes zero.
>>
>
> Perhaps, but I'm not convinced that this would reduce its measure.
> Consider the fact that you are not an ant, even though there are apparently
> 100 trillion of them compared to 7 billion humans.
>
> Telmo.
>
>

The way I resolve that one is to assume that self-sampling requires a high
enough level of intelligence to have an ego (the 'self' in self-sampling).
An ego is what is required to differentiate the computational histories we
identify with in terms of identity and memory.

Let's say the entirety of humanity uploaded itself into a simulated
environment, and that one day the simulated separation between minds was
eradicated, giving rise to a superintelligence (just one of many paths to
superintelligence). From that moment on it would be impossible to
differentiate computational histories in terms of personal identity/memory,
so the measure goes to zero.
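
As a rough illustration (a toy sketch, not a rigorous model), suppose measure
is simply proportional to the number of computational histories that can
still be told apart by personal identity/memory; the counts below are just
the round numbers from this thread:

# Toy self-sampling count: "measure" taken here as the fraction of
# distinguishable ego-bearing computational histories in the reference
# class (an assumption made purely for illustration).

def measure(counts, which):
    """Fraction of all distinguishable ego-histories belonging to `which`,
    under uniform self-sampling over that reference class."""
    total = sum(counts.values())
    return counts[which] / total

# Before the merge, every human is a separately distinguishable ego; after
# the merge, the superintelligence contributes only one history, because
# its internal histories can no longer be told apart by identity/memory.
histories = {
    "individual_humans": 7_000_000_000,
    "merged_super_ego": 1,
}

print(measure(histories, "individual_humans"))  # ~1.0
print(measure(histories, "merged_super_ego"))   # ~1.4e-10, effectively zero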

T
