On Sunday, February 17, 2013 1:11:05 PM UTC-5, Bruno Marchal wrote:
>
>
> On 15 Feb 2013, at 22:14, Craig Weinberg wrote:
>
>
>
> On Thursday, February 14, 2013 11:20:12 AM UTC-5, Bruno Marchal wrote:
>>
>>
>> On 13 Feb 2013, at 23:37, Stephen P. King wrote, to Craig Weinberg
>>
>> Baudrillard is not talking about consciousness in particular, only the 
>> sum of whatever is in the original which is not accessible in the copy. His 
>> phrase 'profound reality' is apt though. If you don't experience a profound 
>> reality, then you might be a p-zombie already.
>>
>>
>>
>>     Right!
>>
>>
>>
>> Right?
>>
>> Here Craig is on the worst slope. It looks almost like "if *you* believe 
>> that a machine is not a zombie, it means that you are a zombie yourself".
>>
>
> No, I was saying that if you don't believe that your own experience is 
> profoundly real, then you are a zombie yourself.
>
>
> I remain anxious because you seem to believe that a computer cannot 
> support a profoundly real personal experience.
>

I don't think that it can unless it is made of living beings, who are the 
baton holders, if you will, of a biological history grounded in the 
catastrophe of vulnerability that those experiences are composed of.

The bits of the computer which are not assembled - the silicon and plastic 
substance - do have an experience, but not as a person or animal or even a 
bacterium. Without that history being embodied physically, I don't expect 
that a computer has any resources to draw upon with which to feel 'profound' 
realism in the way that we and other animals feel it. The sense is that 
vegetables do not have the same sort of realism in their experiences as 
animals do when we kill them and eat them, and even if that is untrue, our 
humanity and sanity may depend on believing the lie on some level. I think 
that it is probably not a lie though, and our intuition is not completely 
wrong about the sliding scale of quality in the natural world. We don't see 
the vegetable equivalent of primates. Maybe there's a reason?


>
>
>  
>
>>
>> They will persecute the machines and the humans who hold a different 
>> opinion altogether.
>>
>> Craig reassures me: he is willing to offer steak to my son-in-law (who got 
>> an artificial brain before marriage).
>>
>> But with Baudrillard, not only might my son-in-law no longer get his 
>> steak, but neither would my daughter! Brr...
>>
>
> Hahaha. How about your son-in-law gets a simulation of steak which is 
> beneath his substitution level?
>
>
> He will be completely satisfied. Thanks for him.
>  
>

>
>
> Even better, I just hack into his hardware and move one of his memories of 
> eating steak up on the stack so it seems very recent. 
>
>
> Again, he will be completely satisfied. But my daughter will be sad, as 
> she wants to enjoy eating the meal together with him. 
>


It's good that your position is consistent. Why have a universe at all, 
though? Why not just have a memory of it?
 

>
>
>
>
> Is your son-in-law racist against simulated steaks as memory implants?
>
>
>
> Not at all. Since he got an artificial brain, he has already uploaded many 
> entire lives from the CGSN (Cluster-Galactica-Super-Net), and I have to ask 
> him to restrain himself, as I am the one paying the bill :)
>
> You know, in 43867 AD, they will succeed in recovering the 
> brain-state of any human who has existed, just by looking at the tiny 
> actions of their brain on the environment. We always leave traces.
> You will be downloaded, for the first time, in 44886, for example. It is bad 
> news, as all the humans who existed before 33000 (+/-) will be freely 
> downloadable. After that date, most humans will have sophisticated quantum 
> keys protecting them from such possible futures. That is why some researchers 
> will say that, with comp, we have the solution to who goes to hell and who 
> goes to heaven: all humans who lived before 33000 go to hell, and all the 
> infinitely many others go to heaven. Of course this is still a rather gross 
> simplification, and it concerns only the minority who want to explore and 
> pursue the Samsara exploration.
>

Nice. Or maybe by 2200 we can just simulate the brain state of someone who 
would be alive in that era and save ourselves 30 or 40 thousand years. 

Craig


> Bruno
>
>
>
>
> Craig
>
>
>> Bruno
>>
>>
>>
>> http://iridia.ulb.ac.be/~marchal/
>>
>>
>>
>>
>
> http://iridia.ulb.ac.be/~marchal/
>
>
>
>
