On 17 Sep 2015, at 17:53, John Clark wrote:


On Wed, Sep 16, 2015  Brent Meeker <meeke...@verizon.net> wrote:

>> If you knew you were immortal, why on earth would you be risk averse?

> Not having a built-in biological life span is very different from being immortal. Immortal means you can't die.

Immortality isn't a deep concept; it just means making sure the atoms in your biological brain, or its functional equivalent, always remain in their correct orientation. Immortality is simply a matter of maintaining organization, and with nanotechnology that would be easy.

You need an infinitely expanding brain to live for an eternity; if not, you will cycle. You might count that as a kind of immortality, but then we must introduce distinctions and make more precise what we mean by the term.
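The cycling point is just the pigeonhole principle: a deterministic system with finitely many states, run forever, must revisit a state and then repeat the same loop from there on. A toy sketch (the "brain" here is a made-up ten-state map, not a claim about real brains):

```python
def find_cycle(step, start):
    """For a deterministic map on a finite state space, return
    (steps_before_cycle, cycle_length)."""
    seen = {}            # state -> first time it was visited
    state, t = start, 0
    while state not in seen:
        seen[state] = t
        state = step(state)
        t += 1
    # First repeated state marks where the eternal cycle begins.
    return seen[state], t - seen[state]

# A "brain" with only 10 possible states cannot help but cycle.
tail, period = find_cycle(lambda s: (3 * s + 1) % 10, start=2)
```

With only finitely many states, no choice of update rule or starting point can avoid this; only an unboundedly growing state space escapes it.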






> you will eventually die from some accident.

That's why you'd need lots of backup copies stashed in lots of different places, and with nanotechnology that would be easy.
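The backup argument can be made quantitative, under the (strong) assumption that the copies fail independently: if each copy is destroyed in some period with probability p, the chance of losing all n at once is p**n, which falls off exponentially. The numbers below are made up for illustration:

```python
def prob_total_loss(p, n):
    """Probability that all n copies are destroyed in the same period,
    assuming each is lost independently with probability p."""
    return p ** n

# Even a risky 10% per-copy loss rate becomes negligible with 5 copies.
risk = prob_total_loss(0.1, 5)   # 0.1**5, i.e. one in a hundred thousand
```

The independence assumption is doing all the work: copies stashed where a single disaster (or a single software bug) can reach them all do not multiply this way.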


Only if you suffer from some local attachment. If not, "you" are already distributed in infinitely many "regions" of the tiny sigma_1 reality.





> or illness.

You will never die of illness if you have the ability to ensure that the atoms in your body always remain in their correct orientation, and with nanotechnology that would be easy.

You need to bet on some level of description; then it is "easy" only in a very broad sense of the term. But that will be done, no doubt, at many different levels, and some people will treat higher-level people as zombies, etc.






> "Uploading" isn't some well-defined process

Uploading is very well defined,

Once we bet on a description level. I would ask that they take into account the glial cells. I would also ask for a package for the nervous system in the belly, which is denser than we thought.




it's just not achievable yet, for technological rather than scientific or philosophical reasons: uploading is the functional equivalent of a biological brain in electronic form.


Assuming computationalism and, in practice, the correctness of the choice of the substitution level.

With a bad choice, someone can believe they have survived "rather well" for some weeks, and then realize that there is a problem: the long-term memory is not handled well, or there is a feeling that something is different which they can't quite pin down, etc. It can be like an altered state of consciousness, and there are infinitely many possibilities.




> You could be "uploaded" today by having a team of people research your appearance, personality, thinking, preferences, speech, etc., and incorporating them into a computer program with sensory inputs and some Watson-like AI. It would produce a Max Headroom-like John Clark who would continue to berate Bruno for his use of pronouns and other signs of intelligence.

Intelligent behavior is a much deeper property than consciousness, so if it's got John Clark's intelligence (or better), that's good enough for me.


lol





> Would it be conscious? ...who knows.

That is nothing new; that is the same sort of uncertainty every human being who has ever lived must face. Was the original John Clark conscious? Only the original John Clark knows for sure.

> Would it be recognizably John Clark? ...sure.

That is good enough for me.


Hmm... you betray your attraction to some first-person elimination.
At least that is coherent with your belief in primary matter.


Bruno




John K Clark







--
You received this message because you are subscribed to the Google Groups "Everything List" group. To unsubscribe from this group and stop receiving emails from it, send an email to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.

http://iridia.ulb.ac.be/~marchal/

