On 19 Sep 2015, at 19:41, John Clark wrote:

On Sat, Sep 19, 2015  Bruno Marchal <marc...@ulb.ac.be> wrote:

> You need an infinitely expanding brain to live for an eternity; if not, you will cycle.

To prevent cycling a mind wouldn't need to be infinite, just unlimited: whenever you start to run low you just add more memory banks. But at ANY given time the mind would be of only finite size.

So we agree.
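The cycling claim has a precise underpinning: any deterministic system confined to a fixed finite state space must, by the pigeonhole principle, eventually revisit a state and then cycle forever, whereas a system allowed to keep growing its state space need not. A minimal sketch in Python (the toy update rules are invented purely for illustration):

```python
def find_cycle_length(step, initial_state):
    """Iterate a deterministic update rule until a state repeats.

    For any deterministic map on a FINITE state set this always
    terminates: by the pigeonhole principle some state must recur,
    and from then on the trajectory cycles forever.
    """
    seen = {}          # state -> time of first visit
    state, t = initial_state, 0
    while state not in seen:
        seen[state] = t
        state = step(state)
        t += 1
    return t - seen[state]  # length of the cycle that was entered


# A toy "mind" with only 100 possible states: guaranteed to cycle.
finite_step = lambda s: (s * 7 + 3) % 100
print(find_cycle_length(finite_step, 0))  # → 4 (0 → 3 → 24 → 71 → 0)

# A "mind" that keeps adding memory never revisits a state, so no
# cycle exists; find_cycle_length would simply never terminate on it.
growing_step = lambda s: s + 1
```

The unbounded case is exactly the "just add more memory banks" move: the state space stays finite at every instant, but since it never stops growing, no state need ever recur.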


> You might count that as a kind of immortality though

Some might, but I wouldn't.

Oh, we agree on this too. Of course the machine can have a non-cycling extension somewhere in arithmetic.



>>> you will eventually die from some accident.

>> That's why you'd need lots of backup copies stashed in lots of different places, and with nanotechnology that would be easy.

> Only if you suffer from some local attachment.

But you don't; otherwise you'd become a different person every time you cross the room. And what exactly are your spatial coordinates: the place your brain is at, or the place you are thinking about, that is to say the place you seem to be? Consciousness is all about seeming to be, so I'd guess the second.

>> You will never die of illness if you have the ability to ensure that the atoms in your body always remain in their correct orientation, and with nanotechnology that would be easy.

> You need to bet on some level of description,

You need to know how generic atoms should be placed in relation to other generic atoms.

That is a low level, but in principle we cannot be sure; maybe we need the string level. You are just telling me your choice of level, but other people can make a different choice.





> then it is "easy" in a very large sense of the term.

And nanotechnology is the ability to move individual atoms to different positions relative to other individual atoms. Doing this is easy scientifically, since no new physics is required, but it is not easy technologically, at least not yet.

>> Uploading is very well defined

> Once we bet on a description level.

What other sort of bet did you have in mind? If you know the relative position of all the generic atoms in something, there is no scientific reason you can't make a second one with generic atoms, although you may encounter technological difficulties.

Quite plausible, but from a logical point of view, we cannot claim to be sure.




> I would ask that they take into account the glial cells.

What difference does that make? Glial cells are made of the same 20 different types of organelles as neurons, and as all the other 200 different types of cells in the human body; and the only difference between one of those 20 types of organelles and another is the relative position that generic atoms have with respect to other generic atoms.


Yes, right: if you copy yourself at the atomic level, you will get the glial cells right. But some neurophysiologists have claimed (they are changing their minds on this right now) that the glial cells have no other role than being some stuff to protect the neurons, and so a doctor might, for reasons of economy, copy only the neurons. Today, evidence accumulates that glial cells do communicate, with each other and with the neurons, and that they might have some role in chronic pain.

But we agree: if you choose the atomic level, that is very plausibly a good low (and thus expensive) level.



>> it's just not achievable yet for technological, not scientific or philosophical, reasons: uploading is the functional equivalent of a biological brain in electronic form.

> Assuming computationalism

And only a fool would not make that assumption.


Why? Mathematically, we can conceive a transfinity of weakenings of comp. I agree that "omega-computationalism" (the usual one) is much more plausible, and that there is no evidence for a weaker form of it, but when we do a theory, we put *all* the assumptions on the table. That includes ideas like the fact that ((A & B) -> B), or that x + 0 = x, and of course computationalism is a much less obvious assumption than the preceding ones.
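Those two "preceding" assumptions really are at the trivial end of the scale; in a proof assistant such as Lean, for instance, each is a one-line proof (a sketch, just to make the contrast with computationalism vivid):

```lean
-- (A ∧ B) → B: the second conjunct follows immediately
example (A B : Prop) (h : A ∧ B) : B := h.2

-- x + 0 = x holds by definition for the natural numbers
example (x : Nat) : x + 0 = x := rfl
```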

To be sure, the G/G* theology remains correct for a vast variety of "non-machines", which also live in arithmetic, and I am not sure whether they can access a different "physics" than the omega-machine physics. But we can do the math and compare with nature.
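For readers unfamiliar with the notation: G here is usually identified with the Gödel-Löb provability logic GL, reading \Box p as "p is provable", and G* as the logic of what is true (rather than provable) about the machine. Schematically:

```latex
% Löb's axiom, the characteristic schema of G (= GL):
\Box(\Box p \to p) \to \Box p
% G* adds the reflection schema, true of the machine
% but not provable by the machine itself:
\Box p \to p
```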


> With a bad choice, someone can believe they have survived "rather well" for some weeks, and then realize that there is a problem: the long-term memory is not handled well,

If long-term memory, or short-term memory, or anything else is not working well, then generic atoms have not been placed in the correct orientation relative to other generic atoms. And the exact same thing happens when your computer is not working well, or your can opener for that matter.

You can wish that, but you cannot pretend to know it. Maybe the brain needs dark matter. I don't find that plausible, but some time ago I would not have believed that dark matter was possible. But of course, you can make such an assumption and say "yes" to the doctor; you just can't impose this on another, and treating him as a fool does not ring right to me.






> or a feeling that something is different

The cause of that different feeling could only be that something IS indeed different: the arrangement of generic atoms must be different. Correct that error, put things where they are supposed to go, and that unpleasant different feeling will go away.

> but they can't figure it out,

Then they're dumb.

...or they have understood something that we have not (yet) understood ourselves. I mean that *certainty* in such matters is... a priori dumb.

Bruno





 John K Clark



--
You received this message because you are subscribed to the Google Groups "Everything List" group. To unsubscribe from this group and stop receiving emails from it, send an email to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.

http://iridia.ulb.ac.be/~marchal/



