Re: Uploaded Worm Mind
On 9/4/2015 7:35 AM, Bruno Marchal wrote:
On 03 Sep 2015, at 20:26, meekerdb wrote:
On 9/3/2015 8:35 AM, Bruno Marchal wrote:
On 02 Sep 2015, at 22:48, meekerdb wrote:
On 9/2/2015 8:25 AM, Bruno Marchal wrote:

So now you agree with me that there are different kinds and degrees of consciousness; that it is not just a binary attribute of an axiom + inference system.

? Either you are conscious, or you are not.

But is a roundworm either conscious or not? An amoeba?

I don't know, but I think they are. Even bacteria, and perhaps even some viruses, but on a different time scale than us.

If they can be conscious, but not self-conscious, then there are two kinds of "being conscious".

Yes, at least two kinds, but each arithmetical hypostasis having either "<>t" or "& p" describes a type of consciousness, I would say. And they all differentiate on the infinitely many versions of "[]A", be it the "[]" predicate of PA, of ZF, of an amoeba, or of you and me ...

So if there are different kinds of consciousness, then a being with more kinds is more conscious. It seems that your dictum, "You're either conscious or not," is being diluted away to a mere slogan.

There are basically two levels, without a criterion of decidability, but with a simple operational definition: 1) something is conscious if it is torturable, and it is arguably ethically wrong to do so.

So when Captain Segura tells Wormold that he's "not of the torturable class" he means he's not conscious. :-)

You might need to give some references here, I'm afraid.

It's from "Our Man in Havana" by Graham Greene. Only poor Cubans are in the torturable class, not Englishmen.

How is this an operational definition? What is the operation to determine whether a being is torturable?

You make the torture public, and if you are sent to jail, the entity is conscious, at least in the 3-1 view of the people you are living with.

You mean the people who sent me to jail are conscious, i.e. they have empathy, which implies they are conscious. But that doesn't really solve the problem. They might just be pretending empathy. And it doesn't help with my design of a Mars Rover. Will it be conscious only if I program it to show empathy when another Mars Rover is tortured? Does a jumping spider show empathy when a fly is tortured, or only when another jumping spider is tortured?

I think all invertebrates are already at that level, and in arithmetic that might correspond to Sigma_1 completeness (Turing universality). Robinson Arithmetic, the universal dovetailer, are at that level. 2) Something is self-conscious if it is Löbian: basically, it is aware of its unnameable name. PA and ZF are "at that level", like all their sound recursively enumerable extensions. At that level, the entity is able to ascribe consciousness to another, and can get the moral understanding of good and wrong (with or without a forbidden fruit).

What's the operation to determine it is aware of its unnameable name?

OK, you torture a fellow, now, and all the people complaining about this can be said to have the ability to ascribe consciousness to others. In principle you have to repeat this often, to avoid the partial-zombie case. The criteria are operational in the weak sense of making the statement plausible, as we know already that there is no definite criterion for consciousness. We might not be able to convince an alien about this.

Essentially you are saying: just rely on your intuition about what's conscious and what's not. But as Scott Aaronson points out, we seek a *theory* of consciousness that we can apply to machines and aliens, where our intuition doesn't work.

Brent

-- You received this message because you are subscribed to the Google Groups "Everything List" group. To unsubscribe from this group and stop receiving emails from it, send an email to everything-list+unsubscr...@googlegroups.com. To post to this group, send email to everything-list@googlegroups.com. Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.
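[Editor's note: the two "levels" invoked above correspond to standard arithmetical facts; for reference, their textbook statements (these formulations are supplied here, not given in the thread):]

```latex
% Level 1 (Sigma_1 completeness, i.e. Turing universality):
% Robinson Arithmetic Q proves every true Sigma_1 sentence, i.e. every
% true sentence of the form \exists y\,\varphi(y) with \varphi bounded:
\text{if } \mathbb{N} \models \sigma \text{ and } \sigma \in \Sigma_1,
\text{ then } Q \vdash \sigma.

% Level 2 (Löbianity): the provability predicate \Box of a sound
% recursively enumerable extension of PA satisfies Löb's theorem:
\vdash \Box(\Box p \to p) \to \Box p
```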
Re: 1P/3P CONFUSION again and again
On Fri, Sep 4, 2015 at 10:53 AM, Bruno Marchal wrote:

>> If arithmetic is more fundamental than physics as you say then we should be able to write a program that would get the computer wet, and yet we can't, and your theory cannot give an adequate explanation of why not.

> you need to define what you mean by wet.

No I most certainly do not need to do that! Any definition of wet that I give would be made of words, and I have no doubt you would then demand another definition of at least one of those words, which I could only provide with yet more words, and round and round we go. It would be much better if I gave an example, not a definition; it would be much better if I threw a bucket of water at you, then pointed at you and said "wet".

>> Computationalism postulates that the computations a *PHYSICAL* computer produces can create intelligent behavior and consciousness, but computationalism does *NOT* postulate that computations exist in arithmetic independent of physics.

> The fact that computations exist in arithmetic is a trivial theorem.

You keep saying that, and yet in spite of the fact that it would be trivial for you to do so, you have been unable to explain why you have not started The Marchal Computer Hardware Company, and you have been unable to explain why you are not a trillionaire.

>> Show me an example of arithmetic all by itself making a calculation and you have won this argument; not a definition, not a proof, an *EXAMPLE*. Stop talking about it and just show me!

> google a bit more on "Kleene predicate"

I don't want to google "Kleene predicate", and I don't want another "proof", and I don't want a definition!!! I want an *EXAMPLE*; I want to see you or anybody or anything else calculate 2 + 2 without using matter!

> Or read any textbook, or Gödel's original paper

I don't want to read any textbook, and I don't want to read Gödel's original paper! I want an *EXAMPLE*; I want to see you calculate 2 + 2 without using matter!

>> Yes "you" will survive provided that "you" is defined as somebody who remembers being a man in Helsinki,

> But that is ambiguous, because if the guy (who remembers being the man who was in Helsinki) is now in both cities,

YES, and that is exactly precisely why asking which one and only one city "you" will see, in a world with "you"-duplicating machines in it, is not a question at all; it is gibberish.

> You continue to introduce an ambiguity by ignoring the 1p/3p difference,

In the entire history of the world nobody, absolutely nobody, has ignored the difference between 1p and 3p.

> we must still take into account the content of the 1p experiences,

Whose 1p experience? Mr. You's. And who is Mr. You? The guy with THE 1p experience. And round and round we go.

> ignoring that when your body is in two places, all your possible subjective experiences' content mention only one place.

Whose subjective experiences are only in one place? Mr. You's. And who is ...

John K Clark
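[Editor's note: the disputed fact "2 + 2 = 4" does have a matter-free formal derivation; with 2 written as s(s(0)), the recursion axioms for addition in Robinson Arithmetic (x + 0 = x and x + s(y) = s(x + y)) give:]

```latex
\begin{aligned}
s(s(0)) + s(s(0)) &= s\bigl(s(s(0)) + s(0)\bigr)   && \text{by } x + s(y) = s(x + y)\\
                  &= s\bigl(s(s(s(0)) + 0)\bigr)   && \text{by } x + s(y) = s(x + y)\\
                  &= s(s(s(s(0))))                 && \text{by } x + 0 = x
\end{aligned}
```

Whether a derivation counts as the requested "example" is, of course, exactly what the thread disputes.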
Re: 1P/3P CONFUSION again and again
On 03 Sep 2015, at 18:56, John Clark wrote:

>> Just one remark: we cannot make a piece of matter wet in arithmetic

> I know, but why not? If arithmetic is more fundamental than physics as you say then we should be able to write a program that would get the computer wet, and yet we can't, and your theory cannot give an adequate explanation of why not.

To do this, you need to define what you mean by wet. If it involves primary matter, then the theory explains why arithmetic cannot produce that, as there is no primary matter. If you define wet without using primary matter, then it depends on the definition you will give of wet.

>> but once we postulate computationalism, we can prove that all the pieces of computations leading to the first person experience of feeling wet, or of quenching your thirst, exist in arithmetic

> Computationalism postulates that the computations a PHYSICAL computer produces can create intelligent behavior and consciousness, but computationalism does NOT postulate that computations exist in arithmetic independent of physics.

The fact that computations exist in arithmetic is a trivial theorem. You don't need to assume the "Yes Doctor" part of computationalism; either Church's thesis, or Church's definition of computation, is enough to prove that.

> Show me an example of arithmetic all by itself making a calculation and you have won this argument; not a definition, not a proof, an EXAMPLE. Stop talking about it and just show me!

This should help; if not, google a bit more on "Kleene predicate": https://en.wikipedia.org/wiki/Kleene%27s_T_predicate Or read any textbook, or Gödel's original paper, which does this for an important subclass (the primitive recursive functions), and then you can conceive the generalization.

>> I will please you and not use pronouns

> Bruno Marchal just did.

>> someone asked JC, before the duplication, what do you expect to live. JC remarked that "you" is ambiguous. Oh, but you agreed that you will survive,

> And JC responded: "Yes, 'you' will survive, provided that 'you' is defined as somebody who remembers being a man in Helsinki,"

But that is ambiguous, because if the guy (who remembers being the man who was in Helsinki) is now in both cities, it is still true that both the 3-he's feel to be in one city from the 1p view. You continue to introduce an ambiguity by ignoring the 1p/3p difference, where we have insisted that we have to make it to address the question asked.

> but if that personal pronoun is defined in some other way

That never happens. But once we have defined it, we must still take into account the content of the 1p experiences, given that the question bears on that (future) content.

> or, as often happens on this list, not defined at all, then JC might have a different answer to "will you survive", or have no answer at all, because gibberish has no answer.

That is eminently true, but you are the one making the question gibberish, by ignoring that when your body is in two places, all your possible subjective experiences' content mentions only one place.

>> so you expect to live some experience, no?

> Explain what that GODDAMN personal pronoun "you" means and JC will provide an answer! Bruno's "I will please you and not use pronouns" promise sure didn't last long.

It did, as "you" is used before the duplication, and you have agreed there is no ambiguity at that moment. Try better, as you repeat the same old stuff, which has long since been debunked by everyone.

Bruno

http://iridia.ulb.ac.be/~marchal/
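[Editor's note: the Wikipedia link above states Kleene's T predicate abstractly. As a toy, hedged sketch (a simplified one-register machine and ad hoc encodings, not Kleene's actual Gödel numbering), the relation "program e, run on input x, has halting computation trace t" can be expressed using nothing but arithmetic on natural numbers:]

```python
import math

# Toy sketch of Kleene's T predicate.  The names T, U and the pairing
# trick follow the standard construction, but the machine model and the
# encodings here are simplified illustrations.

def pair(a, b):
    """Cantor pairing: a bijection from N x N to N."""
    return (a + b) * (a + b + 1) // 2 + b

def unpair(n):
    """Inverse of pair, in closed form."""
    w = (math.isqrt(8 * n + 1) - 1) // 2
    b = n - w * (w + 1) // 2
    return w - b, b

def encode(seq):
    """Encode a finite list of naturals as one natural (0 = empty list)."""
    code = 0
    for s in reversed(seq):
        code = pair(s, code) + 1
    return code

def decode(code):
    """Inverse of encode."""
    seq = []
    while code:
        s, code = unpair(code - 1)
        seq.append(s)
    return seq

# Machine: a program is a list of instructions (0 = INC, 1 = DEC) acting
# on one register r; it halts when the program counter runs off the end.

def step(prog, pc, r):
    """One deterministic step from state (pc, r)."""
    return (pc + 1, r + 1) if prog[pc] == 0 else (pc + 1, max(r - 1, 0))

def T(e, x, t):
    """Kleene-style predicate: 't encodes a halting run of program e on
    input x'.  Note it is a decidable relation on plain naturals."""
    prog = decode(e)
    states = [unpair(s) for s in decode(t)]
    if not states or states[0] != (0, x):
        return False
    for (pc, r), nxt in zip(states, states[1:]):
        if pc >= len(prog) or step(prog, pc, r) != nxt:
            return False
    return states[-1][0] == len(prog)  # final state has halted

def U(t):
    """Result extraction: the register value in the final state."""
    return unpair(decode(t)[-1])[1]

# "2 + 2" as an arithmetical fact: the program [INC, INC] run on input 2
# has the trace (0,2) -> (1,3) -> (2,4); T certifies the trace as a
# genuine halting run, and U reads off the answer.
e = encode([0, 0])
t = encode([pair(0, 2), pair(1, 3), pair(2, 4)])
print(T(e, 2, t), U(t))  # True 4
```

In the real construction T is primitive recursive, so "e halts on x with output y" becomes the Sigma_1 statement "there exists t such that T(e, x, t) and U(t) = y" — which is what "computations exist in arithmetic" is taken to mean.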
Re: Uploaded Worm Mind
On 03 Sep 2015, at 20:26, meekerdb wrote:
On 9/3/2015 8:35 AM, Bruno Marchal wrote:
On 02 Sep 2015, at 22:48, meekerdb wrote:
On 9/2/2015 8:25 AM, Bruno Marchal wrote:

So now you agree with me that there are different kinds and degrees of consciousness; that it is not just a binary attribute of an axiom + inference system.

? Either you are conscious, or you are not.

But is a roundworm either conscious or not? An amoeba?

I don't know, but I think they are. Even bacteria, and perhaps even some viruses, but on a different time scale than us.

If they can be conscious, but not self-conscious, then there are two kinds of "being conscious".

Yes, at least two kinds, but each arithmetical hypostasis having either "<>t" or "& p" describes a type of consciousness, I would say. And they all differentiate on the infinitely many versions of "[]A", be it the "[]" predicate of PA, of ZF, of an amoeba, or of you and me ...

So if there are different kinds of consciousness, then a being with more kinds is more conscious. It seems that your dictum, "You're either conscious or not," is being diluted away to a mere slogan.

There are basically two levels, without a criterion of decidability, but with a simple operational definition: 1) something is conscious if it is torturable, and it is arguably ethically wrong to do so.

So when Captain Segura tells Wormold that he's "not of the torturable class" he means he's not conscious. :-)

You might need to give some references here, I'm afraid.

How is this an operational definition? What is the operation to determine whether a being is torturable?

You make the torture public, and if you are sent to jail, the entity is conscious, at least in the 3-1 view of the people you are living with. I think all invertebrates are already at that level, and in arithmetic that might correspond to Sigma_1 completeness (Turing universality). Robinson Arithmetic, the universal dovetailer, are at that level. 2) Something is self-conscious if it is Löbian: basically, it is aware of its unnameable name. PA and ZF are "at that level", like all their sound recursively enumerable extensions. At that level, the entity is able to ascribe consciousness to another, and can get the moral understanding of good and wrong (with or without a forbidden fruit).

What's the operation to determine it is aware of its unnameable name?

OK, you torture a fellow, now, and all the people complaining about this can be said to have the ability to ascribe consciousness to others. In principle you have to repeat this often, to avoid the partial-zombie case. The criteria are operational in the weak sense of making the statement plausible, as we know already that there is no definite criterion for consciousness. We might not be able to convince an alien about this.

But the content of the consciousness can be extremely variable, and then there are many different types of consciousness states. By incompleteness, the machine's psychology is transfinitely rich. The first person self is not a machine, from the machine's first person perspective. Machines are naturally non-computationalist, and the origin of consciousness is plausibly more on the side of truth than on the side of representation.

Bruno

http://iridia.ulb.ac.be/~marchal/