On Friday, February 22, 2013 4:54:05 PM UTC-5, John Clark wrote:

> On Fri, Feb 22, 2013 at 8:25 AM, Craig Weinberg <whats...@gmail.com> wrote:
>
>>> What do you think with, your elbow?
>>
>> my point was that you have a double standard about which brain activities represent nothing but evolutionary driven illusions
>
> Illusions? Evolutionary drive is what made you the man you are today. And interpreting a 1D signal from the eye as 3D space is as valid an interpretation as any other, and apparently Evolution has determined that particular interpretation gets the most genes into the next generation. Thus you are good at 3D visualization because your ancestors were good at it too. You come from a long line of winners; most animals never manage to reproduce, but every single one of your ancestors did.
A successful evolutionary outcome doesn't have anything to do with the veracity of the content of a signal. If someone has a delusion that their ancestors are sacred turnip people, and it causes them to plant turnips and survive a famine, that doesn't mean their belief is not a delusion. There seems to be a theme in your positions which fanatically exaggerates the importance of winning, and how winning justifies whatever distortions of the truth are required...but then when it comes to science and math, there seems to be a different standard.

>> and which ones represent an independent and absolute truth.
>
> Huh? 2+2=4 is as close to an absolute truth as I can think of. What double standard are you talking about?

What's 2+2=4 other than an electrical reaction in your brain?

> Evolution is like history, it could have been different; a very small change in the distant past could cause gargantuan changes in the present. Is that what you mean?

No. I don't know why you're going over Evolution 101 with me.

>> A signal is a sign.
>
> I can't argue with that.
>
>> A sign means that it has to be interpreted by someone or some thing.
>
> Yes, and in this case the brain is doing the interpretation, and electronic cochlear implants can create a sequence of impulses that the brain interprets as sound, and we're well on the way to doing the same thing with 3D vision.

We don't really know that the brain is doing an interpretation so much as a complex notification. The interpretation may not be local to the brain, but to the lifetime of the personal experience associated with the brain. Why would one part of the brain receive and encode information from the outside world only for another part of the brain to decode the same information for some artificial inner world? If the brain can interpret the outside world as code, surely it would remain as code - invisible, intangible, precisely transmitted information states.
>> Our experience of 3D images is not useful to the brain in any way.
>
> The 3D visualization of space would be very useful indeed if it's the most efficient way to figure out how to jump out of the way when a saber-toothed tiger lunges at you on the African savanna.

But it could not be any more efficient than no presentation at all. Absolutely, clearly, and unarguably: not possible.

>> The electronic sequences need not be interpreted at all because they are already neurological signals.
>
> That statement is nuts. To an animal without genes for interpretation a neurological signal is just a neurological signal and there would be no reason to move when a predator starts to run at it

The reason would be that they received a neurological signal to move - just like a computer does. IF TIGER = 1 THEN RUN. You really are not seeing that your legs are cut off here. It reminds me of the limbless knight in Holy Grail. You can't understand what I am talking about if you are unwilling or unable to imagine thought experiments in which the existence of consciousness *is not an option*. Once you can do that, you can see that what makes sense about consciousness in hindsight makes absolutely no sense evolutionarily. Evolution is not going to invent geometry to make data look pretty if pretty is meaningless. The data is all you need. Just like a computer doesn't need to draw triangles in a DIMM to render images for us to see.

> and the genes of that stupid animal would not make it into the next generation; but an animal with genes for constructing a 3D world would not only know to run but know the direction to run; the magnitude of course is as fast as you can.

That logic is an unfalsifiable just-so fallacy. If I asked why we have teleportation on demand, you could answer "because animals with genes for magical powers would have many advantages over those who couldn't". Don't you see that you aren't questioning consciousness?
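The "IF TIGER = 1 THEN RUN" reflex above can be sketched in a few lines of Python. This is only an illustration of the point being argued; the signal codes and action names are invented for the example, and nothing in it corresponds to a real nervous system:

```python
# Hypothetical sketch: a machine can map an incoming signal code
# directly to a motor response, with no 3D "presentation" of the
# scene anywhere in between - the data alone drives the behavior.

def reflex(signal: int) -> str:
    """Map a raw sensory code straight to an action.

    The codes (1 = tiger detected, anything else = no threat) and
    the action strings are made up for this sketch.
    """
    if signal == 1:      # IF TIGER = 1 ...
        return "RUN"     # ... THEN RUN
    return "GRAZE"       # otherwise carry on as before

print(reflex(1))  # RUN
print(reflex(0))  # GRAZE
```

The sketch just shows that a stimulus-response mapping needs no intermediate rendering step: the input code goes in, the action comes out.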
You're just taking it for granted: there is consciousness, therefore it must have evolved; if it evolved, it must have an evolutionary purpose. But consciousness violates conventional physics far more egregiously than magic. Telekinesis, precognition, levitation, time travel, invisibility...these are easy to justify through physics. Consciousness, though, has no conceivable mechanism or justification in evolution - not even a hint of a possibility.

>> If all that is needed is math, then why have anything but data in the brain?
>
> Because you need machinery to process that data.

Yes, but why does the machinery need to see the data as colors and shapes, or feel it as an itch? This is what you are not explaining - the gap between data and experience of some kind.

>> Why have geometry when you can have glorious certain digital number sequences?
>
> Because looked at with the lens of complex numbers that digital number sequence produces the qualia of geometry. Probably.

Why would it though? For what purpose? What about how? What might be the step-by-step recipe by which a number sequence produces the geometric qualia of a shape?

>> Are you talking about ion channels opening, or neurotransmitters being secreted
>
> When you talk about a car moving are you talking about the wheels turning or the axle spinning?

Then you aren't talking about a wire between eyes and brain, you are talking about the whole head.

>> What's this about being adjectives though?
>
> You are the way matter behaves when it is organized in a Craigweinbergian way.

I don't disagree, but try this: my body is the way matter behaves when it is reflecting/representing a Craigweinbergian time.

>> Geometry is a zombie. [...] Geometry cannot be derived from math alone, and neither can color, sound, touch, thought, or feeling.
> It must be grand being a "hard problem" theorist because it's the easiest job in the world bar none; no matter how smart something is you just say "yeah, but it's not conscious" and there is no way anybody can prove you wrong.

That's the third time you've said that. I guess you have adopted it as your new mantra, like 'cannot understand ASCII text' and 'free will'. You might consider that if you can't prove the statement wrong, there is just the tiniest possibility that it could be because it's true.

>>> Behavior is the only thing that determines success in life, and it doesn't matter if "success" means making money or making friends or avoiding predators or catching prey;
>>
>> Those things only mean success if you have meaning to begin with.
>
> Evolution has always had a very clear idea what success means: getting genes into the next generation. With humans, Evolution has determined that the best way to do that is with a large brain, because that produces intelligence.

Eh, evolution of the vehicle is really only of marginal utility to non-scientists. Within the experience of the individual, the qualia of significance is even more of a driving force for a person than survival.

>> the most valuable thing of all, surely, is sense itself.
>
> To you, to me too, but not to Evolution; and yet Evolution has produced consciousness at least once and probably many billions of times; therefore the conclusion is unavoidable: consciousness MUST be a byproduct of something that IS valuable to Evolution.

Only if you assume that evolution is possible without consciousness in the first place, in which case there could not possibly be an opportunity for consciousness to appear. You are right about evolution not valuing sense, though; that's because chance and teleonomy are driven by consequence - the flip side of choice/teleology.
>> let's say that we have an AI which will pass the Turing Test
>
> Then I'd say the AI is certainly intelligent, and I would estimate that the probability it is also conscious is the same probability that you are conscious, and that is pretty high.
>
>> and we have a video simulator as well which has digitized every photograph and film of John Wayne and can produce CGI movies that pass the Scorsese Test. Is the whole system now John Wayne?
>
> If it behaves in exactly the same way that John Wayne would have behaved (and not one of John Wayne's characters) in those circumstances, then yes, that would be John Wayne, because it would be matter behaving in a Johnwayneian way. However, as a practical matter I have no idea how you could determine that is what John Wayne would have done, so the Scorsese Test is of little use.

But you couldn't determine that it is not what John Wayne would do or say either. Regardless, you are saying that you could bring John Wayne back from the dead just by doing a perfect impersonation of him. If you don't see why that isn't realistic, then I just have to accept that you live in a different world from me entirely. John Wayne is dead. It doesn't matter if your puppet or video simulation is uncanny beyond all scrutiny. Your view is impossibly naive. It is stating, essentially, that pretending is impossible - that a sufficiently convincing liar can only be telling the truth.

>> if he starts saying how he's been resurrected by a computer and now lives again in movies? Is he telling the truth?
>
> When he says he's been resurrected he is telling the truth; when he says he is again living in the movies he is not. John Wayne is not and has never lived in the movies.

But maybe he does now, since you can talk to him and he responds from inside a movie. When you log into his movie in the middle of the night, you'll see and hear him snoring in John Wayne's voice on his bed in some old West ranch house.
>>>> Why didn't I become a living being by myself?
>>>
>>> Because you lack the ability.
>>
>> How do you know though?
>
> Because I strongly suspect that you are not God.

So you don't know.

>> I don't have a dog in this fight at all.
>
> I flat out do not believe that. No sentient being would advance the incoherent and astonishingly weak arguments that you have unless they passionately wished for them to be true.

Not at all. I don't care what happens to be true; I'm just interested in understanding what it is that's true in the big picture.

>> computer simulations and services are so universally empty and non-sentient.
>
> It must be grand being a "hard problem" theorist because it's the easiest job in the world bar none; no matter how smart something is you just say "yeah, but it's not conscious" and there is no way anybody can prove you wrong.

I was right, I see, about your new mantra. Each time you use it I know it's because you have no argument. You know that no computer product has ever been anything but impersonal, but you adopt this sophistry and promissory computationalism because it's what you desperately want to believe. Super computer man will triumph over puny humans!! Just you wait and see!!!

Craig

> John K Clark

--
You received this message because you are subscribed to the Google Groups "Everything List" group.
Visit this group at http://groups.google.com/group/everything-list?hl=en.