On Sep 6, 1:16 pm, Bruno Marchal <marc...@ulb.ac.be> wrote:
> On 06 Sep 2011, at 16:47, Craig Weinberg wrote:
>
> > On Sep 6, 3:13 am, Bruno Marchal <marc...@ulb.ac.be> wrote:
> >> On 06 Sep 2011, at 02:26, Craig Weinberg wrote:
>
> >>> When you say that mechanism explains qualia almost completely, are
> >>> you
> >>> talking about the 1-p (plural) sequestering of it, the non
> >>> computability of it, or is there something else? Does this mechanism
> >>> rely on the idea that meaning is transferred from something like a
> >>> person to a machine purely by a machine 'acting like' a person seems
> >>> to act?
>
> >> No. You need to duplicate the behavior of the components of the
> >> machine at some right level of substitution.
> >> "acting like" might be enough, but is somewhat hard to define.
>
> > What accounts for substitution level?
>
> It is the level at which your local constituents can be replaced by
> digital devices without changing your private experience.

That doesn't account for the phenomena; it just defines the meaning of
the term.

> Of course
> only God knows it.

What accounts for that? Why should this factor be completely
inscrutable if it's a natural function of arithmetic?

> The account of the experiencer (of the
> substitution) does not count, as he may suffer from anosognosia.

Accounts of non-experiencers equally do not count as they may suffer
from HADD/prognosia.

>
> > Is it a hard threshold whereby
> > Pinocchio becomes a real boy suddenly, or is it a gradient of
> > escalating qualia?
>
> By definition, if the reconstituted person witnesses some new feelings,
> like having lost something, or having a headache, or whatever, it
> means that the level of substitution was not well chosen. People will
> accept such a substitution when their friends report only slight
> secondary phenomena, like brief nausea or something.

But what is the threshold at which a reconstituted person feels
anything at all? Is it a sudden instantiation of fully formed
awareness in a machine, or does the machine individually activate and
gradually integrate autonomous modules of quasi-awareness into a
psyche?

>
> > Either way seems insufficient for the same reason
> > that vanishing or absent qualia seem unlikely.
>
> Why? Chalmers makes clear that what would be astonishing is fading
> qualia with no change in behavior.

Because qualia appearing without some self-generated behavioral
precursor would be just as astonishing.

> Absent qualia and vanishing
> sensations already occur in many consciousness pathologies, generally
> due to brain disorders, as with Alzheimer's.

Right, but human sensations do not seem to spontaneously appear in
inorganic phenomena. There has never been a computer which suddenly
expressed fear of being turned off, nor has there been any sign that a
computer will ever evolve by itself into something that could behave
that way.

> If the copy of the
> brain is too coarse, the survivor might lose a lot. Now, a "pro-life"
> surgeon might well give a very coarse digital brain to someone, without
> their consent, by arguing that the life of his patient is sacred
> (instead of the more computationalist notion of *quality* of life).
> Fading qualia does not apply here.
>
> > If Pinocchio
> > spontaneously opens his eyes one day as a fully realized human being,
> > that would pose odd subjective problems (does he project a simulated
> > history in his memory, or does he know that he came into existence
> > today but knows everything about the world and his own life?)
>
> UDA illustrates the comp answer to all such questions. The memory of
> the past is always a construction of the current brain. What counts
> are all the logico-arithmetical relations encoded in the locally
> genuine machinery.

So he would never know that he was just born. I suppose there's
precedent for that kind of thing in hypnotic suggestion, etc. I think
that sense has a way of differentiating tangible experience from
memory or hallucination, even though our conscious cognitive version
of that can be compromised. I think there is a fundamental difference
between simulation and genuine experience, and that it is rooted
neither in arithmetic nor in physics but in the connection between the
two.

>
> > or do
> > they gradually come online with morbid in-between states of tortured
> > semi-consciousness without means to express or relieve their
> > discomforts?
>
> That can happen with brain disease too. I guess the pioneer of
> immortality will not have an easy beginning in the afterlife. This is
> not even for after tomorrow.

How do you know that the arithmetic doesn't have to be run from the
beginning (conception or birth) in real time? If you grew a perfect
adult clone, it would still be a newborn infant psychologically. The
fact that the adult psyche is not passed on from mother to child in
the womb makes me think that genuine experience is required to
generate significance of a certain qualitative character.

>
> >>> You would agree though that a ventriloquist does not transfer
> >>> the ability to feel, see, and understand to his dummy, I assume, so
> >>> doesn't that mean that the difference between a wooden dummy and a
> >>> machine capable of human feeling is just a matter of degree of
> >>> complexity?
>
> >> No. The dummy should behave the same in the presence and absence of
> >> the ventriloquist. But even more, the "dummy" body should do the
> >> right computations.
>
> > To me, the computations are the ventriloquist. They are just a way for
> > the ventriloquist to save his act on disk, so that they can be
> > executed at a later time through the dummy.
>
> You confuse a particular program with a universal one, which has the
> same self-referential ability as you and me.

Our self-referential ability is an aspect of our awareness though. I
can't see a reason to assume that a universal program's self-reference
would be a form of awareness, any more than my image in the mirror is
an emulation of me.

>
> >>> If so, I think to claim that explains qualia almost
> >>> completely is not only premature, but, to my mind, somewhat
> >>> deceptive.
> >>> It's a con. (Sorry, not accusing you personally - just the
> >>> presumption
> >>> of the position).
>
> >> The theory explains why numbers develop many sorts of beliefs, some
> >> of them lived as self-referentially true but non-communicable, or
> >> non-provable. They also follow axioms or theorems in theories of
> >> qualia done independently of comp.
>
> > It sounds promising, but without an example it's oblique to me. It's
> > critical acclaim of an idea that I haven't been able to get out of the
> > intriguing packaging. Isn't there some natural language example you
> > can give me of the theory - without variables or big picture
> > generalizations?
>
> I specialize in theory and the big picture. Examples abound: look
> around you.

When I look around me I see a world that makes sense as a concrete
experience, not as an arithmetic abstraction.

>
> > Can you tell me a story about one particular number
> > and why it has developed a belief, or one particular qualia that is
> > explained by a particular computation theory?
>
> Yes. Take the life of Craig Weinberg as an example. Assuming comp,
> this illustrates a point of the theory: machines necessarily have a
> hard time conceiving that comp might be true. Comp explains why such
> an intuition is correct. In Plotinian terms, this is because some of
> our beliefs are true, or connected to the one-without-name.

Craig Weinberg used to assume comp though. What changed?

>
> >>>> I think the hard problem is 99% solved, and 100% metasolved. And
> >>>> given that the solution predicts how matter appears and behaves, the
> >>>> only thing to do to get the whole picture is to derive physics from
> >>>> self-reference/machine's theology. This might lead to a refutation
> >>>> of comp, or to a refutation of the classical theory of knowledge
> >>>> (although I doubt this is possible).
>
> >>> I think that the way it approaches the hard problem is itself
> >>> self-referential. By equating consciousness with computation to begin
> >>> with, it makes sense that computation can be used to find itself to
> >>> be the source of consciousness. To me, the facts that consciousness
> >>> is private and non-computable are the least descriptive possible
> >>> aspects of it.
>
> >> The theory explains the role of consciousness: it speeds up UMs
> >> relative to other UMs.
>
> > That concurs with my ideas too. Cumulative entanglement is a way of
> > encapsulating or recapitulating computation (sort of literally 'coming
> > to a head'), but I don't think it gets to the heart of the matter at
> > all. It doesn't address the qualitative quality of qualia.
>
> That is the whole point of the theory. Modal logic makes it possible
> to handle qualitative features, and arithmetical self-reference offers
> the (various) modal logics on a plateau, and this by distinguishing
> the communicable parts from the non-communicable parts (and even the
> first person singular parts from the first person plural parts).
>
> But with all this, it would still have been possible that those qualia
> are epiphenomenal. The point here was to explain that the theory gives
> a role (and thus a 3-functional role, of the kind capable of being
> selected by evolution) to consciousness (the quality) in the probable
> worlds/computations. So consciousness is not epiphenomenal. So comp
> explains the quality, the non-communicability of the quality, and
> gives consciousness a role in the beyond-cosmic competition between
> all the UMs and LUMs. They can also recognize themselves, as UMs or
> LUMs, and climb toward [no-name], from reality layer to reality layer.

Why would encapsulating information make sense in 3-p though? If the
computation already exists as is, what wants it to be re-presented in
a different way through awareness?

>
> > To say that
> > consciousness has a role in a machine universe is putting the cart
> > before the horse. It is the machine that has a role in supporting
> > consciousness.
>
> That is what most mechanists say. But they are ultimately wrong. It is
> the consciousness of the (L)UMs which selects the consistent
> continuations, and this is concomitant with the deepening of the
> stories, and the "body apparitions".
> I recall that physics has become a branch of machine psychology, if
> the UDA reasoning is valid.

What makes you so sure that it's the machine that has consciousness
and not consciousness that perceives mechanism (among other things)?

>
> > To say that consciousness has a role in computation is
> > to say that a screenplay has a role within a movie set, but that the
> > stage and props are primitives from which movieness arises.
>
> ?

I'm saying that a theater exists to play movies for an audience. Your
view seems to me to be saying that if you build a room with seats and
a projector, a movie will appear when you turn out the lights.

>
> Consciousness selects the histories (like in the WM duplication), and
> in each history, it speeds up the computations (like in engineering).

I agree consciousness recapitulates through its selections, but there
would be no point in recapitulating anything if the genuine experience
weren't significant to begin with. Does one computation have any more
inherent significance than another?

>
>
>
> >>> It diminishes the relevance of how significance is achieved through
> >>> qualia, minimizes the intensity of biological commitment to survival
> >>> and things like the difference between pain and pleasure.
>
> >> I have no clue why you say so.
>
> > Because numbers don't have to care about anything.
>
> Ah?
> For 3-numbers, that is obvious. Nor does a brain or anything third-
> person describable. So if I say that a machine thinks, or that a
> number thinks, I am always talking about the first person associated
> locally and relatively with it. In that sense, numbers and machines
> can think, trivially (in the comp theory).
> So here, you are just saying that comp is false, but without providing
> an argument.

Why do first person local numbers have to care about anything?

>
> >>> I don't see
> >>> that a number can be spectacularly painful. Unless you're talking
> >>> about a particular arithmetic configuration that explains misery and
> >>> ecstasy or blue versus red?
>
> >> I don't see any problem here, other than mathematical questions. You
> >> can't refute Newtonian physics by saying that it cannot predict the
> >> weather.
>
> > But shouldn't you refute the use of Newtonian physics to predict the
> > weather when people suggest that there is no problem in doing it?
>
> Well, I should refuse (not refute) the use of Newton for weather
> forecasting, because it would not be affordable. But the behavior of
> the weather does not refute Newton's laws, at some level of
> description.

The experience of seeing in color doesn't violate Maxwell's equations
at some level of description either, but neither do field equations of
any kind anticipate the basic visual qualities of colors themselves.

Craig
