On 17 Jan 2012, at 21:20, Craig Weinberg wrote:



My point is that a Turing machine is not even truly universal,
let alone infinite.

A universal Turing machine is, by definition, a machine, and machines are by definition finite.

The infinite tape plays the role of a possibly extending environment, and is not part of the universal machine, despite a widespread error (perhaps due to a pedagogical error of Turing).

That error comforts me in talking about universal numbers, and in defining them by the relation

phi_u(<x, y>) = phi_x(y), where u is the universal machine, x is a program, and y is a datum. "phi" refers to some other universal number left implicit (in my context it is made explicit by elementary arithmetic).
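
For readers who prefer code, here is a toy illustration of that relation (a minimal sketch only: the names phi and phi_u, the choice of Python, and the encoding of programs as source strings are illustrative assumptions, not the arithmetical definition):

    # Toy illustration of phi_u(<x, y>) = phi_x(y).
    # A program x is encoded as a Python source string defining a function f;
    # the pair <x, y> is encoded as the tuple (x, y).

    def phi(x, y):
        """Apply the program x to the datum y."""
        env = {}
        exec(x, env)         # "interpret" the code of x
        return env["f"](y)   # run it on the datum y

    def phi_u(pair):
        """The universal program u: unpack <x, y> and compute phi_x(y)."""
        x, y = pair
        return phi(x, y)

    # Example: x encodes the squaring program, y = 7.
    square = "def f(n): return n * n"
    assert phi_u((square, 7)) == phi(square, 7) == 49

The point of the sketch is only that u does nothing by itself: it dispatches the pair <x, y> to the machine x applied to y, and that is what makes it universal.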




It's an object oriented syntax that is limited to
particular kinds of functions, none of which include biological
awareness (which might make sense since biology is almost entirely
fluid-solution based.)

This is worse than the notion of primitive matter. It is a mystification of primitive matter.





Also, something can be infinite without encompassing everything. A line can be infinite in length without every point in existence having to lie on
that line.

If that's what you meant though, it's not saying much of anything
about the repertoire. A player piano has an infinite repertoire too.
So what?


To date, there is nothing we
(individually or as a race) have accomplished that could not in principle
also be accomplished by an appropriately programmed Turing machine.

Even if that were true, no Turing machine has ever known what it has
accomplished,

Assuming you and I aren't Turing machines.

It would be begging the question otherwise.


so in principle nothing can ever be accomplished by a
Turing machine independently of our perception.

Do asteroids and planets exist "out there" even if no one perceives them?

They don't need humans to perceive them to exist, but my view is that
gravity is evidence that all physical objects perceive each other. Not
in a biological sense of feeling, seeing, or knowing, but in the most
primitive forms of collision detection, accumulation, attraction to
mass, etc.

I can agree with that. This is in the spirit of Everett, which treats observation as interaction. But there is no reason to associate primitive qualia and private sensation with that. It lacks the "retrieving of memory" and self-reference.





What is an
'accomplishment' in computational terms?

I don't know.



You can't build it out of uncontrollable living organisms.
There are physical constraints even on what can function as a simple
AND gate. It has no existence in a vacuum or a liquid or gas.

Just as basic logic functions are impossible under those ordinary
physically disorganized conditions, it may be the case that awareness can only develop by itself under the opposite conditions. It needs a variety of solids, liquids, and gases - very specific ones. It's not Legos. It's alive. This means that consciousness may not be a concept at all - not generalizable in any way. Consciousness is the opposite, it is a specific enactment of particular events and materials. A brain can only show us that a person is a live, but not who that person is. The who cannot be simulated because it is an unrepeatable event in the cosmos. A computer is not a single event. It is parts which have been assembled together. It did not replicate itself from a single living
cell.

You can't make a machine that acts like a person without
it becoming a person automatically. That clearly is ridiculous to
me.

What do you think about Strong AI? Do you think it is possible?

The whole concept is a category error.

Let me use a more limited example of Strong AI. Do you think there is
any
existing or past human profession that an appropriately built android
(which is driven by a computer and a program) could not excel at?

Artist, musician, therapist, actor, talk show host, teacher,
caregiver, parent, comedian, diplomat, clothing designer, director,
movie critic, author, etc.

What do you base this on? What is it about being a machine that precludes
them from fulfilling any of these roles?

Machines have no feeling.

What I tell you three times is true.
What I tell you three times is true.
What I tell you three times is true.
(Lewis Carroll, The Hunting of the Snark).


These kinds of careers rely on sensitivity
to human feeling and meaning. They require that you care about things
that humans care about. Caring cannot be programmed. That is the
opposite of caring, because programming requires no investment by the
programmed. There is no subject in a program, only an object
programmed to behave in a way that seems like it could be a subject in
some ways.

If you define the subject by the knower, believability by provability, and if you accept the classical theory of knowledge (the axioms Kp->p and K(p->q)->(Kp->Kq)), then it is a theorem that a subject exists for the machine, and indeed that the machine has to be puzzled by the relation between that subject and its body.
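
To make the step explicit (a sketch only, using the Theaetetical reading of "believability by provability", which is one possible reading and not the only one): write B for the believability (provability) predicate and define the knower by Kp := Bp & p. Then Kp -> p holds by the second conjunct, and K(p->q) -> (Kp -> Kq) follows from the distribution property B(p->q) -> (Bp -> Bq) together with ordinary propositional logic. So the two classical axioms above become theorems for the machine.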

Now, there is no reason to expect a *human* subject, unless the machine is a copy of a human at some genuine level. But most machines are not a priori human machines.

Bruno


http://iridia.ulb.ac.be/~marchal/


