Re: [singularity] Multi-Multi-....-Multiverse

2008-01-28 Thread Ben Goertzel
Can you define what you mean by "decision" more precisely, please?


 OK, but why can't they all be dumped in a single 'normal' multiverse?
 If traveling between them is accommodated by 'decisions', there is a
 finite number of them at any given time, so it shouldn't pose
 structural problems. Another question is whether it might be useful to
 describe them as organized in a tree-like structure, according to the
 navigation methods accessible to an agent. If you represent
 uncertainty by being in a 'more-parent' multiverse, that expresses the
 usual idea with unusual (and probably unnecessarily restrictive) notation.
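
A rough Python sketch of that tree idea, offered purely as illustration;
the class name, fields and decision labels are my own stand-ins, not
anything proposed in this thread:

# Hypothetical sketch: a multiverse node whose children are indexed by the
# decisions (navigation methods) available to an agent at that node.
class MultiverseNode:
    def __init__(self, state, parent=None):
        self.state = state      # whatever describes this branch
        self.parent = parent    # the 'more-parent' multiverse, if any
        self.children = {}      # decision label -> child MultiverseNode

    def decide(self, decision, new_state):
        """Follow (or create) the branch reached by taking 'decision'."""
        if decision not in self.children:
            self.children[decision] = MultiverseNode(new_state, parent=self)
        return self.children[decision]

# Only finitely many decisions have been taken at any given time, so the
# tree stays finite - which is why, as noted above, it could just as well
# be flattened into a single 'normal' multiverse.
root = MultiverseNode(state="initial")
branch = root.decide("observe-spin-up", new_state="collapsed-up")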

 --
 Vladimir Nesov  mailto:[EMAIL PROTECTED]





-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
[EMAIL PROTECTED]

If men cease to believe that they will one day become gods then they
will surely become worms.
-- Henry Miller



Re: [singularity] Multi-Multi-....-Multiverse

2008-01-28 Thread Vladimir Nesov
On Jan 28, 2008 2:17 PM, Ben Goertzel [EMAIL PROTECTED] wrote:
 Can you define what you mean by "decision" more precisely, please?

That's difficult; I don't have it formalized. Something like
'application of knowledge about the world'; it's likely to end up an
intelligence-definition-complete problem...

-- 
Vladimir Nesov  mailto:[EMAIL PROTECTED]



Re: [singularity] Wrong focus?

2008-01-28 Thread Mike Tintner

Gudrun: I think this is not about
intelligence, but it is about our mind being inter-dependent (also via
evolution) with senses and body.

Sorry, I've lost a subsequent post in which you went on to say that the very
terms "mind" and "body" in this context were splitting up something that
can't be split up. Would you (or anyone else) like to discourse - riff - on
that? However casually...


The background for me is this:  there is a great, untrumpeted revolution 
going on, which is called Embodied Cognitive Science. See Wiki. That is all 
founded on the idea of the embodied mind. Cognitive science is based on 
the idea that thought is a program - which can in principle be instantiated 
on any computational machine - and is a science founded on AI/ computers. 
Embodied cog sci is Cog Sci Stage 2 and is based on the idea that thought is 
a brain-and-body affair - and cannot take place without both - and is a 
science founded on robotics.


But the whole terminology of this new science - embodied mind - is still 
lopsided, still unduly deferential - and needs to be replaced. So I'm 
interested in any thoughts related to this, however rough.





Re: [singularity] Wrong focus?

2008-01-28 Thread x
On Jan 28, 2008 4:36 AM, Stathis Papaioannou [EMAIL PROTECTED] wrote:

 Are you simply arguing that an embodied AI that can interact with the
 real world will find it easier to learn and develop, or are you
 arguing that there is a fundamental reason why an AI can't develop in
 a purely virtual environment?

I think the answer to the above is obvious, but the more interesting
question is whether it even makes sense to speak of a mind
independent of some environment of interaction, whether physical or
virtual.

Think agency, folks.



Re: [singularity] Wrong focus?

2008-01-28 Thread gifting

Quoting Mike Tintner [EMAIL PROTECTED]:


Gudrun: I think this is not about
intelligence, but it is about our mind being inter-dependent (also via
evolution) with senses and body.

Sorry, I've lost a subsequent post in which you went on to say that the very
terms "mind" and "body" in this context were splitting up something that
can't be split up. Would you (or anyone else) like to discourse - riff - on
that? However casually...


I said, in so many words: Though even this is kind of wrong, because we behave
like there is a split between senses, body and mind. They are more
interconnected, or however you would like to phrase it. It is the problem of
dualist thinking.



The background for me is this:  there is a great, untrumpeted revolution
going on, which is called Embodied Cognitive Science. See Wiki. That is all
founded on the idea of the embodied mind. Cognitive science is based on
the idea that thought is a program - which can in principle be instantiated
on any computational machine - and is a science founded on AI/ computers.
Embodied cog sci is Cog Sci Stage 2 and is based on the idea that thought is
a brain-and-body affair - and cannot take place without both - and is a
science founded on robotics.

But the whole terminology of this new science - embodied mind - is still
lopsided, still unduly deferential - and needs to be replaced. So I'm
interested in any thoughts related to this, however rough.


Mike:
Embodied cog sci is the idea that there is no thought without sensation,
emotion and movement.
(No mentation without re-presentation..? Hmm... still an idea in progress.)
We need to find ways of reconnecting the pieces that language has dissected.
Hey, you're an artist.. do me a photo or model :). -

I do videos and installations, perhaps films. I write texts. I invent, too.
I think one would have to do what AI people do: invent an embodied AGI,
something that has a form of consciousness, senses, movement, body and is
really humorous, for a change.

Stathis: Are you simply arguing that an embodied AI that can interact with the
real world will find it easier to learn and develop, or are you
arguing that there is a fundamental reason why an AI can't develop in
a purely virtual environment?
Mike:
The latter. I'm arguing that a disembodied AGI has as much chance of getting to
know, understand and be intelligent about the world as Tommy - a deaf, dumb and
blind and generally sense-less kid, that's totally autistic, can't play any
physical game let alone a mean pin ball, and has a seriously impaired sense of
self (what's the name for that condition?) - and all that is even if the AGI
*has* sensors. Think of a disembodied AGI as very severely mentally and
physically disabled from birth - you wouldn't do that to a child, why do it to
a computer?  It might be able to spout an encyclopaedia, show you a zillion
photographs, and calculate a storm but it wouldn't understand, or be able to
imagine/ reimagine, anything. As I indicated, a proper, formal argument for
this needs to be made - and I and many others are thinking about it - and
shouldn't be long in forthcoming, backed with solid scientific evidence. There
is already a lot of evidence via mirror neurons that you do think with your
body, and it just keeps mounting.


While doing my research, I got the impression that disembodied might be equal
or similar to spirit (holy spirit). This comes from religions and religious
ideologies and terminology. A disembodied, extracted mind (spirit) also refers
to purity. Extract, pure or purified, or a mind in a mind, like a voice in
one's head. The voice from the aether, radio and television signals, all forms
of disembodied stuff. (Okay, embodied via radio waves and caught in boxes like
radios; I am a bit ironic here.)
I am not sure if this is about the idea of an extract of purity, something
that moves (??) in a purely disembodied world, an idea of an afterlife (again
religion), a pure spirit or mind interconnected with whatever is left (I am
thinking about what Moravec said; I have to look into my thesis to find his
quote).
I like it as science fiction, but it also scares me. It seems to me that this
disembodied AGI is the product of people who are tired of the burden of the
body, their own bodies. They are tired of a body that screams mortality, while
a pure mind might promise immortality. Just some thoughts.
I think an analogy to alchemists might not be too far-fetched.

Gudrun









Re: [singularity] Wrong focus?

2008-01-28 Thread x
On Jan 28, 2008 7:56 AM, Mike Tintner [EMAIL PROTECTED] wrote:

 X:Of course this is a variation on the grounding problem in AI.  But
 do you think some sort of **absolute** grounding is relevant to
 effective interaction between individual agents (assuming you think
 any such ultimate grounding could even perform a function within a
 limited system), or might it be that systems interact effectively to
 the extent their dynamics are based on **relevant** models, regardless
 of even proximate grounding in any functional sense?

 Er.. my body couldn't make any sense of this :). Could you be clearer giving
 examples of the agents/systems  and what you mean by absolute/ proximate
 grounding?

I see that you're talking about interaction between systems considered
to be minds, and highlighting the question of what is necessary to
form a shared basis for **relevant** interaction.  I agree that a
mind without an environment of interaction is meaningless, in the
same way that any statement (or pattern of bits) without context is
meaningless.  However, I would argue that just as context is never
absolute, nor is there ever any need for it to be absolute, indeed for
practical (functional) reasons it can never be absolute, embodiment
need not be absolute, complete, or ultimately grounded.

I use the term "system" to refer as clearly as possible to any
distinct configuration of inter-related objects, with the implication
that the system must be physically realizable; therefore it models
neither infinities nor infinitesimals, nor could it model a Cartesian
singularity of Self.

I use the term "agent" to refer as clearly as possible to a system
exhibiting agency, i.e. behavior recognized as intentional, i.e.
operating on behalf of an entity.  It may be useful here to point out
that recognition of agency inheres in the observer (including the case
of the observer being the agent-system itself), rather than agency
being somehow an objectively measurable property of the system itself.
 Further, the entity which is the principal behind any agency is
entirely abstract (independent of any physical instantiation.)
[Understanding this is key to various paradoxes of personal identity.]

I distinguish between "absolute" and "proximate" grounding in regard
to the functional (and information-theoretic) impossibility of a
system modeling its entire chain of connections to ultimate
reality, while in actuality any system interacts only with its
proximate environment, just as to know an object is not to know what
it is but to know its interface.  To presume to know more would be
to presume some privileged mode of knowledge.
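
A toy Python sketch of that interface point, my own illustration rather
than anything specified above: the agent is written against an Environment
interface, and whether that interface is backed by physical sensors or by a
simulation is invisible to it - its grounding is only ever proximate. All
class and method names here are hypothetical.

from abc import ABC, abstractmethod

class Environment(ABC):
    """The agent's proximate environment: an interface, not ultimate reality."""
    @abstractmethod
    def observe(self) -> float: ...
    @abstractmethod
    def act(self, action: float) -> None: ...

class SimulatedEnvironment(Environment):
    """One possible backing for the interface; a hardware-backed one would
    look exactly the same from the agent's side."""
    def __init__(self):
        self.value = 1.0
    def observe(self) -> float:
        return self.value
    def act(self, action: float) -> None:
        self.value += action

class Agent:
    def step(self, env: Environment) -> None:
        # The agent only ever touches the interface; it cannot model the
        # chain of connections behind it.
        reading = env.observe()
        env.act(-0.1 * reading)

env = SimulatedEnvironment()
agent = Agent()
for _ in range(3):
    agent.step(env)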

So in short, I agree with you that embodiment is essential to
meaningful interaction, thus for there to be agency, thus for there to
be a Self for the mind to know.  But I extend this and emphasize
that it's not necessary that such embodiment be physical, nor that
it be logically grounded in ultimate reality, but rather, that
interaction is relevant and meaningful to the extent that some
(necessarily partial and arbitrarily distant from reality) context
is shared.



Re: [singularity] Wrong focus?

2008-01-28 Thread gifting

Quoting [EMAIL PROTECTED]:


On Jan 28, 2008 7:56 AM, Mike Tintner [EMAIL PROTECTED] wrote:


X:Of course this is a variation on the grounding problem in AI.  But
do you think some sort of **absolute** grounding is relevant to
effective interaction between individual agents (assuming you think
any such ultimate grounding could even perform a function within a
limited system), or might it be that systems interact effectively to
the extent their dynamics are based on **relevant** models, regardless
of even proximate grounding in any functional sense?

Er.. my body couldn't make any sense of this :). Could you be clearer giving
examples of the agents/systems  and what you mean by absolute/ proximate
grounding?


I see that you're talking about interaction between systems considered
to be minds, and highlighting the question of what is necessary to
form a shared basis for **relevant** interaction.  I agree that a
mind without an environment of interaction is meaningless, in the
same way that any statement (or pattern of bits) without context is
meaningless.  However, I would argue that just as context is never
absolute, nor is there ever any need for it to be absolute, indeed for
practical (functional) reasons it can never be absolute, embodiment
need not be absolute, complete, or ultimately grounded.

I use the term "system" to refer as clearly as possible to any
distinct configuration of inter-related objects, with the implication
that the system must be physically realizable; therefore it models
neither infinities nor infinitesimals, nor could it model a Cartesian
singularity of Self.

I use the term "agent" to refer as clearly as possible to a system
exhibiting agency, i.e. behavior recognized as intentional, i.e.
operating on behalf of an entity.  It may be useful here to point out
that recognition of agency inheres in the observer (including the case
of the observer being the agent-system itself), rather than agency
being somehow an objectively measurable property of the system itself.
 Further, the entity which is the principal behind any agency is
entirely abstract (independent of any physical instantiation.)
[Understanding this is key to various paradoxes of personal identity.]

I distinguish between "absolute" and "proximate" grounding in regard
to the functional (and information-theoretic) impossibility of a
system modeling its entire chain of connections to ultimate
reality, while in actuality any system interacts only with its
proximate environment, just as to know an object is not to know what
it is but to know its interface.  To presume to know more would be
to presume some privileged mode of knowledge.

So in short, I agree with you that embodiment is essential to
meaningful interaction, thus for there to be agency, thus for there to
be a Self for the mind to know.  But I extend this and emphasize
that it's not necessary that such embodiment be physical, nor that
it be logically grounded in ultimate reality, but rather, that
interaction is relevant and meaningful to the extent that some
(necessarily partial and arbitrarily distant from reality) context
is shared.



Wow, this is well worded, structured in a really nice set of feedback loops.

What is a non-physical embodiment? I would like to know more about this.

If we have a form of embodied AGI (with all the definitions and descriptions
above, even a non-physical one not being grounded in an ultimate reality), and
there is space for movement/motion (see other posts and definitions for
movement), has anybody thought about DESIRE? How could desire come into this?
What kind of mind is desirable?








Re: [singularity] Wrong focus?

2008-01-28 Thread x
On Jan 28, 2008 10:02 AM,  [EMAIL PROTECTED] wrote:

 Wow, this is well worded, structured in a really nice set of feedback loops.

Some like my writing very much; others find it off-putting. I tend to
err toward excessive abstraction, expecting that others will ask for
supporting detail and/or clarification as desired.  I think I'm
correct in this expectation, but significantly off in my estimation of
the extent of the desire.  ;-)

 What is a non-physical embodiment? I would like to know more about this.

Simply put, "non-physical embodiment" refers to an instance of mind
functioning within an abstract computational environment as opposed to
the physical environment we commonly assume.  It's worthwhile to note,
however, that from a necessarily subjective viewpoint, one cannot
reliably discern the degree of abstraction of one's environment from
actual "reality".  [Thus my scare-quoting of the term "reality", as it
can be referred to but never defined.]  Note also that a
computational environment does not necessarily entail a simulation,
although these concepts are commonly conflated in this forum.


 If we have a form of embodied AGI (with all the definitions and descriptions
 above, even a non-physical one not being grounded in an ultimate reality), and
 there is space for movement/motion (see other posts and definitions for
 movement), has anybody thought about DESIRE? How could desire come into this?

It seems to me that in coherent, systems-theoretic terms, "desire"
refers to the perceived distance between an agent's internal
values-complex and the perceived state of its environment.  So
intentional action serves simply to reduce this perceived distance to
zero (via execution of more or less intelligent internally encoded
instrumental principles.)  To the extent that the relevant aspects of
this interaction can be said to be fully specified, the desired
future state can be called a "goal".
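
A minimal Python rendering of that systems-theoretic reading of "desire",
offered only as illustration; the distance measure and the update rule are
my own stand-ins, not anything specified above.

# Sketch: "desire" as the perceived distance between an agent's values-complex
# and the perceived state of its environment; intentional action tries to
# drive that distance toward zero.
def distance(valued_state, perceived_state):
    return sum(abs(v - p) for v, p in zip(valued_state, perceived_state))

def act_toward(valued_state, perceived_state, step=0.1):
    # One crude "instrumental principle": nudge each perceived dimension a
    # small step toward the corresponding valued dimension.
    return [p + step * (v - p) for v, p in zip(valued_state, perceived_state)]

valued = [1.0, 0.0]       # the agent's internal values-complex
perceived = [0.0, 1.0]    # the perceived state of the environment
while distance(valued, perceived) > 1e-3:
    perceived = act_toward(valued, perceived)
# When the relevant aspects are fully specified, 'valued' plays the role of a goal.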

 What kind of mind is desirable?

Non-sequitur.



Re: [singularity] Wrong focus?

2008-01-28 Thread x
On Jan 28, 2008 6:43 AM, Mike Tintner [EMAIL PROTECTED] wrote:

 Stathis:  Are you simply arguing that an embodied AI that can interact with
 the
  real world will find it easier to learn and develop, or are you
  arguing that there is a fundamental reason why an AI can't develop in
  a purely virtual environment?

 The latter. I'm arguing that a disembodied AGI has as much chance of getting
 to know, understand and be intelligent about the world as Tommy - a deaf,
 dumb and blind and generally sense-less kid, that's totally autistic, can't
 play any physical game let alone a mean pin ball, and has a seriously
 impaired sense of self , (what's the name for that condition?) - and all
 that is even if the AGI *has* sensors. Think of a disembodied AGI as very
 severely mentally and physically disabled from birth - you wouldn't do that
 to a child, why do it to a computer?  It might be able to spout an
 encyclopaedia, show you a zillion photographs, and calculate a storm but it
 wouldn't understand, or be able to imagine/ reimagine, anything. As I
 indicated, a proper, formal argument for this needs to be made - and I and
 many others are thinking about it - and shouldn't be long in forthcoming,
 backed with solid scientific evidence. There is already a lot of evidence
 via mirror neurons that you do think with your body, and it just keeps
 mounting.

Of course this is a variation on the grounding problem in AI.  But
do you think some sort of **absolute** grounding is relevant to
effective interaction between individual agents (assuming you think
any such ultimate grounding could even perform a function within a
limited system), or might it be that systems interact effectively to
the extent their dynamics are based on **relevant** models, regardless
of even proximate grounding in any functional sense?



Re: [singularity] Wrong focus?

2008-01-28 Thread Thomas McCabe
On Jan 28, 2008 9:43 AM, Mike Tintner [EMAIL PROTECTED] wrote:

 Stathis:  Are you simply arguing that an embodied AI that can interact with
 the
  real world will find it easier to learn and develop, or are you
  arguing that there is a fundamental reason why an AI can't develop in
  a purely virtual environment?

 The latter. I'm arguing that a disembodied AGI has as much chance of getting
 to know, understand and be intelligent about the world as Tommy - a deaf,
 dumb and blind and generally sense-less kid, that's totally autistic, can't
 play any physical game let alone a mean pin ball, and has a seriously
 impaired sense of self , (what's the name for that condition?) - and all
 that is even if the AGI *has* sensors. Think of a disembodied AGI as very
 severely mentally and physically disabled from birth - you wouldn't do that
 to a child, why do it to a computer?

Whew. That's... let me count... eleven anthropomorphic comparisons in
one paragraph. You cannot use anthropomorphic thinking when dealing
with AIs. An AI is more different from you than you are from a yeast
cell. Both yeast cells and humans, after all, share the same basic
biochemistry and the same design process (natural selection). Humans
and AIs do not.

  It might be able to spout an
 encyclopaedia, show you a zillion photographs, and calculate a storm but it
 wouldn't understand, or be able to imagine/ reimagine, anything.

This is precisely what unintelligent computers do. You're describing
the behavior of an unintelligent system, not an AGI (or even a
modern-day AI). AI can already do much better than this. In 1999,
computers were composing music, poetry, art, and literature, all
without any kind of robotic apparatus.

 As I
 indicated, a proper, formal argument for this needs to be made - and I and
 many others are thinking about it - and shouldn't be long in forthcoming,
 backed with solid scientific evidence. There is already a lot of evidence
 via mirror neurons that you do think with your body, and it just keeps
 mounting.

At this point, you're starting to sound like the creationists. Any day
now, you know, they're going to present hard, peer-reviewed evidence
for intelligent design. Any day now...




 - Tom



Re: [singularity] Wrong focus?

2008-01-28 Thread Thomas McCabe
On Jan 28, 2008 7:16 AM, Mike Tintner [EMAIL PROTECTED] wrote:
 Gudrun: I think this is not about
 intelligence, but it is about our mind being inter-dependent (also via
 evolution) with senses and body.

 Sorry, I've lost a subsequent post in which you went on to say that the very
 terms "mind" and "body" in this context were splitting up something that
 can't be split up. Would you (or anyone else) like to discourse - riff - on
 that? However casually...

 The background for me is this:  there is a great, untrumpeted revolution
 going on, which is called Embodied Cognitive Science.

"embodied cognitive science" gets 5,310 hits on Google. "cognitive
science" gets 2,730,000 hits. Please back up your statements,
especially ones which talk about revolutions in any field.

 See Wiki. That is all
 founded on the idea of the embodied mind.

"embodied cognitive science" gets 520 hits on Google Scholar, as
compared to 296,000 for "cognitive science". Many individual
researchers have published more papers than this (Euler and Erdos come
to mind).

 Cognitive science is based on
 the idea that thought is a program - which can in principle be instantiated
 on any computational machine - and is a science founded on AI/ computers.
 Embodied cog sci is Cog Sci Stage 2

Please stop posting audacious claims without any evidence. Cog sci is
a huge field, with thousands of full-time researchers worldwide.

 and is based on the idea that thought is
 a brain-and-body affair - and cannot take place without both

Please stop posting audacious claims without any evidence.

 - and is a
 science founded on robotics.

Name one case when attaching a robotic apparatus (of any sort) to a
computer gave it additional intellectual capacity.

 But the whole terminology of this new science - embodied mind - is still
 lopsided, still unduly deferential - and needs to be replaced. So I'm
 interested in any thoughts related to this, however rough.




 - Tom



Re: [singularity] Wrong focus?

2008-01-28 Thread Stathis Papaioannou
On 29/01/2008, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:
 On Jan 28, 2008 4:36 AM, Stathis Papaioannou [EMAIL PROTECTED] wrote:

  Are you simply arguing that an embodied AI that can interact with the
  real world will find it easier to learn and develop, or are you
  arguing that there is a fundamental reason why an AI can't develop in
  a purely virtual environment?

 I think the answer to the above is obvious, but the more interesting
 question is whether it even makes sense to speak of a mind
 independent of some environment of interaction, whether physical or
 virtual.

Could that just mean in the limiting case that one part of a physical
object is a mind with respect to another part?




-- 
Stathis Papaioannou



Re: [singularity] Wrong focus?

2008-01-28 Thread Mike Tintner

Tom: "embodied cognitive science" gets 5,310 hits on Google. "cognitive
science" gets 2,730,000 hits. Please back up your statements,
especially ones which talk about revolutions in any field.

Check out the wiki article - look at the figures at the bottom such as
Lakoff & co & Google them.  Check out Pfeiffer. Note how many recent books
in philosophy, psychology and cognitive science are focussing on embodiment
in one way or another. Check out the Berkeley/California configuration of
these guys. Check out "morphological computation" - and the relevant
conference. Check out Ramachandran:


"Without a doubt it is one of the most important discoveries ever made about
the brain. Mirror neurons will do for psychology what DNA did for biology.
They will provide a unifying framework and help explain a host of mental
abilities that have hitherto remained mysterious..."


Read Sandra Blakeslee - The Body has a Mind of its Own - also just out. [She 
did Jeff Hawkins before].


Even someone like Ben, if you track his development - he can correct me - is
using "embodied" more and more - and promoting virtually embodied AIs.


Unlike most mainstream cog. sci., the embodied version, you'll find,
really is scientific and has a commitment to scientific experiment and
testing of its ideas.


It's, as I said, an untrumpeted revolution, but if you think about it, it's
inevitable.  Just try thinking without sensation, emotion and movement.
Brains in a vat are fine for philosophers but they just haven't worked for
any kind of AGI, or any of the faculties that AGI needs. [And stay cutting
edge.]








Re: [singularity] Wrong focus?

2008-01-28 Thread Stathis Papaioannou
On 29/01/2008, Mike Tintner [EMAIL PROTECTED] wrote:

 The latter. I'm arguing that a disembodied AGI has as much chance of getting
 to know, understand and be intelligent about the world as Tommy - a deaf,
 dumb and blind and generally sense-less kid, that's totally autistic, can't
 play any physical game let alone a mean pin ball, and has a seriously
 impaired sense of self , (what's the name for that condition?) - and all
 that is even if the AGI *has* sensors. Think of a disembodied AGI as very
 severely mentally and physically disabled from birth - you wouldn't do that
 to a child, why do it to a computer?  It might be able to spout an
 encyclopaedia, show you a zillion photographs, and calculate a storm but it
 wouldn't understand, or be able to imagine/ reimagine, anything.

How can you tell the difference between sensory input from a real
environment and that from a virtual environment?




-- 
Stathis Papaioannou



Re: [singularity] Wrong focus?

2008-01-28 Thread Thomas McCabe
On Jan 28, 2008 6:48 PM, Mike Tintner [EMAIL PROTECTED] wrote:
 Tom: "embodied cognitive science" gets 5,310 hits on Google. "cognitive
 science" gets 2,730,000 hits. Please back up your statements,
 especially ones which talk about revolutions in any field.

 Check out the wiki article - look at the figures at the bottom such as
 Lakoff & co & Google them.  Check out Pfeiffer. Note how many recent books
 in philosophy, psychology and cognitive science are focussing on embodiment
 in one way or another. Check out the Berkeley/California configuration of
 these guys. Check out "morphological computation" - and the relevant
 conference.

Quite frankly, I don't have the time to go reading through an entire
field of stuff simply to prove a point.

 Check out Ramachandran:

 "Without a doubt it is one of the most important discoveries ever made about
 the brain. Mirror neurons will do for psychology what DNA did for biology.
 They will provide a unifying framework and help explain a host of mental
 abilities that have hitherto remained mysterious..."

Mirror neurons *do* seem like an important discovery in cognitive
science, but they're specific to humans (and other animals with
complex nervous systems), not to intelligences in general. The general
principle (look at another system and copy its behavior) can be
applied just as easily to purely electronic systems as physical ones.
Remember COPYCAT
(http://en.wikipedia.org/wiki/Copycat_%28software%29)?
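
A toy Python sketch of that bare principle - emphatically not COPYCAT
itself, which models analogy-making - in which one purely electronic system
observes another's input/output behavior and copies it, with no robotic
apparatus involved. The demonstrator and the fitting routine are my own
illustrative choices.

# Sketch: "look at another system and copy its behavior", electronically.
def demonstrator(x):
    return 2 * x + 1                      # the behavior to be imitated

def learn_by_imitation(observations):
    # Fit the observed input/output pairs with a simple least-squares line.
    xs, ys = zip(*observations)
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    slope = num / den
    intercept = mean_y - slope * mean_x
    return lambda x: slope * x + intercept

observed = [(x, demonstrator(x)) for x in range(10)]
imitator = learn_by_imitation(observed)
assert abs(imitator(42) - demonstrator(42)) < 1e-9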

 Read Sandra Blakeslee - The Body has a Mind of its Own - also just out. [She
 did Jeff Hawkins before].

The author is a professional writer, not a scientist, and has no
published papers that I can find. To quote from the front page of the
book's website (http://www.thebodyhasamindofitsown.com):

"Your body has a mind of its own. You know it's true. You can feel it,
you can sense it, even though it may be hard to articulate. You know
your body is more than just a meat-vehicle for your mind to cruise
around in, but how deeply are mind, brain and body truly interwoven?"

This is clearly 'pop sci' writing, probably with little technical content.

 Even someone like Ben, if you track his development - he can correct me - is
 using "embodied" more and more - and promoting virtually embodied AIs.

 Unlike most mainstream cog. sci., the embodied version, you'll find,
 really is scientific and has a commitment to scientific experiment and
 testing of its ideas.

Please stop posting audacious claims without references. Claiming that
all of cognitive science is unscientific, while some small subfield
is scientific, certainly qualifies as audacious.

 It's, as I said, an untrumpeted revolution, but if you think about it, it's
 inevitable.  Just try thinking without sensation, emotion and movement.
 Brains in a vat are fine for philosophers but they just haven't worked for
 any kind of AGI, or any of the faculties that AGI needs. [And stay cutting
 edge.]

See http://www.singinst.org/upload/LOGI//foundations.html.







 - Tom
