Ben,

As always, thanks for the well-thought-out reply...  I am glad you could
make some sense of my ramblings...

Just a couple of thoughts...

In relation to the subtle consciousness, or store consciousness, I believe
it interpenetrates all things equally.  So to speak of "more" or "less"
conscious, from this vantage point, is incorrect.  A wooden doll is
interpenetrated as well, but is not "conscious" of it because it lacks the
causes and conditions for thought to arise.  This does not negate the
presence of the store consciousness within it.

As far as the connection between the first and the fourth goes: I think of
the store consciousness as a sea of potentialities.  When the appropriate
causes and conditions are in place, something will become manifest.  Your
writing a response to me springs from the store consciousness being
stimulated by the higher-level consciousness.  Anger, lust, love,
compassion, etc. are all potentialities within the SC.  It is useful to
think of them as seeds: whatever seed is watered, that is what will grow.
This is also how members of a species in seemingly disparate locations can
seem to operate as a unit.  A bird in France figures out how to open milk
jugs after the milkman delivers them, and soon after the birds in Kansas
are doing it as well...

In the case of humans, it can be said that even the simple decision to buy
a tie is not made without the influence of the collective...

I should state that I do not hold the store consciousness to be the
absolute substance underlying all things.  In fact, it cannot be, because
the store consciousness, although extremely subtle, is itself conditioned
and arises only dependently.  As such, it is not self-existent, it is
impermanent, and it cannot be the ultimate suchness of the Universe.

And this reaches the limits of my knowledge on the subject...

Thanks again for your thoughtful dialog.   Here in PA there's no one to talk
to about such things. I'm really a marginal character in society for sure :)

Kevin

----- Original Message -----
From: "Ben Goertzel" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Tuesday, December 03, 2002 8:26 PM
Subject: RE: [agi] How wrong are these numbers?


>
>
> Kevin,
>
> You raise a critical point, and  my thinking on this point is a bit
> unorthodox, as well as incomplete...
>
> There is a big unanswered question close to the heart of my theory of mind,
> and this is the connection between Firstness and Fourthness.  I sum up this
> question with the Orwell paraphrase "All things are conscious, but some
> things are more conscious than others."
>
> I'm a Peircean animist, in the sense that I believe consciousness is
> everywhere.  Yet, I believe that I'm more intensely conscious than a dog,
> and a dog is more intensely conscious than a flea, which is more intensely
> conscious than a virus, which is more intensely conscious than a
> molecule....
>
> One question is: Why is this?  But I'm not even sure of the standpoint from
> which this question "Why?" is asked.
>
> Another question is: What are the specifics of this "law" connecting
> Firstness with degree-of-integrated-complexity (an aspect of Fourthness)?
> This is something that interests me greatly....
>
> Along these lines, I believe that if one constructs an AGI with a high
> degree of general intelligence, ensuing from a high degree of synergetic
> integrated complexity [the only way to get significant general
> intelligence, I think], this AGI system *will have* a highly intense degree of
> consciousness, analogous to (though with a different subjective quality
> from) that of humans.
>
> But I don't have a justification for this belief of mine, because I don't
> have a solution to the so-called "hard problem" of consciousness.  All I
> have is an analysis of the hard problem of consciousness, which suggests
> that it may be possible to create artificial consciousness by creating AGI
> and watching the intense consciousness come along "for free."
>
> I suspect that in fact the "hard problem" will remain in some sense
> unsolved.  That is, the qualitative nature of the connection between
> intensity of consciousness and degree of general intelligence [for lack of a
> better phrase] may remain "mysterious."  Yet, by experimenting with
> artificial minds, we may learn to quantify this relationship.
>
> Quantify it how?  Various sorts of artificial minds may report their
> subjective experiences -- in all sorts of subtle realms of awareness, as
> well as their own variety of everyday ordinary consciousness -- and we and
> they may learn rules relating their subjective experiences with known
> aspects of their physical implementations.
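>
> (A purely illustrative, hypothetical sketch of what learning such rules
> could look like at its crudest -- the "integration" measure and all the
> numbers below are made up, not from any real system.  Collect pairs of
> (structural integration, reported intensity) from various artificial
> minds and fit a curve; in Python:)
>
>     # Toy sketch: relate a hypothetical structural "integration" measure
>     # to a hypothetical self-reported "intensity of consciousness" score
>     # via a hand-rolled least-squares line fit.  All data is made up.
>     data = [(0.1, 0.3), (0.5, 1.1), (1.0, 2.2), (2.0, 4.1), (4.0, 8.3)]
>
>     n = len(data)
>     sx = sum(x for x, _ in data)
>     sy = sum(y for _, y in data)
>     sxx = sum(x * x for x, _ in data)
>     sxy = sum(x * y for x, y in data)
>     slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
>     intercept = (sy - slope * sx) / n
>     print(f"learned rule: intensity ~= {slope:.2f}*integration + {intercept:.2f}")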
>
> Yet, even when such laws are known -- laws relating aspects of the conscious
> experience of a mind to aspects of its "brain" -- this will still not
> resolve the core mystery of how consciousness-mind relates to pattern-mind.
> But this "mystery" is ultimately not a question for science, perhaps... and
> we don't need to articulate its solution in order to build and study minds
> that are genuinely conscious... we can *feel* its solution sometimes, but
> that's another story...
>
> -- Ben G
>
>
>
>
>
> > -----Original Message-----
> > From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]On
> > Behalf Of maitri
> > Sent: Tuesday, December 03, 2002 8:04 PM
> > To: [EMAIL PROTECTED]
> > Subject: Re: [agi] How wrong are these numbers?
> >
> >
> > Ben,
> >
> > I think I followed most of your analysis :)
> >
> > I agree with most of what you stated so well.  The only
> > difficulty for me is
> > that the patterns, whether emergent in the individual or the group, still
> > pertain to the gross level of mind and not the subtle levels of
> > consciousness.  It is quite OK, IMO, to disregard this subtle
> > aspect of mind
> > in your design for AGI, Strong AI or the Singularity.  But it should be
> > noted that this is disregarding what I would consider the predominant
> > capabilities of the human mind.
> >
> > For instance, in relation to memory capacity: let's say I could live for
> > the age of the universe, roughly 15 billion years.  I believe the human
> > mind (without enhancement of any kind) is capable of remembering every
> > detail of every day for that entire lifespan.  A person can only
> > understand this if they understand the non-gray-matter portion of the
> > Mind.  The mind you describe I would call mind, small "m".  The Mind I
> > am referring to is capital "M".  I believe it is an error to reduce
> > memory and thought to the calculations that Kurzweil and Alan put forth.
> >
> > Clearly we have incredibly fast processors, yet we can't even create
> > something that can effectively navigate a room, or talk to me, or
> > reason, or completely simulate an ant.  How can they reconcile that?
> > If they say "we don't know how to program that yet", then I say "well
> > then stop saying that the singularity is near strictly because of
> > processor speed/memory projections.  Processor speed is irrelevant when
> > you have no idea how to use it!"
> >
> > It is true that few humans reach the capacity I describe above.  I
> > would call them human singularities.  There have only been a handful in
> > history.  But it's important to note that these capabilities are within
> > each of us.  I will go as far as to say that any computer system we
> > develop, even one that realizes all the promises of the singularity,
> > can only match the capacity of the human Mind.  Why?  Because the
> > universe is the Mind itself, and the computational capacity of the
> > universe is rather immense and cannot be exceeded by something created
> > within its own domain.
> >
> > In regard to what I believe will happen with an AGI: I believe
> > something rather incredible will emerge.  Right now, I can even think
> > of a calculator as an incredible AI.  It is very specific in its
> > function, but it exceeds almost every human on the planet in what it
> > can do.  An AGI, once mature, and because of its general utility, will
> > be able to do incredible things.  As an example, when designing a car,
> > the designers have to take into account many variables, including
> > aesthetics, human engineering, wind resistance, fuel efficiency,
> > performance, cost, maintenance, etc.  The list is immense.  I believe
> > an AGI will prove to be incredibly superior in the areas of engineering
> > because of its ability to consider many more factors than humans, as
> > well as its ability to discern patterns that most humans cannot.  AGI
> > will prove tremendously useful in areas like biotech, engineering,
> > space science, etc., and can truly change things for the better, IMO.
> >
> > My only real question is in the area of invention and true innovation.
> > These often occur in humans in ways that are hard to understand.  People
> > have leaps of intuition on occasion.  They may make a leap in
> > understanding something, even though they have no supporting information
> > and their inference does not necessarily come from patterns either.  I
> > sometimes believe that we *already* know everything we need to know or
> > invent, and we uncover or discover it when we are so close to the
> > problem at hand that, like the Zen koan, the answer just appears.  Where
> > it comes from is anyone's guess...  So I guess what I'm saying is I can
> > see some limited ability for an AGI to be creative, but I am not so sure
> > that it will be able to make leaps of intuition like humans can...  At
> > least for a while :)
> >
> > Some day down the road, I believe that an AGI with sufficient capacity
> > may become conscious and also be able to make use of the subtle
> > consciousness and intuition, etc.  But let's not underestimate the human
> > mind, small "m", in the meantime.  No one has come even close to
> > matching it yet.
> >
> > Sorry for the length and for babbling..
> >
> > Kevin
> >
> >
> > ----- Original Message -----
> > From: "Ben Goertzel" <[EMAIL PROTECTED]>
> > To: <[EMAIL PROTECTED]>
> > Sent: Tuesday, December 03, 2002 6:59 PM
> > Subject: RE: [agi] How wrong are these numbers?
> >
> >
> > >
> > >
> > > Kevin,
> > >
> > > About "mind=brain" ...
> > >
> > > My own view of that elusive entity, "mind", is well-articulated in
> > > terms of the philosophy of Charles S. Peirce, who considered there to
> > > be several different levels on which mind could be separately
> > > considered.  Peirce used three levels, but inspired by Jung and
> > > others, I have introduced a fourth, and we prefer to think about:
> > >
> > > 1. First, raw experience
> > > 2. Second, physical reaction
> > > 3. Third, relationship and pattern
> > > 4. Fourth, synergy and emergence
> > >
> > > Each of these levels constitutes a different perspective on the mind;
> > > and many important mental phenomena can only be understood by
> > > considering them on several different levels.
> > >
> > > First corresponds roughly speaking to consciousness.  On this level,
> > > analysis has no more meaning than the color red, and everything is
> > > simply what it presents itself as.  We will not speak about this
> > > level further in this article, except to say that, in the Peircean
> > > perspective, it is an aspect that everything has - even rocks and
> > > elementary particles - not just human brains.
> > >
> > > Second, the level of physical reaction, corresponds to the "machinery"
> > > underlying intelligent systems.  In the case of humans, it's bodies and
> > > brains; in the case of groups of humans, it's sets of bodies and
> > > brains.  In fact, there's a strong case to be made that even in the
> > > case of "individual" human minds, the bodies and brains of a whole set
> > > of humans are involved.  No human mind makes sense in isolation; if a
> > > human mind is isolated for very long, it changes into a different sort
> > > of thing than an ordinary human mind as embedded in society.
> > >
> > > Third, the level of relationship and pattern, is the level that is
> > > most commonly associated with the word "mind" in the English language.
> > > One way of conceiving of the mind is as the set of patterns associated
> > > with a certain physical system.  By "associated with" we mean the
> > > patterns in that system, and the patterns that emerge when one
> > > considers that system together with other systems in its habitual
> > > environment.  So, for instance, the human mind may be considered as
> > > the set of patterns in the human brain (both in its structure, and in
> > > its unfolding over time), and the patterns that are observed when this
> > > brain is considered in conjunction with other humans and its physical
> > > environment.  This perspective may justly be claimed incomplete - it
> > > doesn't capture the experiential aspect of the mind, which is First;
> > > or the physical aspect of the mind, which is Second.  But it captures
> > > a very important aspect of mind, mind as relationship.  This view of
> > > mind in terms of "patterns" may be mathematically formalized, as has
> > > been done in a "loose" way in my book From Complexity to Creativity.
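> > >
> > > (As a purely illustrative aside -- this is not the formalization from
> > > the book -- one crude way to operationalize "pattern" is via
> > > compression: a representation counts as a pattern in an entity to the
> > > extent that it is simpler than the entity it reproduces.  A Python
> > > sketch, with zlib as a stand-in simplicity measure:)
> > >
> > >     import os
> > >     import zlib
> > >
> > >     def pattern_intensity(entity: bytes) -> float:
> > >         """Crude 'pattern intensity': how much shorter the compressed
> > >         representation is than the raw entity (0.0 = no pattern)."""
> > >         compressed = zlib.compress(entity, 9)
> > >         return max(0.0, 1.0 - len(compressed) / len(entity))
> > >
> > >     print(pattern_intensity(b"abc" * 1000))     # near 1: highly patterned
> > >     print(pattern_intensity(os.urandom(3000)))  # near 0: little pattern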
> > >
> > > Fourth, the level of synergy, has to do with groups of patterns that
> > > emerge from each other, in what have been called "networks of
> > > emergence."  A mind is not just a disconnected bundle of patterns;
> > > it's a complex, self-organizing system, composed of patterns that
> > > emerge from sets of other patterns, in an interpenetrating way.
> > >
> > > The notion of synergy is particularly important in the context of
> > > collective intelligence.  The "mind" of a group of people has many
> > > aspects - experiential, physical, relational and synergetic - but what
> > > distinguishes it from the minds of the people within the group is
> > > specifically the emergent patterns that exist only when the group is
> > > together, and not when the group is separated and dispersed throughout
> > > the rest of society.
> > >
> > > One thing all this means is that the number of bits needed to realize
> > > a mind physically does not equal the number of bits in the mind.  One
> > > cannot reduce mind to the Second level.  The physical substructure of
> > > a mind is the key unlocking the door to a cornucopia of emergent
> > > patterns between an embodied system and its environment (including
> > > other embodied systems).  These patterns are the mind, and they
> > > contain a lot more information than is explicit in the number of bits
> > > in the physical substrate.
> > >
> > > Regarding quantum or quantum gravity approaches to the mind, these are
> > > interesting to me, but from a philosophical perspective they're "just
> > > details" regarding how the physical universe organizes its patterns...
> > they
> > > don't really affect the above general picture....
> > >
> > > -- Ben G
> > >
> > >
> > >
> > > > -----Original Message-----
> > > > From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]On
> > > > Behalf Of maitri
> > > > Sent: Tuesday, December 03, 2002 6:10 PM
> > > > To: [EMAIL PROTECTED]
> > > > Subject: Re: [agi] How wrong are these numbers?
> > > >
> > > >
> > > > I've got a sawbuck in my pocket that says that you are seriously
> > > > underestimating the capacity of the human mind.
> > > >
> > > > In fact, it's questionable whether you can emulate a mouse brain
> > > > adequately with that amount of power.  I also think you guys are
> > > > seriously underestimating the memory capacity of the human mind.  Of
> > > > course, I view the fundamental problem with your analysis as the
> > > > mistaken assumption that mind=brain.  There's a lot of anecdotal
> > > > evidence that indicates the error in this line of thinking, and I
> > > > can say that I personally have resolved that mind does not indeed
> > > > equal brain.  If you ask me to prove it, I cannot...  But I would
> > > > think that with the advent of quantum physics and the EPR
> > > > experiments, even the hard science folks would begin to see that
> > > > there are a lot of strange happenings going on in this universe of
> > > > ours that we don't begin to understand intellectually.
> > > >
> > > > I suggest a reading of The Holographic Universe, if you get a
> > > > chance.  I have only perused it myself, but I support the concepts
> > > > that it conveys.
> > > >
> > > > Good luck with your work!
> > > >
> > > > Kevin
> > > >
> > > >
> > > >
> > > >
> > > > ----- Original Message -----
> > > > From: "Alan Grimes" <[EMAIL PROTECTED]>
> > > > To: <[EMAIL PROTECTED]>
> > > > Sent: Tuesday, December 03, 2002 8:55 PM
> > > > Subject: Re: [agi] How wrong are these numbers?
> > > >
> > > >
> > > > > Ben Goertzel wrote:
> > > > > > The next question is: What's your corresponding estimate of
> > > > > > processing power?
> > > > >
> > > > > Thanks for the prompt.
> > > > >
> > > > > Let's use the number 2^30 for the size of the memory, which will
> > > > > require 25 operations for each 32-bit word.
> > > > >
> > > > > 2^30 bytes == 2^28 words.
> > > > >
> > > > > We are going to cycle the thing at the 30 Hz rate of the human EEG,
> > > > > so the memory throughput required for the cortex part of the
> > > > > application (ignoring the critical nuclei and cerebellum) will be
> > > > >
> > > > > 30*2^28 > 15*2^30 bytes/second total, 15 GB/sec (this is the most
> > > > > critical number).
> > > > >
> > > > > A pair of servers with 8 GB/sec throughput should be plenty, and
> > > > > would preserve the native organization as well...
> > > > >
> > > > > Now the CPU: We want to do roughly 20 operations to each of 2^28
> > > > > words, 30 times every second.
> > > > >
> > > > > 20*30*2^28 = 75*2^31 = ~160 GHz (raw).  Each server is responsible
> > > > > for 80.  Split 16 ways, the load comes to 5 GHz each.  If we use a
> > > > > more conservative estimate of the typical EEG rate, say 15 Hz, the
> > > > > load for each processor comes to 2.5 GHz... (splitting into more
> > > > > servers will probably not be practical due to network constraints).
> > > > >
> > > > > We could buy this for about $500,000.
> > > > >
> > > > > I would guess the cost to develop the hardware to emulate the
> > > > > various nuclei (many of which do nothing more than a few simple
> > > > > vector operations) would probably add about $150k for custom cards
> > > > > (probably several iterations of such).
> > > > >
> > > > > To make a meta-notation here, I am exploring this path towards AI
> > > > > because it is closer to a "sure fire" thing compared to a more
> > > > > radical idea I have that I hope to see implemented in the next
> > > > > generation.  This more radical approach has some serious problems
> > > > > which may not be resolvable.
> > > > >
> > > > > > To emulate the massively parallel "information update rate" of the
> > > > > > brain on N bits of memory, how many commodity PC processors are
> > > > > > required per GB of RAM?
> > > > >
> > > > > Well, in the above I only mentioned 1 GB total memory.  However,
> > > > > there is almost certainly going to be an overhead above that GB...
> > > > > I invite the reader to factor in a sensible overhead ratio (for
> > > > > pointers and misc data structures) into the numbers above...
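> > > > >
> > > > > (For concreteness, a quick sanity check of the arithmetic above,
> > > > > as a Python sketch; the constants are just the assumptions stated
> > > > > in this message, so adjust them to taste:)
> > > > >
> > > > >     MEM_BYTES = 2**30                # assumed memory: 1 GiB
> > > > >     WORD_BYTES = 4                   # 32-bit words
> > > > >     WORDS = MEM_BYTES // WORD_BYTES  # 2^28 words
> > > > >     RATE_HZ = 30                     # assumed EEG-like update rate
> > > > >     OPS_PER_WORD = 20                # ops applied per word per cycle
> > > > >     CPUS = 2 * 16                    # two servers, split 16 ways each
> > > > >
> > > > >     words_per_sec = WORDS * RATE_HZ
> > > > >     ops_per_sec = words_per_sec * OPS_PER_WORD
> > > > >     # bandwidth if every word is touched once per cycle (~3.2e10)
> > > > >     print(f"bytes/sec: {words_per_sec * WORD_BYTES:.3g}")
> > > > >     print(f"raw ops/sec: {ops_per_sec:.3g}")            # ~1.6e11, ~160 GHz
> > > > >     print(f"ops/sec per CPU: {ops_per_sec / CPUS:.3g}") # ~5e9, ~5 GHz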
> > > > >
> > > > > --
> > > > > pain (n): see Linux.
> > > > > http://users.rcn.com/alangrimes/
> > > > >
> > > >
> > >
> > >
> > >
> >
> >
>
>
>

