FYI: the formal stage is set to begin at age 11, which I equate with an adult 
AGI.  While at 11 we may not be as mature or have as much experience, we are 
essentially adults, and if we achieved that level we would have finished.

The concrete operational stage begins at around age 6, and it (or anything 
below it) might be a better goal.

One thing that is not usually spelled out explicitly (and is difficult and 
vague, I know) is the basic level of operation that many of these projects are 
shooting for, and what exact capabilities are required.
  I know we don't want to program those abilities in individually, but I would 
like a larger grasp of the exact objectives of many of the people here, and a 
simplified system of grading that doesn't fall back on standards like "it must 
be a human" or Turing-test-type stuff.
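For what it's worth, one way to make such a grading system concrete is a 
capability checklist per stage.  Here is a minimal Python sketch; the stage 
boundaries and capability names are my own illustrative assumptions, not an 
agreed standard:

```python
# Hypothetical sketch of a stage-based grading rubric.  The capability
# names per stage are illustrative placeholders, not a vetted test battery.

PIAGET_STAGES = [
    # (stage name, approximate onset age, required capabilities)
    ("sensorimotor", 0, {"object_permanence"}),
    ("preoperational", 2, {"object_permanence", "symbolic_play", "language_use"}),
    ("concrete_operational", 7, {"conservation", "classification", "seriation"}),
    ("formal_operational", 11, {"abstract_reasoning", "hypothetical_deduction"}),
]

def grade(demonstrated: set) -> str:
    """Return the highest stage whose required capabilities have all
    been demonstrated, checking stages strictly in order."""
    achieved = "none"
    for name, _onset_age, required in PIAGET_STAGES:
        if required <= demonstrated:  # subset test: all requirements met
            achieved = name
        else:
            break  # stages are sequential; stop at the first failure
    return achieved
```

A system could then be graded by running behavioral probes that mark each 
capability demonstrated or not, and reporting the resulting stage rather than 
a pass/fail Turing-style verdict.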

Wiki:
http://en.wikipedia.org/wiki/Theory_of_cognitive_development#Formal_operational_stage

The formal operational stage is the fourth and final of the stages of cognitive 
development of Piaget's theory. This stage, which follows the Concrete 
Operational stage, commences at around 11 years of age (puberty) and continues 
into adulthood. It is characterized by acquisition of the ability to think 
abstractly and draw conclusions from the information available. During this 
stage the young adult functions in a cognitively normal manner and therefore is 
able to understand such things as love, "shades of gray", and values. 
Biological factors may also be traced to this stage, as it occurs during 
puberty, marking the entry to adulthood in physiology, cognition, moral 
judgement (Kohlberg), psychosexual development (Freud), and social development 
(Erikson). 
Some two-thirds of people do not successfully complete this stage, and "fixate" 
at the concrete operational stage.[1]

James Ratcliff

John Scanlon <[EMAIL PROTECTED]> wrote: I'm a little bit familiar with Piaget, 
and I'm guessing that the "formal 
stage of development" is something on the level of a four-year-old child. 
If we could create an AI system with the intelligence of a four-year-old 
child, then we would have a huge breakthrough, far beyond anything done so 
far in a computer.  And we would be approaching a possible singularity. 
It's just that I see no evidence anywhere of this kind of breakthrough, or 
anything close to it.

My ideas are certainly inadequate in themselves at the present time.  My 
Gnoljinn project is just about at the point where I can start writing the 
code for the intelligence engine.  The architecture is in place, the 
interface language, Jinnteera, is being parsed, images are being sent into 
the Gnoljinn server (along with linguistic statements) and are being 
pre-processed.  The development of the intelligence engine will take time: a 
lot of coding, experimentation, and re-coding until I get it right.  It's all 
experimental.

I see a singularity, if it occurs at all, to be at least a hundred years 
out.  I know you have a much shorter time frame.  But what is it about 
Novamente that will allow it in a few years time to comprehend its own 
computer code and intelligently re-write it (especially a system as complex 
as Novamente)?  The artificial intelligence problem is much more difficult 
than most people imagine it to be.


Ben Goertzel wrote:

> John,
>
> On 12/5/06, John Scanlon  wrote:
>>
>> I don't believe that the singularity is near, or that it will even occur. 
>> I
>> am working very hard at developing real artificial general intelligence, 
>> but
>> from what I know, it will not come quickly.  It will be slow and
>> incremental.  The idea that very soon we can create a system that can
>> understand its own code and start programming itself is ludicrous.
>
> First, since my birthday is just a few days off, I'll permit myself an
> obnoxious reply:
> 
> Ummm... perhaps your skepticism has more to do with the inadequacies
> of **your own** AGI design than with the limitations of AGI designs in
> general?
> 
>
> Seriously: I agree that progress toward AGI will be incremental, but
> the question is how long each increment will take.  My bet is that
> progress will seem slow for a while -- and then, all of a sudden,
> it'll seem shockingly fast.  Not necessarily "hard takeoff in 5
> minutes" fast, but at least "Wow, this system is getting a lot smarter
> every single week -- I've lost my urge to go on vacation" fast ...
> leading up to the phase of "Suddenly the hard takeoff is a topic for
> discussion **with the AI system itself** ..."
>
> According to my understanding of the Novamente design and artificial
> developmental psychology, the breakthrough from slow to fast
> incremental progress will occur when the AGI system reaches Piaget's
> "formal stage" of development:
>
> http://www.agiri.org/wiki/index.php/Formal_Stage
>
> At this point, the "human child like" intuition of the AGI system will
> be able to synergize with its "computer like" ability to do formal
> syntactic analysis, and some really interesting stuff will start to
> happen (deviating pretty far from our experience with human cognitive
> development).
>
> -- Ben
>
> -----
> This list is sponsored by AGIRI: http://www.agiri.org/email
> To unsubscribe or change your options, please go to:
> http://v2.listbox.com/member/?list_id=303
> 




_______________________________________
James Ratcliff - http://falazar.com