Mike Tintner wrote:
yes, but that's not precise enough.
you have to have a task example that focusses what is going on
adaptively... you're not specifying what kinds of essays/ maths etc
what challenge does the problem pose to the solver's existing rules
for reaching a goal?
how does the solver adapt their rules to solve it?
...
----- Original Message ----- From: "Charles D Hixson"
<[EMAIL PROTECTED]>
To: <singularity@v2.listbox.com>
Sent: Saturday, April 28, 2007 6:23 AM
Subject: Re: [singularity] Why do you think your AGI design will work?
Mike Tintner wrote:
Hi,
I strongly disagree - there is a need to provide a definition of
AGI - not necessarily the right or optimal definition, but one that
poses concrete challenges and focusses the mind - even if it's only
a starting-point. The reason the Turing Test has been such a
successful/ popular idea is that it focusses the mind.
...
OK. A program is an AGI if it can do a high school kid's homework
and get him good grades for 1 week (during which there aren't any
pop-quizzes, mid-terms, or other in-school and closed-book exams).
That's not an optimal definition, but if you can handle essays and
story problems and math and biology as expressed by a teacher, then
you've got a pretty good AGI.
-----
...
...
But the point is, a precise definition is useless. Turing's test was
established so that (paraphrase) "If a program could do this, then you
would have to agree that it was intelligent." It wasn't intended as a
practical test that some future program would pass. If we were to start
passing laws about the rights and privileges of intelligent programs,
then a "necessary & sufficient" test would be needed. For development
work it may be more of a handicap than an assist. (I.e., it would tend
to focus effort on meeting the definition rather than on where the
program should logically be developed next.)
P.S.: I meant an arbitrary week. If it can only handle certain weeks,
then it is clearly either not that intelligent, or has been poorly
educated. (However, I have a rather lower opinion than many of the
amount of intelligence exhibited by humans, tending more toward a belief
that they operate largely on reflexes and evolved rather than chosen
goals. Consider, e.g., not the number of people who start to believe in
astrology, but rather the number who continue to believe in it for years. A simple
examination of predictions will demonstrate that nothing significant was
predicted in advance, but only explained afterwards. [OTOH, it *was*
once useful for determining when to plant which crops.])
-----
This list is sponsored by AGIRI: http://www.agiri.org/email