Bill, Richard, et al.,
Children don't have a great grasp of language, but they have all the sensory and contextual mechanisms to learn one through causal interaction with their environment.  Semantics are a learned system, just as words are.  In current AI we're programming semantic rules into a huge neural database and asking it to play a big matching game.  The two kinds of learning may give the same result, but they are not the same process by a long shot.  Every time we hand-code an algorithm, we're only mimicking the logical function of a learned neural process, and that doesn't allow for the tiered complexity and concept grasping that sensory learning does.
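To make the "matching game" point concrete, here's a toy sketch in Python (purely my own illustration, not anybody's real system) of what hand-coded semantic rules amount to: surface patterns mapped to canned outputs, with no grounding in experience and no way to grow a new rule:

    import re

    # Hand-coded "semantic" rules: surface patterns mapped to canned replies.
    # None of this was learned by interacting with an environment.
    RULES = [
        (re.compile(r"where is the (\w+)", re.I),
         lambda m: "The {} is down the hall.".format(m.group(1))),
        (re.compile(r"my name is (\w+)", re.I),
         lambda m: "Nice to meet you, {}.".format(m.group(1))),
    ]

    def respond(utterance):
        """Play the matching game: the first rule that fires wins."""
        for pattern, reply in RULES:
            match = pattern.search(utterance)
            if match:
                return reply(match)
        return "I don't understand."  # no rule matched, and no way to learn one

    print(respond("Where is the toilet?"))  # -> The toilet is down the hall.

It looks fluent for exactly the cases its author anticipated, and falls flat the moment it needs a rule nobody wrote.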

Because language uses discrete semantic rules, it's easy to fall into the trap of thinking that computers, given enough horsepower, are capable of human thought.  Give a computer as many semantic algorithms, metaphor databases, and reaction-grading mechanisms as you want; it still takes much deeper and more differentiated networks to apply those words and derive a physical meaning beyond grammatical or metaphorical boundaries.  This is the difference between a system that resembles intelligence and an intelligent system.  The resembling system can only process information according to its algorithms; it cannot rework an algorithm based on the reason for executing it.  Granted, whether or not our AGI is conscious, it could still be functionally equivalent to a human mind in terms of output.  But the recursive, bidirectional nature of neurons and their role in forming a gestalt is something we're barely able to grasp as a concept, let alone code for.  The nature of our hardware is going to have to change to accommodate these multidimensional and recursive problems in computing.
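As a rough sketch of the difference between executing a fixed algorithm and reworking one from experience (again my own toy Python, not a claim about how brains or any real AGI work), compare a frozen rule with a rule that adjusts itself from an error signal:

    def fixed_rule(x):
        # The "resembling" system: its behaviour was frozen at coding time.
        return 1.0 if x > 0.5 else 0.0

    class AdaptiveRule:
        """A crude learner: it reworks its own threshold from feedback."""
        def __init__(self, threshold=0.5, rate=0.1):
            self.threshold = threshold
            self.rate = rate

        def predict(self, x):
            return 1.0 if x > self.threshold else 0.0

        def learn(self, x, target):
            error = target - self.predict(x)
            # Lower the threshold if it under-fired, raise it if it over-fired.
            # The fixed rule above has no step like this at all.
            self.threshold -= self.rate * error

    rule = AdaptiveRule()
    for x, target in [(0.3, 1.0), (0.4, 1.0), (0.9, 1.0)]:
        rule.learn(x, target)
    print(round(rule.threshold, 2))  # 0.3: the rule reworked itself to fit the feedback

It's trivial, but the point stands: the second system's behaviour is a product of its history of interaction, not just of the rules its programmer managed to anticipate -- and scaling that idea up to grounded concepts is exactly where the hard part lives.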


Josh Treadwell
     Systems Administrator
        [EMAIL PROTECTED]
        direct:480.206.3776

C.R.I.S. Camera Services

250 North 54th Street
Chandler, AZ 85226 USA
p 480.940.1103 / f 480.940.1329
http://www.criscam.com


BillK wrote:
On 10/20/06, Richard Loosemore <[EMAIL PROTECTED]> wrote:

For you to blithely say "Most normal speaking requires relatively little
'intelligence'" is just mind-boggling.


I am not trying to say that language skills don't require a human
level of intelligence. That's obvious. That is what makes humans human.

But day-to-day chat can be mastered by children, even in a foreign language.

Watch that video I referenced in my previous post, of an American
chatting to a Chinese woman via a laptop running MASTOR software.
<http://www.research.ibm.com/jam/speech_to_speech.mpg>

Now tell me that that laptop is showing great intelligence to
translate at the basic level of normal conversation. Simple
subject-object-predicate stuff. Basic everyday vocabulary.
No complex similes, metaphors, etc.

There is a big difference between discussing philosophy and saying
"Where is the toilet?"
That is what I was trying to point out.

Billk

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/[EMAIL PROTECTED]

