People do not predict the next words of a text. We anticipate them, but when
something different shows up, we accept it if it is *explanatory*.
Compression-like algorithms will never be able to do this kind of explanatory
reasoning, which is required to disambiguate text. Prediction is certainly not
sufficient for learning language, which is not about predicting text at all.
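
For concreteness, here is roughly what "learning to predict text from text
alone" amounts to. A minimal sketch in Python, assuming a character-level
bigram model stands in for the predictors inside compressors (the file name
and all identifiers are illustrative):

    # Character-level bigram predictor: the kind of surface statistic a
    # text compressor learns. It never forms or tests an explanation.
    from collections import Counter, defaultdict
    import math

    def train(text):
        counts = defaultdict(Counter)
        for a, b in zip(text, text[1:]):
            counts[a][b] += 1          # count how often b follows a
        return counts

    def bits_per_char(counts, text):
        total = 0.0
        for a, b in zip(text, text[1:]):
            seen = counts[a]
            n = sum(seen.values())
            p = (seen[b] + 1) / (n + 256)   # Laplace smoothing, 256 symbols
            total += -math.log2(p)
        return total / max(1, len(text) - 1)

    corpus = open("corpus.txt").read()        # any plain-text file (assumed)
    model = train(corpus[:100_000])
    print(bits_per_char(model, corpus[100_000:110_000]))  # lower = better

Compression ratio and prediction quality are two views of the same number,
bits per character, which is why compression experiments are prediction
experiments. But nothing in this loop can accept a surprising continuation
because it is explanatory; it can only count.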

On Tue, Jun 29, 2010 at 3:38 PM, Matt Mahoney <matmaho...@yahoo.com> wrote:

> Experiments in text compression show that text alone is sufficient for
> learning to predict text.
>
> I realize that for a machine to pass the Turing test, it needs a visual
> model of the world. Otherwise it would have a hard time with a question like
> "What word in this ernai1 did I spell wrong?" Obviously the easiest way to
> build a visual model is with vision, but it is not the only way.
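>
> A minimal sketch of how such a check might work, assuming a small table of
> lookalike glyphs stands in for a real visual model (the table, word list,
> and names are all illustrative):
>
>     # Spot misspellings that only a model of glyph shapes can catch.
>     CONFUSABLE = {"rn": "m", "1": "l", "0": "o", "vv": "w"}  # lookalikes
>     WORDS = {"what", "word", "in", "this", "email", "did",
>              "i", "spell", "wrong"}        # stand-in for a real dictionary
>
>     def normalize(word):
>         # Collapse confusable glyph runs into the letters they mimic.
>         for fake, real in CONFUSABLE.items():
>             word = word.replace(fake, real)
>         return word
>
>     for w in "what word in this ernai1 did i spell wrong".split():
>         fixed = normalize(w)
>         if w not in WORDS and fixed in WORDS:
>             print(f"{w!r} is a visual misspelling of {fixed!r}")
>     # -> 'ernai1' is a visual misspelling of 'email'
>
> Only the glyph table - a fragment of a visual model - connects "ernai1" to
> "email"; the statistics of the text alone say nothing about what the
> characters look like.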
>
>
> -- Matt Mahoney, matmaho...@yahoo.com
>
>
> ------------------------------
> *From:* David Jones <davidher...@gmail.com>
> *To:* agi <agi@v2.listbox.com>
> *Sent:* Tue, June 29, 2010 3:22:33 PM
>
> *Subject:* Re: [agi] A Primary Distinction for an AGI
>
> I certainly agree that the techniques and explanation-generating algorithms
> for learning language are hard-coded into our brains. But those techniques
> alone are not sufficient to learn language in the absence of sensory
> perception or some other way of getting the required data.
>
> Dave
>
> On Tue, Jun 29, 2010 at 3:19 PM, Matt Mahoney <matmaho...@yahoo.com> wrote:
>
>> David Jones wrote:
>> >  The knowledge for interpreting language though should not be
>> pre-programmed.
>>
>> I think that human brains are wired differently from those of other animals
>> in ways that make language learning easier. We have not been successful in
>> training other primates to speak, even though they have all the right
>> anatomy (vocal cords, tongue, lips). When primates have been taught sign
>> language, they have not successfully mastered forming sentences.
>>
>>
>> -- Matt Mahoney, matmaho...@yahoo.com
>>
>>
>> ------------------------------
>> *From:* David Jones <davidher...@gmail.com>
>> *To:* agi <agi@v2.listbox.com>
>> *Sent:* Tue, June 29, 2010 3:00:09 PM
>>
>> *Subject:* Re: [agi] A Primary Distinction for an AGI
>>
>> The point I was trying to make is that an approach that tries to interpret
>> language using language alone, without sufficient information or the means
>> to realistically acquire that information, *should* fail.
>>
>> On the other hand, an approach that tries to interpret vision with minimal
>> upfront knowledge *should* succeed, because the knowledge required to
>> automatically learn to interpret images is amenable to pre-programming. In
>> fact, such knowledge must be pre-programmed. The knowledge for interpreting
>> language, though, should not be pre-programmed.
>>
>> Dave
>>
>> On Tue, Jun 29, 2010 at 2:51 PM, Matt Mahoney <matmaho...@yahoo.com> wrote:
>>
>>> David Jones wrote:
>>> > I wish people understood this better.
>>>
>>> For example, animals can be intelligent even though they lack language
>>> because they can see. True, but an AGI with language skills is more useful
>>> than one without.
>>>
>>> And yes, I realize that language, vision, motor skills, hearing, and all
>>> the other senses and outputs are tied together. Skills in any area make
>>> learning the others easier.
>>>
>>>
>>> -- Matt Mahoney, matmaho...@yahoo.com
>>>
>>>
>>> ------------------------------
>>> *From:* David Jones <davidher...@gmail.com>
>>> *To:* agi <agi@v2.listbox.com>
>>> *Sent:* Tue, June 29, 2010 1:42:51 PM
>>>
>>> *Subject:* Re: [agi] A Primary Distinction for an AGI
>>>
>>> Mike,
>>>
>>> THIS is the flawed reasoning that causes people to ignore vision as the
>>> right way to create AGI. And I've finally come up with a great way to show
>>> you how wrong this reasoning is.
>>>
>>> I'll give you an extremely obvious argument that proves that vision
>>> requires much less knowledge to interpret than language does. Let's say
>>> that you have never been to Egypt and have never seen some particular movie
>>> before. But if you see the movie, an alien landscape, an alien world, a new
>>> place, or any such new visual experience, you can immediately interpret it
>>> in terms of spatial, temporal, compositional, and other relationships.
>>>
>>> Now, go to Egypt and listen to the people speak. Can you interpret it?
>>> Nope. Why?! Because you don't have enough information. The language itself
>>> does not contain any information to help you interpret it. We do not learn
>>> language simply by listening. We learn based on evidence from how the
>>> language is used and how it occurs in our daily lives. Without that
>>> experience, you cannot interpret it.
>>>
>>> But with vision, you do not need extra knowledge to interpret a new
>>> situation. You can recognize completely new objects without any training
>>> except for simply observing them in their natural state.
>>>
>>> I wish people understood this better.
>>>
>>> Dave
>>>
>>> On Tue, Jun 29, 2010 at 12:51 PM, Mike Tintner <tint...@blueyonder.co.uk> wrote:
>>>
>>>> Just off the cuff here - isn't the same true for vision? You can't learn
>>>> vision from vision, just as all NLP has no connection with the real world
>>>> and relies totally on the human programmer's knowledge of that world.
>>>>
>>>>
>>>> Your visual program actually relies totally on your visual "vocabulary" -
>>>> not its own. That is the inevitable penalty of processing unreal signals
>>>> on a computer screen, signals no more connected to the real world than the
>>>> verbal/letter signals involved in NLP are.
>>>>
>>>> What you need to do - what anyone in your situation with anything like
>>>> your aspirations needs to do - is to hook up with a roboticist. Everyone
>>>> here should be doing that.
>>>>
>>>>
>>>>  *From:* David Jones <davidher...@gmail.com>
>>>> *Sent:* Tuesday, June 29, 2010 5:27 PM
>>>> *To:* agi <agi@v2.listbox.com>
>>>> *Subject:* Re: [agi] A Primary Distinction for an AGI
>>>>
>>>> You can't learn language from language without embedding way more
>>>> knowledge than is reasonable. Language does not contain the information
>>>> required for its own interpretation. There is no *reason* to pick any one
>>>> of the infinitely many possible interpretations: the text by itself offers
>>>> nothing to explain it with, yet explanatory reasoning is exactly what is
>>>> needed to determine the correct real-world interpretation.
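>>>>
>>>> A minimal sketch of that point, assuming a toy grammar (all names are
>>>> illustrative): the words alone license more than one structure, and
>>>> nothing in the text selects among them.
>>>>
>>>>     # One sentence, two parses: the text underdetermines its structure.
>>>>     GRAMMAR = {
>>>>         "S":  [["NP", "VP"]],
>>>>         "VP": [["V", "NP"], ["VP", "PP"]],   # PP modifies the verb...
>>>>         "NP": [["NP", "PP"], ["i"],
>>>>                ["the man"], ["the telescope"]],  # ...or the noun
>>>>         "PP": [["P", "NP"]],
>>>>         "V":  [["saw"]],
>>>>         "P":  [["with"]],
>>>>     }
>>>>
>>>>     def parses(sym, words):
>>>>         # Yield every parse tree of `words` rooted at `sym`.
>>>>         for rhs in GRAMMAR[sym]:
>>>>             if len(rhs) == 1 and rhs[0] not in GRAMMAR:  # terminal
>>>>                 if words == rhs[0].split():
>>>>                     yield (sym, rhs[0])
>>>>             elif len(rhs) == 2:                          # binary rule
>>>>                 for i in range(1, len(words)):
>>>>                     for left in parses(rhs[0], words[:i]):
>>>>                         for right in parses(rhs[1], words[i:]):
>>>>                             yield (sym, left, right)
>>>>
>>>>     sentence = "i saw the man with the telescope".split()
>>>>     for tree in parses("S", sentence):
>>>>         print(tree)   # two trees: the PP attaches two different ways
>>>>
>>>> Both parses are grammatical; only knowledge of the world - who is holding
>>>> the telescope - tells you which one was meant.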
>>>>
>>>> On Jun 29, 2010 10:58 AM, "Matt Mahoney" <matmaho...@yahoo.com> wrote:
>>>>
>>>>  David Jones wrote:
>>>> > Natural language requires more than the words on the page in the real
>>>> world. Of...
>>>> Any knowledge that can be demonstrated over a text-only channel (as in
>>>> the Turing test) can also be learned over a text-only channel.
>>>>
>>>>
>>>>
>>>> > Cyc also is trying to store knowledge about a super complicated world
>>>> in simplistic forms and al...
>>>> Cyc failed because it lacks natural language. The vast knowledge store
>>>> of the internet is unintelligible to Cyc. The average person can't use it
>>>> because they don't speak CycL and because they have neither the ability
>>>> nor the patience to translate their implicit thoughts into augmented
>>>> first-order logic. Cyc's approach was understandable when it started in
>>>> 1984, when there was neither the internet nor the vast computing power
>>>> required to learn natural language from unlabeled examples the way
>>>> children do.
>>>>
>>>>
>>>>
>>>> > Vision and other sensory interpretation, on the other hand, do not
>>>> require more info because that...
>>>> Without natural language, your system will fail too. You don't have
>>>> enough computing power to learn language, much less the million times more
>>>> computing power you need to learn to see.
>>>>
>>>>
>>>>
>>>>
>>>> -- Matt Mahoney, matmaho...@yahoo.com
>>>>
>>>> ------------------------------
>>>> *From:* David Jones <davidher...@gmail.com>
>>>> *To:* agi <a...@v2.listbox.c...
>>>> *Sent:* Mon, June 28, 2010 9:28:57 PM
>>>>
>>>> *Subject:* Re: [agi] A Primary Distinction for an AGI
>>>>
>>>>
>>>> Natural language requires more than the words on the page in the real
>>>> world. Of course that didn't ...