Again, do not confuse the two compressions.

In paq8f (on which paq8hp5 is based) I use lossy pattern recognition (like you 
describe, but at a lower level) to extract features to use as context for text 
prediction.  The lossless compression is used to evaluate the quality of the 
prediction.
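For what it's worth, the idea described above (a lossy feature extracted from the text serving as context for a lossless predictor, with code length measuring prediction quality) can be sketched as a toy. This is purely illustrative, not paq8f's actual model; the lowercased word-prefix feature and the 256-symbol alphabet are assumptions made for the sketch.

```python
import math
from collections import defaultdict

def code_length(text):
    """Bits needed to losslessly code `text`, one character at a time,
    using a lossy feature (the lowercased current word prefix) as the
    prediction context. Lower bits = better predictions."""
    counts = defaultdict(lambda: defaultdict(int))
    bits = 0.0
    context = ""
    for ch in text:
        seen = counts[context]
        total = sum(seen.values())
        # Laplace-smoothed probability over an assumed 256-symbol alphabet
        p = (seen[ch] + 1) / (total + 256)
        bits += -math.log2(p)
        seen[ch] += 1
        if ch.isalpha():
            context += ch.lower()   # lossy: case information is discarded
        else:
            context = ""            # reset the feature at word boundaries
    return bits
```

Repetitive text yields shorter code lengths than novel text under this model, which is how the lossless compression measurement scores the lossy feature's usefulness.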
 
-- Matt Mahoney, [EMAIL PROTECTED]

----- Original Message ----
From: James Ratcliff <[EMAIL PROTECTED]>
To: agi@v2.listbox.com
Sent: Thursday, November 16, 2006 1:41:41 PM
Subject: Re: [agi] A question on the symbol-system hypothesis

The main first subtitle:
"Compression is Equivalent to General Intelligence"
Unless your definition of "compression" is something other than simply turning 
a large amount of text into a small amount of text (and likewise for "general 
intelligence"), I don't think that under any of the many, many definitions I 
have seen or created, "text" or a compressed "thing" can possibly be considered 
general intelligence.
Another way: data != knowledge != intelligence

Intelligence requires "something" else.  I would say an actor.

Now, I would agree that highly compressed, lossless data could represent a 
"good" knowledge base.  That much holds up.

But quite simply, a lossy one provides a better knowledge base, with two 
examples:
1. "Poison ivy causes an itching rash for most people" and "poison oak: the 
common effect is an irritating, itchy rash" can be generalized, or combined, 
into:
"Poison oak and poison ivy cause an itchy rash."
This is shorter, and lossy, yet better for this fact.
2. Suppose I see something in the road with four legs, and I'm about to run it 
over.  If I only have rules that say "if a deer or a dog runs into the road, 
don't hit it," then I can't act correctly, because all I know is that there is 
something with four legs in the road.
However, if I have a generalized rule in my mind that says "if something with 
four legs is in the road, avoid it," then I have a "better" rule.
This better rule cannot be obtained without generalization, and we have to have 
lots of generalization.

The generalizations can be invalidated by exceptions, and we do that all the 
time; that's how we can tell to pet a cat but not a skunk.
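The generalize-plus-exceptions idea above can be sketched in a few lines (all names here are invented for illustration, not taken from the thread):

```python
def road_response(obs):
    """Return an action for something observed in the road.
    Specific exceptions override the lossy, generalized rule."""
    exceptions = {"deer": "brake hard"}   # hypothetical specific override
    kind = obs.get("kind")
    if kind in exceptions:
        return exceptions[kind]
    if obs.get("legs") == 4:              # generalized, lossy rule
        return "brake"
    return "continue"

# The generalized rule still acts correctly on an animal never listed:
print(road_response({"legs": 4, "kind": "unknown"}))  # brake
```

A rule base holding only the specific entries ("deer", "dog") would return nothing useful for the unknown four-legged thing; the generalization is lossy about species yet produces the better action.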

James Ratcliff


Matt Mahoney <[EMAIL PROTECTED]> wrote:
Richard Loosemore wrote:
> 5) I have looked at your paper and my feelings are exactly the same as 
> Mark's .... theorems developed on erroneous assumptions are worthless.

Which assumptions are erroneous?
 
-- Matt Mahoney, [EMAIL PROTECTED]

----- Original Message ----
From: Richard Loosemore 
To: agi@v2.listbox.com
Sent: Wednesday, November 15, 2006 4:09:23 PM
Subject: Re: [agi] A question on the symbol-system hypothesis

Matt Mahoney wrote:
> Richard, what is your definition of "understanding"?  How would you test 
> whether a person understands art?
> 
> Turing offered a behavioral test for intelligence.  My understanding of 
> "understanding" is that it is something that requires intelligence.  The 
> connection between intelligence and compression is not obvious.  I have 
> summarized the arguments here.
> http://cs.fit.edu/~mmahoney/compression/rationale.html

1) There will probably never be a compact definition of "understanding". 
Nevertheless, it is possible for us (being understanding systems) to 
know some of its features.  I could produce a shopping list of typical 
features of understanding, but that would not be the same as a 
definition, so I will not.  See my paper in the forthcoming proceedings 
of the 2006 AGIRI workshop, for arguments.  (I will make a version of 
this available this week, after final revisions).

3) One tiny, almost-too-obvious-to-be-worth-stating fact about 
understanding is that it compresses information in order to do its job.

4) To mistake this tiny little facet of understanding for the whole is 
to say that a hurricane IS rotation, rather than that rotation is a 
facet of what a hurricane is.

5) I have looked at your paper and my feelings are exactly the same as 
Mark's .... theorems developed on erroneous assumptions are worthless.



Richard Loosemore


-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303






_______________________________________
James Ratcliff - http://falazar.com





