Sure, but that is not really mentioned in the article.
You discuss lossless compression, and say lossy compression would only give a 
small gain.

What you argued in your reply is NOT 
"Compression is Equivalent to General Intelligence"; instead it is simply 
"Compression is good at text prediction".

Further, you "argue that ideal text compression, if it were possible, would be 
equivalent to passing the Turing test for artificial intelligence (AI)."

Under this theory, your AI Turing bot would just spit out the most common 
answer/text for anything.

So if I first tell it that I am a boy,
and then tell it that I am a girl,
it has no possible way of responding in any realistic manner,
because it has no internal representation, or "thoughts", on the matter of the 
dialogue.
Likewise it could never be an AGI, because it has no 
motivator/planner/decider/reasoner (a toy sketch of the kind of bot I mean 
follows below).
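To make concrete what I mean by "most common answer", here is a toy sketch 
(Python; the corpus and names are made up, nothing from your article) of that 
kind of bot: it counts which word most often followed a context in its training 
text and replies with that, holding no state about the current conversation.

from collections import Counter, defaultdict

def train(words, order=2):
    # count which word followed each fixed-length context in the training text
    table = defaultdict(Counter)
    for i in range(len(words) - order):
        table[tuple(words[i:i + order])][words[i + order]] += 1
    return table

def most_common_reply(table, context):
    counts = table.get(tuple(context))
    return counts.most_common(1)[0][0] if counts else "<no idea>"

corpus = "i am a boy . i am a girl . i am a boy .".split()
table = train(corpus)
print(most_common_reply(table, ["am", "a"]))  # always "boy", no matter what
                                              # I just told it about myself

However the dialogue goes, the table never changes; that is the "no internal 
representation" problem.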

There is nothing to it but text.

It could possibly be a good tool or knowledge base for an AGI to reference, but 
it is not intelligent in any way other than the way an encyclopedia is 
intelligent, in that it is useful to an intelligent agent.

One last point: it is a basic premise of computer science that compression is 
NOT always good, as seen in many ways.
  1. speed - we have the ability to compress video and data files down to very 
small sizes, but we find that when we need to display or show them we have to 
unpack them and make them useful again. And with the insane rate of growth of 
storage space, it's just cheaper to add more and more; we can't yet fill any 
storage space up with useful knowledge anyway.
  2. access - Google and many others have massive amounts of redundancy. If I 
have something stored in one spot and in another, I can act in a much more 
intelligent fashion. An index to an encyclopedia adds NO extra world knowledge 
to it, but it gives me a leg up on finding the information in a different 
fashion.
  Similarly, if I put in a wiki that poison ivy causes a rash, under both the 
poison ivy article and the rashes article, a user could access it in two 
different ways. This is necessary (a toy sketch follows below).
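A toy sketch of the index point (Python; the article text is made up): two 
index entries, one stored fact. The extra entry adds no world knowledge, only 
another way to reach the same article.

articles = {
    "poison ivy": "Poison ivy causes an itching rash for most people.",
}

# redundant index: two topics point at the same article
index = {
    "poison ivy": "poison ivy",
    "rashes": "poison ivy",
}

def lookup(topic):
    return articles[index[topic]]

print(lookup("poison ivy"))  # the fact, reached from the plant's side
print(lookup("rashes"))      # the same fact, reached from the rashes side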

James Ratcliff

Matt Mahoney <[EMAIL PROTECTED]> wrote: Again, do not confuse the two 
compressions.

In paq8f (on which paq8hp5 is based) I use lossy pattern recognition (like you 
describe, but at a lower level) to extract features to use as context for text 
prediction.  The lossless compression is used to evaluate the quality of the 
prediction.
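Not paq8f or paq8hp5 themselves, just a toy Python illustration of the two 
roles described above (the file names are hypothetical): a lossy feature of the 
context is used to predict the next word, and the lossless code length of the 
data under that prediction (-log2 of the assigned probability, summed) measures 
how good the prediction is.

import math
from collections import Counter, defaultdict

def lossy_context(prev_word):
    # lossy feature extraction: deliberately throw away case and punctuation
    return prev_word.lower().strip(".,;:!?")

def train(words):
    table = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        table[lossy_context(prev)][nxt] += 1
    return table

def code_length_bits(table, words):
    # lossless evaluation: total bits needed to encode the words under the
    # model's predictions (smaller = better prediction)
    bits = 0.0
    for prev, nxt in zip(words, words[1:]):
        counts = table[lossy_context(prev)]
        p = (counts[nxt] + 1) / (sum(counts.values()) + 10000)  # crude smoothing
        bits += -math.log2(p)
    return bits

train_words = open("train.txt").read().split()  # hypothetical files
test_words = open("test.txt").read().split()
model = train(train_words)
print(code_length_bits(model, test_words), "bits")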
 
-- Matt Mahoney, [EMAIL PROTECTED]

----- Original Message ----
From: James Ratcliff <[EMAIL PROTECTED]>
To: agi@v2.listbox.com
Sent: Thursday, November 16, 2006 1:41:41 PM
Subject: Re: [agi] A question on the symbol-system hypothesis

The main first subtitle: 
"Compression is Equivalent to General Intelligence"
Unless your definition of "Compression" is something other than simply turning 
a large amount of text into a small amount of text (and likewise for General 
Intelligence), I don't think that under any of the many, many definitions I 
have seen or created, "text" or a compressed "thing" can possibly be considered 
general intelligence.
Another way: data != knowledge != intelligence

Intelligence requires "something" else.  I would say an actor.

Now I would agree that highly compressed, lossless data could represent a 
"good" knowledge base. So far, so good.

But quite simply, a lossy one provides a better knowledge base, with two 
examples:
1. "Poison ivy causes an itching rash for most people" and "poison oak: the 
common effect is an irritating, itchy rash" can be generalized or combined to 
"poison oak and poison ivy cause an itchy rash", which is shorter, and lossy, 
yet better for this fact.
2. If I see something in the road with four legs and I'm about to run it over, 
and I only have rules that say "if a deer or a dog runs into the road, don't 
hit it", then I can't act correctly, because all I know is that there is 
something with four legs in the road.
However, if I have a generalized rule in my mind that says "if something with 
four legs is in the road, avoid it", then I have a "better" rule.
This better rule cannot be gotten without generalization, and we have to have 
lots of generalization.

The generalizations can be invalidated with exceptions, and we do it all the 
time; that's how we know to pet a cat but not a skunk (a toy sketch follows 
below).
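Here is the toy sketch just mentioned (Python; all rule names are made up): one 
generalized rule, with exceptions checked first so they can invalidate it, as 
in the four-legs and skunk examples.

# exceptions are checked first, so they can invalidate the generalization
exceptions = {
    ("pet", "skunk"): "do not pet it",
}

def decide(action, thing, features):
    if (action, thing) in exceptions:
        return exceptions[(action, thing)]
    if action == "drive" and "four legs" in features and "in road" in features:
        return "avoid it"   # generalized rule, no need to know the species
    if action == "pet" and "four legs" in features:
        return "pet it"     # generalized rule for cats, dogs, ...
    return "no rule applies"

print(decide("drive", "unknown animal", {"four legs", "in road"}))  # avoid it
print(decide("pet", "cat", {"four legs"}))                          # pet it
print(decide("pet", "skunk", {"four legs"}))                        # do not pet it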

James Ratcliff


Matt Mahoney <[EMAIL PROTECTED]> wrote: Richard Loosemore wrote:
> 5) I have looked at your paper and my feelings are exactly the same as 
> Mark's .... theorems developed on erroneous assumptions are worthless.

Which assumptions are erroneous?
 
-- Matt Mahoney, [EMAIL PROTECTED]

----- Original Message ----
From: Richard Loosemore 
To: agi@v2.listbox.com
Sent: Wednesday, November 15, 2006 4:09:23 PM
Subject: Re: [agi] A question on the symbol-system hypothesis

Matt Mahoney wrote:
> Richard, what is your definition of "understanding"?  How would you test 
> whether a person understands art?
> 
> Turing offered a behavioral test for intelligence.  My understanding of 
> "understanding" is that it is something that requires intelligence.  The 
> connection between intelligence and compression is not obvious.  I have 
> summarized the arguments here.
> http://cs.fit.edu/~mmahoney/compression/rationale.html

1)  There will probably never be a compact definition of "understanding". 
  Nevertheless, it is possible for us (being understanding systems) to 
know some of its features.  I could produce a shopping list of typical 
features of understanding, but that would not be the same as a 
definition, so I will not.  See my paper in the forthcoming proceedings 
of the 2006 AGIRI workshop, for arguments.  (I will make a version of 
this available this week, after final revisions).

3) One tiny, almost-too-obvious-to-be-worth-stating fact about 
understanding is that it compresses information in order to do its job.

4) To mistake this tiny little facet of understanding for the whole is 
to say that a hurricane IS rotation, rather than that rotation is a 
facet of what a hurricane is.

5) I have looked at your paper and my feelings are exactly the same as 
Mark's .... theorems developed on erroneous assumptions are  worthless.



Richard Loosemore


-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303
