--- On Wed, 10/15/08, Dr. Matthias Heger <[EMAIL PROTECTED]> wrote:
> We need a difficult but well-understood domain that is AGI-complete
> and as small as possible, but not so small that we risk building
> only AI instead of AGI.

I have argued that text compression is just such a problem. Compressing natural 
language dialogs implies passing the Turing test. Compressing text containing 
mathematical expressions implies solving those expressions. Compression also 
allows precise, repeatable measurement of progress.
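As a rough sketch of what that measurement could look like (my illustration, not an 
official benchmark), a compressor can be scored in bits per character on a fixed 
corpus; "corpus.txt" below is just a placeholder path:

# Minimal sketch: score a compressor by bits per character (bpc) on a fixed corpus.
import lzma

with open("corpus.txt", "rb") as f:
    data = f.read()

compressed = lzma.compress(data, preset=9)
bpc = 8.0 * len(compressed) / len(data)   # bits of output per input character
print(len(data), "bytes ->", len(compressed), "bytes,", round(bpc, 3), "bpc")

A lower bpc on the same corpus means a better model of the text, which is what 
makes the comparison between systems precise.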

Text compression is not completely general. It tests language, but not vision 
or embodiment. Image compression is a poor test for vision because any progress 
in modeling high-level visual features is overwhelmed by incompressible low-level 
noise. Text does not have this problem.
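To see why the noise dominates, here is a toy demonstration (mine, not a vision 
experiment): compress a highly repetitive byte string, then the same string with 
the low bit of each byte randomized. The irreducible noise swamps whatever the 
compressor gains from modeling the structure.

import os
import zlib

signal = b"the cat sat on the mat. " * 4000        # highly structured "content"
noise = os.urandom(len(signal))                    # incompressible random bytes
# randomize the low bit of each byte: ~1 bit of irreducible noise per byte
noisy = bytes(s ^ (n & 1) for s, n in zip(signal, noise))

for name, data in (("clean text", signal), ("text with noisy low bits", noisy)):
    print(name, "->", len(zlib.compress(data, 9)), "compressed bytes")

The clean version compresses to a few hundred bytes; the noisy version cannot go 
below roughly one bit per input byte, so better modeling of the structure barely 
moves the total.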

Hutter proved that compression is a completely general solution in the AIXI 
model. (The best predictor of the environment is the shortest model that is 
consistent with the observations so far.) However, this may not be very useful as 
a test, because it would require testing over a random distribution of 
environmental models rather than on problems of interest to people, such as 
language.
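(For reference, the parenthetical above is essentially Solomonoff induction, on 
which AIXI is built. In the usual notation, which is mine rather than a quote 
from Hutter:

  M(x) = \sum_{p : U(p) = x*} 2^{-\ell(p)},   M(x_{t+1} | x_{1:t}) = M(x_{1:t} x_{t+1}) / M(x_{1:t})

where U is a universal prefix machine and \ell(p) is the length of program p; 
shorter programs, i.e. more compressible models, dominate the prediction.)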

-- Matt Mahoney, [EMAIL PROTECTED]


