--- On Sat, 12/27/08, John G. Rose <johnr...@polyplexic.com> wrote:

> > > How does consciousness fit into your compression
> > > intelligence modeling?
> > 
> > It doesn't. Why is consciousness important?
> > 
> 
> I was just prodding you on this. Many people on this list talk about the
> requirements of consciousness for AGI, and I was imagining some sort of
> consciousness in one of your command-line compressors :)
> I've yet to grasp
> the relationship between intelligence and consciousness, though lately I
> think consciousness may be more of an evolutionary social thing. Home-grown
> digital intelligence, since it is a loner, may not require "much"
> consciousness, IMO.

What we commonly call consciousness is a large collection of features that 
distinguish living human brains from dead human brains: ability to think, 
communicate, perceive, make decisions, learn, move, talk, see, etc. We only 
attach significance to it because we evolved, like all animals, to fear a large 
set of things that can kill us.

> > > Max compression implies hacks, kludges and a
> > > large decompressor.
> > 
> > As I discovered with the large text benchmark.
> > 
> 
> Yep, and the behavior of the metrics near maximum theoretical
> compression is erratic, I think?

It shouldn't be. There is a well-defined (but possibly not computable) limit
for each of the universal Turing machines that the benchmark accepts (x86, C,
C++, etc.).
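
For the record, the limit in question is just the standard
Kolmogorov-complexity definition, nothing benchmark-specific: for a fixed
universal machine M and string x,

    K_M(x) = min { |p| : M(p) = x }

i.e., the length of the shortest program p that outputs x when run on M. This
is well defined for each choice of M, but not computable in general, which is
why a benchmark can only approach it from above.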

I was hoping to discover an elegant theory for AI. It didn't quite work out
that way. Progress seems to come from a kind of genetic algorithm: make random
changes to the code and keep the ones that improve compression.
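
In the spirit of that loop, here is a toy sketch in Python (my own
construction, not anything from the benchmark). It hill-climbs over zlib's
tuning knobs as a cheap stand-in for hand-edits to a compressor's source,
keeping only the mutations that shrink the output:

    import random, zlib

    # Toy stand-in corpus; the real benchmark uses the first 10^9 bytes
    # of Wikipedia (enwik9).
    data = b"compression is prediction " * 4000

    def score(params):
        """Compressed size under a (level, memLevel, strategy) tuple."""
        level, memlevel, strategy = params
        c = zlib.compressobj(level, zlib.DEFLATED, zlib.MAX_WBITS,
                             memlevel, strategy)
        return len(c.compress(data) + c.flush())

    def mutate(params):
        """Randomly tweak one knob -- standing in for a hand-edit to
        the compressor's source code."""
        level, memlevel, strategy = params
        choice = random.randrange(3)
        if choice == 0:
            level = random.randint(1, 9)
        elif choice == 1:
            memlevel = random.randint(1, 9)
        else:
            strategy = random.choice([zlib.Z_DEFAULT_STRATEGY,
                                      zlib.Z_FILTERED,
                                      zlib.Z_HUFFMAN_ONLY])
        return (level, memlevel, strategy)

    best = (6, 8, zlib.Z_DEFAULT_STRATEGY)
    best_size = score(best)
    for step in range(100):
        candidate = mutate(best)
        size = score(candidate)
        if size < best_size:   # keep only changes that improve compression
            best, best_size = candidate, size
            print(step, best, best_size)

The real process has the same shape, just with a human doing mutate() on the
compressor's source and enwik9 as the data.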

-- Matt Mahoney, matmaho...@yahoo.com


