The main point is that consciousness affects multi-agent collective intelligence. Theoretically it could be used to improve a compression goal, since compression and intelligence are related, though compression (or rather the attempt to compress) seems narrower.
Either way, this is not nonsense. Contemporary compression has yet to get very close to the theoretical maximum, so exploring the space of potential mechanisms, especially intelligence-related facets like consciousness and multi-agent consciousness, could yield candidates for a new hack. I think, though, that attempting to get close to maximum compression is not as related to the goal of efficient compression...

John

From: Matt Mahoney [mailto:matmaho...@yahoo.com]
Sent: Tuesday, December 30, 2008 8:47 AM
To: agi@v2.listbox.com
Subject: RE: [agi] Universal intelligence test benchmark

John,

So if consciousness is important for compression, then I suggest you write two compression programs, one conscious and one not, and see which one compresses better. Otherwise, this is nonsense.

-- Matt Mahoney, matmaho...@yahoo.com

--- On Tue, 12/30/08, John G. Rose <johnr...@polyplexic.com> wrote:

From: John G. Rose <johnr...@polyplexic.com>
Subject: RE: [agi] Universal intelligence test benchmark
To: agi@v2.listbox.com
Date: Tuesday, December 30, 2008, 9:46 AM

If the agents were p-zombies or simply not conscious, they would have different motivations. Consciousness has the properties of a communication protocol and affects inter-agent communication; the idea is that it enhances the agents' existence and survival. I assume it facilitates collective intelligence generally. For a multi-agent system with a goal of compression or prediction, the agents' consciousness would have to be catered to. So introducing "Consciousness of X is: the idea or feeling that X is correlated with 'Consciousness of X'" to the agents would give them more "glue" if they expended that consciousness on one another. The communication dynamics of the system would change... versus a similar non-conscious multi-agent system.

John

From: Ben Goertzel [mailto:b...@goertzel.org]
Sent: Monday, December 29, 2008 2:30 PM
To: agi@v2.listbox.com
Subject: Re: [agi] Universal intelligence test benchmark

Consciousness of X is: the idea or feeling that X is correlated with "Consciousness of X"

;-)
ben g

On Mon, Dec 29, 2008 at 4:23 PM, Matt Mahoney <matmaho...@yahoo.com> wrote:

--- On Mon, 12/29/08, John G. Rose <johnr...@polyplexic.com> wrote:
> > What does consciousness have to do with the rest of your argument?
>
> Multi-agent systems should need individual consciousness to achieve advanced
> levels of collective intelligence. So if you are programming a multi-agent
> system, potentially a compressor, having consciousness in the agents could
> have an intelligence-amplifying effect compared with having non-conscious
> agents. Or some sort of primitive consciousness component, since higher-level
> consciousness has not really been programmed yet.
>
> Agree?

No. What do you mean by "consciousness"? Some people use "consciousness" and "intelligence" interchangeably. If that is the case, then you are just using a circular argument. If not, then what is the difference?
-- Matt Mahoney, matmaho...@yahoo.com
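[Editorial note: Matt's proposed test above amounts to a plain benchmark: run two candidate compressors over the same corpus and compare the output sizes. Below is a minimal sketch, not from the thread, using Python's standard-library zlib and bz2 modules as stand-ins for the two hypothetical compressors; the corpus filename is illustrative.]

```python
# Minimal benchmark sketch: compare two candidate compressors on the same data.
# zlib and bz2 stand in for the hypothetical "conscious" vs. "non-conscious"
# compressors; swap in any callables taking bytes and returning bytes.
import bz2
import zlib


def compress_a(data: bytes) -> bytes:
    # Candidate A: DEFLATE at maximum compression level.
    return zlib.compress(data, level=9)


def compress_b(data: bytes) -> bytes:
    # Candidate B: bzip2 at maximum compression level.
    return bz2.compress(data, compresslevel=9)


def benchmark(data: bytes) -> None:
    # Smaller output on the same input means better compression.
    for name, fn in [("compressor A (zlib)", compress_a),
                     ("compressor B (bz2)", compress_b)]:
        out = fn(data)
        ratio = len(out) / len(data)
        print(f"{name}: {len(out)} bytes ({ratio:.3f} of original)")


if __name__ == "__main__":
    # Hypothetical corpus; any large text file works here.
    with open("corpus.txt", "rb") as f:
        benchmark(f.read())
```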