John,
So if consciousness is important for compression, then I suggest you write two 
compression programs, one conscious and one not, and see which one compresses 
better. 

Otherwise, this is nonsense.
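
The proposed benchmark is mechanical to set up. A minimal sketch in Python, with zlib at two compression levels standing in for the two candidate programs (the actual "conscious" and "non-conscious" compressors are of course the part nobody has written):

```python
# Sketch of the proposed test harness: compress the same corpus with two
# candidate programs and compare output sizes. zlib at two settings is a
# placeholder; the two hypothetical compressors would be slotted in where
# compress_a / compress_b are.
import zlib

def compress_a(data: bytes) -> bytes:
    return zlib.compress(data, 1)   # placeholder for program A

def compress_b(data: bytes) -> bytes:
    return zlib.compress(data, 9)   # placeholder for program B

corpus = b"the quick brown fox jumps over the lazy dog " * 200

size_a = len(compress_a(corpus))
size_b = len(compress_b(corpus))

# The smaller output wins the benchmark.
print("A:", size_a, "B:", size_b,
      "winner:", "A" if size_a < size_b else "B" if size_b < size_a else "tie")
```

Whatever "consciousness" is taken to mean, the test reduces to exactly this: same data in, fewer bytes out.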

-- Matt Mahoney, matmaho...@yahoo.com

--- On Tue, 12/30/08, John G. Rose <johnr...@polyplexic.com> wrote:
From: John G. Rose <johnr...@polyplexic.com>
Subject: RE: [agi] Universal intelligence test benchmark
To: agi@v2.listbox.com
Date: Tuesday, December 30, 2008, 9:46 AM
If the agents were p-zombies, or just not conscious, they would have different motivations.

Consciousness has the properties of a communication protocol and affects inter-agent communication. The idea is that it enhances agents' existence and survival; I assume it generally facilitates collective intelligence. For a multi-agent system with a goal of compression or prediction, the agents' consciousness would have to be catered to. So introducing

Consciousness of X is: the idea or feeling that X is correlated with "Consciousness of X"

to the agents would give them more "glue" if they expended that consciousness on one another. The communication dynamics of the system would change versus a similar non-conscious multi-agent system.
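
The comparison can at least be phrased as a runnable experiment. A toy sketch, assuming a pooling rule (fall back to the highest-order agent with a usable prediction, roughly PPM-style) as the stand-in for inter-agent "glue"; nothing here implements consciousness, and all names are illustrative:

```python
# Toy harness for comparing "glued" agents (which pool their predictions)
# against an isolated agent on the same next-symbol prediction task.
# The sharing rule is an assumption chosen only to make the comparison runnable.
from collections import Counter, defaultdict

class ContextAgent:
    """Order-k predictor: counts which symbol followed each k-length context."""
    def __init__(self, k: int):
        self.k = k
        self.counts = defaultdict(Counter)

    def _context(self, history):
        return tuple(history[-self.k:]) if self.k else ()

    def predict(self, history):
        seen = self.counts.get(self._context(history))
        return seen.most_common(1)[0][0] if seen else None

    def observe(self, history, symbol):
        self.counts[self._context(history)][symbol] += 1

def run(agents, sequence, share):
    """Count correct predictions; if share, fall back through agents by order."""
    correct, history = 0, []
    for symbol in sequence:
        if share:
            guess = None
            for agent in sorted(agents, key=lambda a: a.k, reverse=True):
                guess = agent.predict(history)
                if guess is not None:
                    break
        else:
            guess = agents[0].predict(history)  # isolated order-0 agent
        correct += (guess == symbol)
        for agent in agents:
            agent.observe(history, symbol)
        history.append(symbol)
    return correct

seq = list("abcabcabcabcabcabc")
shared = run([ContextAgent(k) for k in range(3)], seq, share=True)
solo = run([ContextAgent(k) for k in range(3)], seq, share=False)
print("shared:", shared, "solo:", solo)  # pooling wins on this periodic input
```

On this periodic input the pooled agents win easily, which shows only that sharing predictive structure helps, not that the shared thing is consciousness.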

John

From: Ben Goertzel [mailto:b...@goertzel.org]
Sent: Monday, December 29, 2008 2:30 PM
To: agi@v2.listbox.com
Subject: Re: [agi] Universal intelligence test benchmark

Consciousness of X is: the idea or feeling that X is correlated with "Consciousness of X"

;-)

ben g

On Mon, Dec 29, 2008 at 4:23 PM, Matt Mahoney <matmaho...@yahoo.com> wrote:

--- On Mon, 12/29/08, John G. Rose <johnr...@polyplexic.com> wrote:

> > What does consciousness have to do with the rest of your argument?
> >
>
> Multi-agent systems should need individual consciousness to achieve advanced
> levels of collective intelligence. So if you are programming a multi-agent
> system, potentially a compressor, having consciousness in the agents could
> have an intelligence amplifying effect instead of having non-conscious
> agents. Or some sort of primitive consciousness component since higher level
> consciousness has not really been programmed yet.
>
> Agree?

No. What do you mean by "consciousness"?

Some people use "consciousness" and "intelligence" interchangeably. If that is the case, then you are just using a circular argument. If not, then what is the difference?

-- Matt Mahoney, matmaho...@yahoo.com
-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: https://www.listbox.com/member/?member_id=8660244&id_secret=123753653-47f84b
Powered by Listbox: http://www.listbox.com
