--- On Sat, 5/31/08, Ben Goertzel <[EMAIL PROTECTED]> wrote:

I wrote:
> > What internal properties of a Turing machine
> > distinguish one that has subjective experiences from an
> > equivalent machine (implementing the same function) that
> > only pretends to have subjective experience?
> 
> 
> You're asking a different question.
> 
> What I said was that internal properties could distinguish
> 
> a) a machine having HUMANLIKE subjective experiences
> 
> from
> 
> b) a machine just claiming to have HUMANLIKE subjective
> experiences, but not really having them

The reason I ask is that humans and human-like intelligences occupy a tiny 
region in the huge space of possible intelligences.  If only humanlike 
subjective experience matters, then the definition is easy: subjective 
experience is something humans have, and nothing else does.  End of argument.

I am looking for a more general principle.  On SL4 I proposed that an agent A 
receiving sensory input x at time t has subjective experience

s(x|A) = K(A(t+)|A(t-))

where K is Kolmogorov complexity, A(t-) is the state of A immediately before 
input x, and A(t+) is the state immediately afterwards.  In other words, it is 
the length (in bits) of the shortest program that takes as input a description 
of A prior to input x and outputs a description of A after input x.  It 
measures how much A remembers about x, independently of K(x).
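
Of course K is not computable, so any concrete measurement would have to 
substitute a real compressor.  A rough Python sketch of what I mean (the zlib 
approximation K(y|x) ~= C(x+y) - C(x) is just my illustration here, not part 
of the definition):

import zlib

def C(data: bytes) -> int:
    # Compressed length in bytes; stands in for Kolmogorov complexity.
    return len(zlib.compress(data, 9))

def s_bits(state_before: bytes, state_after: bytes) -> int:
    # Approximate s(x|A) = K(A(t+)|A(t-)) in bits, using
    # K(y|x) ~= C(x+y) - C(x) with zlib as the stand-in compressor.
    return max(0, C(state_before + state_after) - C(state_before)) * 8

Any lossless compressor would do; zlib just keeps the sketch self-contained.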

In your book you mentioned the "intensity" of a pattern and gave a definition 
that included a lossy compression term (i.e. the part of x that A ignored).  
Note that in general,

s(x|A) <= K(x|A(t-))

where the difference is the number of bits that A ignored.
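
A toy check of the inequality, reusing s_bits from the sketch above: an agent 
that keeps only every 10th byte of a random input remembers about a tenth of 
it, and the gap is the part it ignored (all numbers approximate):

import os

x = os.urandom(1000)           # 8000 bits of incompressible input
before = b""                   # A(t-): empty memory
after = x[::10]                # A(t+): the agent keeps every 10th byte

print(s_bits(before, after))   # about 800 bits: s(x|A)
print(s_bits(before, x))       # about 8000 bits: estimate of K(x|A(t-))
# The difference, about 7200 bits, is what A ignored.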

This definition makes no distinction between having subjective experience and 
pretending to have it.  It also makes no distinction between humanlike 
subjective experience and any other kind.

By this definition, your computer can already have 10^12 bits of subjective 
experience (10^12 bits is only about 125 gigabytes of disk), far more than the 
10^9 bits of human long-term memory estimated by Landauer.

I do not mean to imply any ethical conclusions by this.  It is easy to 
confuse conscious entities (those having subjective experience) with entities 
that have rights or deserve compassion.  The human ethical model does not 
require the two to coincide.  Ethics is an evolved function that selects for 
group fitness.  
You are compassionate to other humans because it increases the odds of passing 
on your genes.  If all conscious entities were worthy of compassion, we would 
not have wars or eat meat.  This leads us back to our original definition...


-- Matt Mahoney, [EMAIL PROTECTED]




