On Tue, Feb 3, 2015 at 11:10 PM, Stathis Papaioannou <stath...@gmail.com>
wrote:

> On 4 February 2015 at 12:18, Jason Resch <jasonre...@gmail.com> wrote:
>
> >> What could such a test even look like?
> >
> >
> > Determining whether the brain or CPU of the supposedly conscious entity
> > was performing computations or processing information in a manner
> > consistent with those processes that according to some theory are
> > conscious.
> >
> > Here's an example: do you think information theory can be used to prove
> > a certain thing is not conscious in certain ways? E.g., if some quale
> > contains at least 2 GB of information in it, then any process too simple
> > to have 2 GB worth of information could not manifest that particular
> > quale. After all, you don't worry that the bacteria that die when you
> > wash your hands have human or God-like consciousness. It seems, then,
> > that information theory provides at least some tools to measure (or at
> > least bound) the possible conscious states of systems.
>
> Those criteria are suggestive, but they don't prove the presence of
> consciousness. It is like saying that the problem of other minds is
> solved by the fact that other brains are similar to mine, so if I'm
> conscious, they probably are too. It is suggestive, but it is not
> proof.
>

There are never any proofs in the sciences, just accumulations of evidence.
I am not proposing a consciousness test here on this list, but I am at
least showing that there may be a path to testing an upper bound on
consciousness, if the size of the informational state can be measured (and
people seem more willing to accept that information storage and carrying
capacity can be objectively measured and studied). Proving that something
is conscious seems trickier than proving that something is not conscious,
but a similar sort of coherent-information-processing-capacity measurement
might be possible (as proposed by Tononi's integrated information theory:
http://en.wikipedia.org/wiki/Integrated_information_theory ).
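To make the upper-bound reasoning concrete, here is a toy sketch (the
state counts and quale size below are made-up numbers for illustration,
not measurements of anything): a system with N distinguishable states can
carry at most log2(N) bits, so it cannot manifest any quale whose
informational content exceeds that capacity.

```python
import math

def capacity_bits(num_distinguishable_states):
    """Upper bound on the information (in bits) a system can carry,
    given how many distinguishable states it has: log2(N)."""
    return math.log2(num_distinguishable_states)

def could_manifest(quale_bits, system_capacity_bits):
    """On this argument, a system whose informational capacity is
    smaller than the content of a quale cannot manifest that quale."""
    return system_capacity_bits >= quale_bits

# Hypothetical figures, purely for illustration.
human_quale_bits = 2 * 8e9                      # a "2 GB" quale, in bits
bacterium_bits = capacity_bits(2 ** 10_000)     # generously assume 10,000 bits of state

print(could_manifest(human_quale_bits, bacterium_bits))  # False
```

This only ever rules consciousness out (an upper bound); passing the
capacity check says nothing about whether the system is conscious.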



>
> >> > I do follow your reasoning that (no possible test for consciousness)
> >> > -> (epiphenomenalism), but I use that reasoning to take the position
> >> > that (not epiphenomenalism) -> (not no possible test for
> >> > consciousness). Hence there should be a test for consciousness under
> >> > the assumption that epiphenomenalism is false. (Which it seems to be,
> >> > because we can talk about consciousness; also, thought experiments
> >> > like dancing/fading qualia lend further support to consciousness
> >> > being detectable and having detectable influences on behavior, see:
> >> > http://consc.net/papers/qualia.html ).
> >>
> >> I don't see that those thought experiments claim to make consciousness
> >> detectable. What they show is that IF an entity is conscious THEN its
> >> consciousness will be preserved if a functionally equivalent
> >> substitution is made in the entity. This is consistent with
> >> epiphenomenalism - the consciousness emerges necessarily from the
> >> right sort of behaviour.
> >
> >
> > But if epiphenomenalism is true, you could never know whether
> > consciousness emerged or not (even if the right sort of behavior was
> > present). The theory offers no motivation for accepting it, other than
> > to hide the problem of explaining consciousness under the rug, where it
> > may be conveniently forgotten.
>
> If I have epiphenomenal consciousness, then others with similar brains
> and behaviour probably also have epiphenomenal consciousness.
>

But for all you know, you are the lone person on Earth who has the right
gene enabling you to be conscious. Since consciousness is not detectable
(according to epiphenomenalism), there would be no way to tell which of
your genes conferred this ability.


>
> >> If it were not so, then in theory you could
> >> make a component that was functionally equivalent, but lacked
> >> consciousness (or lacked a consciousness-enabling property), which
> >> would allow the creation of partial zombies, which I believe are
> >> absurd.
> >
> >
> > Epiphenomenalism implies full zombies are plausible. If full zombies
> > are plausible, then why wouldn't partial zombies be plausible?
>
> Epiphenomenalism implies that full zombies are impossible.
>

Maybe if you define them to be impossible, but they remain logically
possible: since there is no physical causal need for consciousness, it can
be dispensed with entirely without altering the course of physics.


> > If, on the other hand, you think zombies (full or partial) are absurd,
> > and subscribe to a theory where consciousness always results given the
> > right sorts of behavior (under some theory), then can't detection of
> > those right sorts of behavior be used as a test of consciousness?
> > Surely, we must always doubt the theory under which we are operating,
> > and we can no more prove other beings are conscious than we can prove
> > the outside world is real. But if (say, under theory X) we can prove
> > that a process implementing a certain set of computations Y is
> > conscious of certain things Z, then under theory X and computationalism,
> > any other process that implements the computations Y will be conscious
> > of Z.
>
> Yes, except I would leave out the requirement that we prove that
> computations Y are conscious of Z, because I don't think it's possible
> to prove. Instead, I would take it as *given* that computations Y are
> conscious of Z.
>
>
I only said it could be proved "under theory X". We can't prove that
theory X is true, but we can perhaps prove that certain things follow
under that model.

Jason

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to everything-list+unsubscr...@googlegroups.com.
To post to this group, send email to everything-list@googlegroups.com.
Visit this group at http://groups.google.com/group/everything-list.
For more options, visit https://groups.google.com/d/optout.