On 7/9/2024 7:16 AM, Stathis Papaioannou wrote:
On Tue, 9 Jul 2024 at 22:15, Jason Resch <jasonre...@gmail.com> wrote:
On Tue, Jul 9, 2024, 4:33 AM Stathis Papaioannou
<stath...@gmail.com> wrote:
On Tue, 9 Jul 2024 at 04:23, Jason Resch
<jasonre...@gmail.com> wrote:
On Sun, Jul 7, 2024 at 3:14 PM John Clark
<johnkcl...@gmail.com> wrote:
On Sun, Jul 7, 2024 at 1:58 PM Jason Resch
<jasonre...@gmail.com> wrote:
>>> I think such foresight is a
necessary component of intelligence, not a
"byproduct".
>> I agree, I can detect the existence of
foresight in others and so can natural
selection, and that's why we have it. It aids
in getting our genes transferred into the next
generation. But I was talking about
consciousness not foresight, and regardless of
how important we personally think
consciousness is, from evolution's point of
view it's utterly useless, and yet we have it,
or at least I have it.
> you don't seem to think zombies are logically
possible,
Zombies are possible, it's philosophical zombies,
a.k.a. smart zombies, that are impossible because it's
a brute fact that consciousness is the way data
behaves when it is being processed intelligently, or
at least that's what I think. Unless you believe that
all iterated sequences of "why" or "how" questions go
on forever, you must believe that brute facts
exist; and I can't think of a better candidate for one
than consciousness.
> so then epiphenomenalism is false
According to the Internet Encyclopedia of Philosophy,
"Epiphenomenalism is a position in the philosophy of
mind according to which mental states or events are
caused by physical states or events in the brain but
do not themselves cause anything". If that is the
definition then I believe in Epiphenomenalism.
If you believe mental states do not cause anything, then
you believe philosophical zombies are logically possible
(since we could remove consciousness without altering
behavior).
Mental states could be necessarily tied to physical states
without having any separate causal efficacy, and zombies would
not be logically possible. Software is necessarily tied to
hardware activity: if a computer runs a particular program, it
is not optional that the program is implemented. However, the
software does not itself have causal efficacy, causing current
to flow in wires and semiconductors and so on: there is always
a sufficient explanation for such activity in purely physical
terms.
I don't disagree that there is a sufficient explanation in all the
particle movements, each following physical laws.
But then consider the question, how do we decide what level is in
control? You make the case that we should consider the quantum
field level in control because everything is ultimately reducible
to it.
But I don't think that's the best metric for deciding whether it's
in control or not. Do the molecules in the brain tell neurons what
to do, or do neurons tell molecules what to do (e.g. when they fire)?
Or is it some mutually conditioned relationship?
Do neurons fire on their own and tell brains what to do, or do
neurons only fire when other neurons of the whole brain stimulate
them appropriately so they have to fire? Or is it again, another
case of mutualism?
When two people are discussing ideas, are the ideas determining
how each brain thinks and responds, or are the brains determining
the ideas by virtue of generating the words through which they are
expressed?
Though in each of these cases, we can always drop a layer and
explain all the events at that layer, that is not (in my view)
enough of a reason to argue that the events at that layer are "in
charge." Control structures, such as whole brain regions, or
complex computer programs, can involve and be influenced by the
actions of billions of separate events and separate parts, and as
such, they transcend the behaviors of any single physical particle
or physical law.
Consider: whether or not a program halts might only be
determinable by some rules and proof in a mathematical system, and
in this case no physical law will reveal the answer to that
physical system's (the computer's) behavior. So if higher level
laws are required in the explanation, does it still make sense to
appeal to the lower level (physical) laws as providing the
explanation?
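To make this concrete (a standard illustration, not from the original email): consider a program whose halting status hinges on the Collatz conjecture, an open question in number theory. This is a minimal Python sketch; the names and step bound are my own.

```python
# A program whose halting status depends on an open mathematical
# question (the Collatz conjecture), not on any physical law of the
# machine running it. Illustrative sketch only.

def collatz_step(n: int) -> int:
    """One step of the Collatz map: n/2 if n is even, 3n+1 if odd."""
    return n // 2 if n % 2 == 0 else 3 * n + 1

def reaches_one(n: int, max_steps: int = 10_000) -> bool:
    """Return True if n's Collatz orbit reaches 1 within max_steps."""
    while n != 1 and max_steps > 0:
        n = collatz_step(n)
        max_steps -= 1
    return n == 1

# Drop the max_steps bound and loop over all n, halting on the first
# orbit that never reaches 1, and whether that program halts *is* the
# open conjecture: no inspection of the computer's transistor physics
# can settle it; only a mathematical proof can.
```

Tracing the hardware shows each individual step is physically determined, yet the question "does it halt?" lives at the mathematical level, which is exactly the point about higher-level laws being required in the explanation.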
Given the generality of computers, they can also simulate any
imaginable set of physical laws. In such simulations, again I
think appealing to our physical laws as explaining what happens in
these simulations is a mistake, as the simulation is organized in
a manner to make our physical laws irrelevant to the simulation.
So while you could explain what happens in the simulation in terms
of the physics of the computer running it, it adds no explanatory
power: it all cancels out leaving you with a model of the
simulated physics.
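The point about simulated physics can be sketched in a few lines (an illustrative toy, not from the original email; the law and all numbers are invented): a particle evolving under a made-up "gravity" follows a trajectory fixed entirely by the simulated law and initial conditions, and comes out the same on any host computer.

```python
# A toy simulated physics: one particle under an invented constant
# "gravity", stepped with explicit Euler integration. The trajectory
# is determined by the simulated law alone, so it is identical on any
# host machine; the host's own physics adds no explanatory power.

SIM_GRAVITY = -10.0  # a law of the simulated world, not of ours

def simulate(height: float, velocity: float, dt: float, steps: int):
    """Euler-integrate the simulated law; return (height, velocity)."""
    for _ in range(steps):
        velocity += SIM_GRAVITY * dt
        height += velocity * dt
    return height, velocity
```

Running `simulate(100.0, 0.0, 1.0, 3)` gives the same answer whether the host is silicon, vacuum tubes, or a simulated computer in turn: the explanation for the result bottoms out in `SIM_GRAVITY`, not in the host's wires.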
I would say that something has separate causal efficacy of its own if
physical events cannot be predicted without taking that thing into
account. For example, the trajectory of a bullet cannot be predicted
without taking the wind into account. In the brain, the trajectory of
an atom can be predicted without taking consciousness into account.
I think that's doubtful. Some atoms have their motion determined by
perceptions, which are instantiated by things outside the brain; and
which atoms are affected may depend on memory, which depends on the
whole history of the organism.
Brent
--
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email
to everything-list+unsubscr...@googlegroups.com.
To view this discussion on the web visit
https://groups.google.com/d/msgid/everything-list/6fb6efe2-82c8-4e29-9ffb-ace2e5977f0d%40gmail.com.