Craig Weinberg wrote:
> 
> On Oct 18, 10:00 am, benjayk <benjamin.jaku...@googlemail.com> wrote:
>> Craig Weinberg wrote:
>>
>> > Here’s a little thought experiment about free will. Let’s say that
>> > there exists a technology which will allow us to completely control
>> > another person’s neurology. What if two people use this technology to
>> > control each other? If one person started before the other, then they
>> could effectively ‘disarm’ the other’s control over them preemptively,
>> > but what if they both began at the exact same time? Would one ‘win’
>> > control over the other somehow? Would either of them even be able to
>> > try to win? How would they know if they were controlling the other or
>> > being controlled to think they are controlling the other?
>>
>> Complete control over anything is simply impossible. Control is just a
>> feeling and not fundamental.
> 
> It depends what you mean by complete control. If I choose to hit the
> letter m on my keyboard, am I not controlling the keyboard to the
> extent that it is controllable?
> 
You can control everything to the extent that it is controllable for you,
obviously.
But you can't have control over the individual constituents of the keyboard,
all at the same time, in exactly the way you want. For the keyboard you don't
need to, but the brain has no lever you can use to make it do what you want,
because, unlike the keyboard, it has not been designed for that task. It is a
holistic system: if you control a part of it (by sticking an electrode into
your brain, for example), it still won't do what you want it to as a whole.
So to control it, you'd have to do it on a broad scale and at a fundamental
level. But we can't do that, and if someone could, the brain would just be a
puppet steered by a puppeteer, and as such it wouldn't be a brain as a working
system, but rather a mass of flesh that is being manipulated.


Craig Weinberg wrote:
> 
>> The closest one can get to controlling the brain is to make it
>> dysfunctional. It's a bit boring, but the most realistic answer is that
>> both
>> would fall unconscious, as that is the only result of exerting excessive
>> control over a brain.
>> It's the same result as if you try to totally control an ecosystem, or an
>> economy. It'll destroy the natural order, as control is not a fundamental
>> ordering principle.
> 
> I generally agree. The thought experiment is to make people consider
> the fallacy of exclusively bottom up processing. I don't think that
> you could actually control a brain, I'm just saying that if you could,
> how do you get around the fact that it violates the assumption that
> only neurons can control the brain?
I don't think that many people would claim that. You probably mean that the
neurons control your behaviour, but I don't think many people believe that
either. Materialists would rather claim that the neurons are the physical
cause of behaviour, and that consciousness arises as a phenomenon alongside.
I don't see how this is a problem with regard to control; it is just a
claim of magic (mind coming out of non-mind, with no mechanism for how this
could happen) that is not even directly subjectively validated (unlike the
magic of consciousness, which we can directly witness).


Craig Weinberg wrote:
> 
>  The point was to show that bottom up exclusivity fails,
> and that  we must consider that our ordinary intuition of bi-
> directional, high-low processing interdependence may indeed be valid.
Yes, I guessed that this was your point, but I am not sure that your thought
experiment helps it. "Neurons making thought" is quite meaningless from the
start; I don't see how that is affected by what controls what.


Craig Weinberg wrote:
> 
>>
>> It seems like you think of control or will as something fundamental, and
>> I
>> don't see any reason to assume that it is.
> 
> That's a reasonable objection. If it's not fundamental, what is it
> composed of, and why is there an appearance of anything other than
> whatever that is?
It is not composed of anything (I am not a reductionist). Rather it arises
like other feelings/perceptions, for example being hungry (it is just more
essential to our identity).
The reason for its appearance is simply that it serves as a feedback
mechanism: it shows us that "we" are the source of the actions, which brings
attention to our actions (which is obviously quite useful). As such it is not
more fundamental than other such mechanisms (like pain, which shows us that
something is wrong in our body).
Also, in a state of "enlightenment", the feeling of being in control
vanishes (together with the ego that is supposed to be the controller), and
people still function normally, which shows that it can't be that
fundamental. It is an artifact of seeing yourself as a person, separate from
your environment, and intervening in it. Actually it is quite a crude tool,
as we often feel in control when the main cause lies in something else (as
in gambling), and often we don't feel in control of essential interventions
into our environment (like reflexes).


Craig Weinberg wrote:
> 
>> Honestly, I think that our belief that we have "free", independent will is
>> just the arrogance of our ego, which feels it has to have a fundamentally
>> special place in the universe.
> 
> I used to think that too, but now I see that it's every bit as much of
> an egotistical arrogance to de-anthropomorphize ourselves. It's an
> inverted, passive-aggressive egotism to perpetually look to other
> processes above and below our native level of individual cohesion to
> give credit or blame, while all the while hiding invisibly behind the
> voyeur's curtain.
I understand where you're coming from, but I don't see the necessary
relationship to will. We can be the genuinely free source of our actions,
whether our will is free or not.
That we place so much attention on our will is due to us seeing ourselves as
mere doers. Actually we are much more (conscious beings with a rich inner
life), so if anything it is humanizing to give more attention to that than to
mere will.
It is important that we *are* free, not that we can decide "freely" what we
do in particular, if you ask me.


Craig Weinberg wrote:
> 
>  To think that we have no free will is to think that
> we cannot think one way or another that we have free will.
>  It's circular, self-negating reasoning. "I think that I don't really
> think,
> because I think that I can explain that it's not necessary for
> thinking to happen at all". Doesn't really make sense if you step out
> of the system and observe your thinking, opinionated, controlling
> self pronouncing that it controls nothing, thinks for no reason, and
> has opinions for...for what again? What is an opinion doing in a
> cosmos which has no free will? Literally. What does an opinion do? Why
> are you here talking to me? What is controlling you to do this
> more than you yourself? Should I imagine that my neurons care what I
> think?
We can think different things without this being fundamentally related to our
will. We are just free to think, regardless of whether we will it to be so or
not.
I think you are confusing free will and freedom. Your will can be an
expression of your freedom, but more often than not, we use our will to do
something which keeps us in bondage.

When I say that your will is not really free, I am not saying that you are a
puppet that is controlled by your brain. An opinion is valuable to you,
whether you just have it, or you claim to use your will to have it.
The cosmos does not need free will, as it is free without a will. It just
does what it does, including having opinions, talking to interesting people,
etc... Why would all of that be worth nothing if there is no controller
behind it?
I mean, it is natural to want to be the owner of things (these are MY
actions), but we can also "learn" to transcend this, or rather, see that
there is no owner in the first place (just the appearance of one). I find
this liberating, not dehumanizing.


Craig Weinberg wrote:
> 
>> That is not to say that we are predetermined by a material universe;
>> rather, control is just a phenomenon arising in consciousness like all
>> other phenomena, e.g. feelings and perceptions.
> 
> Sure, but that's all that it needs to be. As long as we get the
> sensory feedback that we expect from our motives, then we might as
> well have free will. It just seems to violate parsimony unnecessarily.
> Why does it make sense for consciousness to be completely dominated by
> the experience of control in a universe where that would be utterly
> meaningless? How would such an illusion even work in the sense of how
> does a feeling of will get invented in the first place? If you keep
> throwing dice long enough they will start hallucinating that they are
> an organism with a conscious will? Why? How? It's totally nuts and
> explains nothing.
OK, I agree with you that it is not a meaningless by-product, certainly not.
That doesn't make it fundamental, though. It is fundamental to our
self-image, but that doesn't say much (money or fame is, too, for some
people). Self-image is important in the development of consciousness, so it
makes sense that it uses the feeling of being in control. But ultimately we
don't want to idolize an image, but actually be directly aware (of)/as the
Self (it seems to me there is just one).

benjayk