Re: Is Artificial Life Conscious?

2022-05-03 Thread spudboy100 via Everything List
Beyond my intellectual pay scale, Jason. So far, nobody has developed a Turing-passable 
machine that knocks us down with its effectiveness at passing as a human 
"soul."  I would be happy to let humans be human and instead amp up our 
technological capabilities via machine intelligence: making wonderful 
medicines and anti-pollution systems, and keeping the conversations from human to 
human. For neurobiology I suppose I know what I read. :-(   Beyond this, for me 
it's akin to postulating whether there is a multiverse, and whether it is initiated 
by Everett's MWI or by Linde's (and company's) eternal inflation. 
So the other shoe needs to be dropped: do we get a choice in this?  If we do, 
can we travel back and forth on trade missions, either to clone Earths or to 
entirely different inhabited worlds unrelated to being copies and variations? 
If we are conscious, do we get a choice of this over that? Emulating, via 
complex computer processes, what spindle cells do might 
make machinery conscious, maybe? Should this, will this, get a budget? 


-Original Message-
From: Jason Resch 
To: Everything List 
Sent: Mon, May 2, 2022 7:18 pm
Subject: Re: Is Artificial Life Conscious?



On Mon, May 2, 2022 at 3:39 PM spudboy100 via Everything List 
 wrote:

I had read that spindle cells delineate consciousness, according to 
neurobiologists. Anyone see anything different?



Spindle neurons are very large cells, with their fibers stretching long enough 
to connect distant brain regions.
I would think, then, that an equally valid explanation of spindle neurons is that they are 
a necessary adaptation in any creature with a sufficiently large brain.
Since we tend to associate consciousness with complex behaviors, and complex 
behaviors are often associated with animals that have large brains, I think this may 
account for the correlation between the presumed consciousness of other species 
and the presence of spindle neurons in those species' brains.
At least, I think this is a reasonable alternative explanation.
Jason



Re: The Nature of Contingency: Quantum Physics as Modal Realism

2022-05-03 Thread Bruce Kellett
On Tue, May 3, 2022 at 10:11 PM smitra  wrote:

> On 28-04-2022 07:51, Bruce Kellett wrote:
> > On Thu, Apr 28, 2022 at 3:24 PM Brent Meeker 
> > wrote:
> >
> >> On 4/26/2022 5:32 PM, smitra wrote:
> >>
> >>> On 27-04-2022 01:37, Bruce Kellett wrote:
> >> Changing the weights of the components in the superposition does not
> >> change the conclusion of most observers that the actual probabilities
> >> are 0.5 for each result. This is simple mathematics, and I am amazed
> >> that even after all these years, and all the times I have spelled this
> >> out, you still seek to deny the obvious result. Your logical and
> >> mathematical skill are on a par with those of John Clark.
> >>
> >> It's indeed simple mathematics. You apply that to branch counting to
> >> arrive at the result of equal probabilities.
> >
> > I have not used branch counting. Please stop accusing me of that.
> >
>
> You are considering each branch to have an equal probability when there
> is no logical reason to do so, and when that's also being contradicted
> by QM.
>

I have not introduced any concept of probability. The 2^N branches that are
constructed when both outcomes are realized on each of N Bernoulli trials
are all on the same basis. There is no probability involved. The branches
are all equivalent by construction.

I think you are being confused by the presence of coefficients in the
expansion of the original state: the a and b in

  |psi> = a|0> + b|1>

The linearity of the Schrodinger equation means that the coefficients, a
and b, play no part in the construction of the 2^N possible branches; you
get the same set of 2^N branches whatever the values of a and b. Think of
it this way. If a = sqrt(0.9) and b = sqrt(0.1), the Born rule probability
for |0> is 90%, and the Born rule probability for |1> is 10%. But, by
hypothesis, both outcomes occur with certainty on each trial. There is a
conflict here. You cannot rationally have a 10% probability for something
that is certain to happen. This is why some people have resorted to the
idea that there are in fact an infinite number of branches, both before and
after the measurement. What the measurement does is partition these
branches in the ratio of the Born probabilities. But this is just a
suggestion. There is nothing in the Schrodinger equation, or in quantum
mechanics itself, that would suggest that there are an infinite number of
branches. In fact, that idea introduces a raft of problems of its own --
what is the measure over this infinity of branches? What does it mean to
partition infinity in the ratio of 0.9:0.1? What is the mechanism
(necessarily outside the Schrodinger equation) that achieves this?
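
(To make the conflict concrete, here is a minimal numerical sketch, added for
illustration and not part of the original post. It enumerates the 2^N branches
for the stated example a = sqrt(0.9), b = sqrt(0.1), in Python; the branch set
itself does not depend on a or b, while the Born weight of a branch does.)

from itertools import product
from math import comb

N = 10
a2, b2 = 0.9, 0.1   # Born weights |a|^2 and |b|^2 for outcomes 0 and 1

# Every length-N bit string labels one branch; this set is the same
# whatever the values of a and b.
branches = list(product([0, 1], repeat=N))
assert len(branches) == 2**N

# Naive branch counting: every branch counts equally, so the fraction of
# branches with k ones is C(N,k)/2^N, which peaks at k = N/2.
count_fraction = {k: comb(N, k) / 2**N for k in range(N + 1)}

# Born rule: a branch with k ones carries weight (a^2)^(N-k) * (b^2)^k, so the
# total weight at k ones is C(N,k) * a2^(N-k) * b2^k, which peaks near k = N*b2.
born_weight = {k: comb(N, k) * a2**(N - k) * b2**k for k in range(N + 1)}

print(max(count_fraction, key=count_fraction.get))  # 5, i.e. ~50% ones
print(max(born_weight, key=born_weight.get))        # 1, i.e. ~10% ones

(The mismatch between the two peaks is the conflict described above: uniform
counting over the 2^N branches yields relative frequencies near 0.5 whatever
the amplitudes, while the Born weight concentrates near |b|^2.)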

You are concerned that a collapse introduces unknown physics outside the
Schrodinger equation. You will have to be careful that your own solution
does not introduce even more outrageous physics outside the Schrodinger
equation. Collapse, after all, has a perfectly reasonable mechanism in
terms of the flashes of relativistic GRW theory.

My conclusion from this is that Everett (and MWI) is inconsistent with the
Born rule. So your idea of QM without collapse but with the Born rule is
simply incoherent. There can be no such theory that is internally
consistent.




> >>> So, the conclusion has to be that one should not do branch
> >>> counting. The question is then if this disproves the MWI. If by
> >>> MWI we mean QM minus collapse then clearly not. Because in that
> >>> case we use the Born rule to compute the probability of outcomes
> >>> and assume that after a measurement we have different sectors for
> >>> observers who have observed the different outcomes with the
> >>> probabilities as given by the Born rule.
> >
> > In which case the Born rule is just an additional arbitrary
> > assumption: it is not part of the Schrodinger equation. Your theory of
> > QM minus collapse is not well-defined. You simply take whatever you
> > want from text-book quantum mechanics, with no regard to the
> > consistency of your model.
> >
>
> QM includes the Born rule. QM minus collapse is just that: QM minus
> collapse. It's not QM minus collapse minus the Born rule.
>
>
> >>> You then want to argue against that by claiming that your argument
> >>> applies generally and would not allow one to give different
> >>> sectors unequal probabilities. But that's nonsense, because you
> >>> make the hidden assumption of equal probabilities right from the
> >>> start.
> >
> > I simply assume the Schrodinger equation. Then, following Everett, we
> > take it to be deterministic, so that all branches occur on every
> > trial. Since it is deterministic, there is no concept of probability
> > inherent in the Schrodinger equation, and I do not assume any
> > definition of probability. So the branches occur as they occur, there
> > is no assumption of equal probability. It is just that the
> > construction means that  all 2^N branches occur on the same basis and
> > necessarily count equally in the overall branching picture.

Entanglement and Superposition Are Equivalent Concepts in Any Physical Theory

2022-05-03 Thread Dirk Van Niekerk
May be of interest:

Entanglement and Superposition Are Equivalent Concepts in Any Physical 
Theory

ABSTRACT

We prove that given any two general probabilistic theories (GPTs) the 
following are equivalent: (i) each theory is nonclassical, meaning that 
neither of their state spaces is a simplex; (ii) each theory satisfies a 
strong notion of incompatibility equivalent to the existence of 
“superpositions”; and (iii) the two theories are entangleable, in the sense 
that their composite exhibits either entangled states or entangled 
measurements. Intuitively, in the post-quantum GPT setting, a superposition 
is a set of two binary ensembles of states that are unambiguously 
distinguishable if the ensemble is revealed before the measurement has 
occurred, but not if it is revealed after. This notion is important because 
we show that, just like in quantum theory, superposition in the form of 
strong incompatibility is sufficient to realize the Bennett-Brassard 1984 
protocol for secret key distribution.


https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.128.160402

Free access preprint:
https://arxiv.org/abs/2109.04446
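
(For readers unfamiliar with the protocol named at the end of the abstract,
here is a toy simulation of textbook BB84 key sifting, added purely for
orientation; it sketches the standard quantum-mechanical version in Python,
with the measurement step modelled classically, and is not taken from the
paper, which works in the more general GPT setting.)

import random

def bb84_sift(n=32):
    """Toy BB84 key sifting with no eavesdropper and no noise."""
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice('ZX') for _ in range(n)]   # preparation bases
    bob_bases   = [random.choice('ZX') for _ in range(n)]   # measurement bases
    # If Bob measures in Alice's basis he recovers her bit; otherwise his
    # outcome is an unbiased coin flip (the quantum behaviour, modelled classically).
    bob_bits = [bit if ab == bb else random.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Publicly compare bases and keep only the positions where they agree.
    keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]

ka, kb = bb84_sift()
print(ka == kb)   # True in this noiseless, eavesdropper-free toy model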

Dirk



Re: The Nature of Contingency: Quantum Physics as Modal Realism

2022-05-03 Thread Brent Meeker




On 5/3/2022 5:00 AM, smitra wrote:

On 28-04-2022 07:24, Brent Meeker wrote:

On 4/26/2022 5:32 PM, smitra wrote:


On 27-04-2022 01:37, Bruce Kellett wrote:
On Tue, Apr 26, 2022 at 10:03 AM smitra  wrote:

On 24-04-2022 03:16, Bruce Kellett wrote:

A moment's thought should make it clear to you that this is not possible. If
both possibilities are realized, it cannot be the case that one has twice the
probability of the other. In the long run, if both are realized they have equal
probabilities of 1/2.

The probabilities do not have to be 1/2. Suppose one million people participate
in a lottery such that there will be exactly one winner. The probability that
one given person will win is then one in a million. Suppose now that we create
one million people using a machine and then organize such a lottery. The
probability that one given newly created person will win is then also one in a
million. The machine can be adjusted to create any set of persons we like: it
can create one million identical persons, or almost identical persons, or
totally different persons. If we then create one million almost identical
persons, the probability is still one in a million. This means that in the
limit of identical persons, the probability will be one in a million.

Why would the probability suddenly become 1/2 if the machine is set to create
exactly identical persons, while the probability would be one in a million if
we create persons that are almost, but not quite, identical?

Your lottery example is completely beside the point.

It provides for an example of a case where your logic does not apply.

I think you should pay more attention to the mathematics of the binomial
distribution. Let me explain it once more: if every outcome is realized on
every trial of a binary process, then after the first trial, we have a branch
with result 0 and a branch with result 1. After two trials we have four
branches, with results 00, 01, 10, and 11; after 3 trials, we have branches
registering 000, 001, 011, 010, 100, 101, 110, and 111. Notice that these
branches represent all possible binary strings of length 3.

After N trials, there are 2^N distinct branches, representing all possible
binary sequences of length N. (This is just like Pascal's triangle.) As N
becomes very large, we can approximate the binomial distribution with the
normal distribution, with mean 0.5 and standard deviation that decreases as
1/sqrt(N). In other words, the majority of trials will have equal, or
approximately equal, numbers of 0s and 1s. Observers in these branches will
naturally take the probability to be approximated by the relative frequencies
of 0s and 1s. In other words, they will take the probability of each outcome
to be 0.5.

The problem with this is that you just assume that all branches are equally
probable. You don't make that explicit; it's implicitly assumed, but it's just
an assumption. You are simply doing branch counting.

But it shows why you can't use branch counting.  There's no physical mechanism
for translating the _a_ and _b_ of  _|psi> = a|0> + b|1>_ into numbers of
branches.  To implement that you have to put it in "by hand" that the branches
have weights or numerosity of _a_ and _b_.  This is possible, but it gives the
lie to the MWI mantra of "It's just the Schroedinger equation."



Yes, one has to interpret the wavefunction as giving probabilities. 
That's still better than assuming that the physical state evolves 
sometimes according to the Schrödinger equations and sometimes by 
undergoing a nondeterministic collapse without there being any 
evidence for such collapses, without even credible theoretical models 
for it.


Is there any evidence that is NOT from collapse?  How does it get 
recorded?  Where is it?  A credible theoretical model is one that 
predicts the observed result...not necessarily one that satisfies your 
metaphysical prejudices.  You seem to have adopted a Platonist view of 
physics.  But as Sean Carroll (a proponent of MWI) remarked, "But all 
human progress has come from studying the shadows on the wall."


Brent



Re: The Nature of Contingency: Quantum Physics as Modal Realism

2022-05-03 Thread Brent Meeker




On 5/3/2022 4:48 AM, smitra wrote:

On 28-04-2022 07:23, Brent Meeker wrote:

On 4/27/2022 10:38 AM, smitra wrote:

On 27-04-2022 04:08, Brent Meeker wrote:

On 4/26/2022 5:32 PM, smitra wrote:


On 27-04-2022 01:37, Bruce Kellett wrote:
On Tue, Apr 26, 2022 at 10:03 AM smitra  wrote:

On 24-04-2022 03:16, Bruce Kellett wrote:

A moment's thought should make it clear to you that this is not possible. If
both possibilities are realized, it cannot be the case that one has twice the
probability of the other. In the long run, if both are realized they have equal
probabilities of 1/2.

The probabilities do not have to be 1/2. Suppose one million people participate
in a lottery such that there will be exactly one winner. The probability that
one given person will win is then one in a million. Suppose now that we create
one million people using a machine and then organize such a lottery. The
probability that one given newly created person will win is then also one in a
million. The machine can be adjusted to create any set of persons we like: it
can create one million identical persons, or almost identical persons, or
totally different persons. If we then create one million almost identical
persons, the probability is still one in a million. This means that in the
limit of identical persons, the probability will be one in a million.

Why would the probability suddenly become 1/2 if the machine is set to create
exactly identical persons, while the probability would be one in a million if
we create persons that are almost, but not quite, identical?

Your lottery example is completely beside the point.

It provides for an example of a case where your logic does not apply.

I think you should pay more attention to the mathematics of the binomial
distribution. Let me explain it once more: if every outcome is realized on
every trial of a binary process, then after the first trial, we have a branch
with result 0 and a branch with result 1. After two trials we have four
branches, with results 00, 01, 10, and 11; after 3 trials, we have branches
registering 000, 001, 011, 010, 100, 101, 110, and 111. Notice that these
branches represent all possible binary strings of length 3.

After N trials, there are 2^N distinct branches, representing all possible
binary sequences of length N. (This is just like Pascal's triangle.) As N
becomes very large, we can approximate the binomial distribution with the
normal distribution, with mean 0.5 and standard deviation that decreases as
1/sqrt(N). In other words, the majority of trials will have equal, or
approximately equal, numbers of 0s and 1s. Observers in these branches will
naturally take the probability to be approximated by the relative frequencies
of 0s and 1s. In other words, they will take the probability of each outcome
to be 0.5.

The problem with this is that you just assume that all branches are equally
probable. You don't make that explicit; it's implicitly assumed, but it's just
an assumption. You are simply doing branch counting.

But it shows why you can't use branch counting.  There's no physical mechanism
for translating the _a_ and _b_ of  _|psi> = a|0> + b|1>_ into numbers of
branches.  To implement that you have to put it in "by hand" that the branches
have weights or numerosity of _a_ and _b_.  This is possible, but it gives the
lie to the MWI mantra of "It's just the Schroedinger equation."



The problem is with giving a physical interpretation to the 
mathematics here. If we take MWI to be QM without collapse, then we 
have not specified anything about branches yet. Different MWI 
advocates have published different ideas about this, and they can't 
all be right. But at heart MWI is just QM without collapse. To 
proceed in a rigorous way, one has to start with what counts as a 
branch. It seems to me that this has to involve the definition of an 
observer, and that requires a theory about what observation is. 
I.m.o, this has to be done by defining an observer as an algorithm, 
but many people think that you need to invoke environmental 
decoherence. People like e.g. Zurek using the latter definition have 
attempted to derive the Born rule based on that idea.


I.m.o., one has to start working out a theory based on rigorous 
definitions and then see where that leads to, instead of arguing 
based on vague, ill defined notions.


"Observer as an algorithm" seems pretty ill defined to me. Which
algorithm?  applied to what input?  How does the algorithm, a Platonic
construct, interface with the physical universe? Decoherence seems
much better defined.  And so does QBism.

Any human observer is arguably implemented by an algorithm run by a 
brain. 


Plus sensors, plus environment... you call that "well defined"??

So, for any given observer at some time, there exists a precisely 
defined algorithm that defines that observer. In practice we cannot 
provide for any such definition, but from the point of view of the 
theory, it's important to take into account the way an observer should be rigorously defined.

Re: The Nature of Contingency: Quantum Physics as Modal Realism

2022-05-03 Thread Brent Meeker




On 5/3/2022 4:40 AM, smitra wrote:

On 28-04-2022 02:14, Brent Meeker wrote:

On 4/27/2022 2:00 PM, smitra wrote:


If you agree, and are prepared,
with me, to throw out Everett, then we agree, and there is nothing

more to be argued about (at least, until you present some
different
complete theory).

I'm open to the idea that QM itself may only be an approximation to
a more fundamental theory. The arguments in favor of no collapse are
strong arguments but you then do get this issue with probability
that you have discussed here. The disagreement with you about this
is that I  don't see it as a fatal inconsistency that would prove
the MWI to be wrong. Probabilities for the different branches do not
have to be equal. But that doesn't mean that this looks to be a
rather unnatural feature of the theory. This suggests that a more
fundamental theory exists from which one could derive quantum
mechanics with its formalism involving amplitudes and the Born rule
as an approximation.


If there are probabilities attached to the branches, then Gleason's theorem
shows that the probabilities must satisfy the Born rule.  So I don't see any
inconsistency in simply saying they are probabilities of measurement results;
that's Copenhagen.  But if they are probabilities of results, that implies that
some things happen and others don't... otherwise what does "probability" mean
and what use is it as an empirical concept?  That brings back the original
problem of CI: where and how is this happening defined?
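
(For reference, a standard statement of the theorem being invoked here, added
for clarity and not a quotation from the thread: for a Hilbert space of
dimension at least 3, Gleason's theorem says that any countably additive
probability assignment over projection operators P must take the
density-operator form, which for a pure state reduces to the Born rule.)

\mu(P) \;=\; \operatorname{Tr}(\rho P) \quad\text{for some density operator } \rho,
\qquad\text{so for a pure state } |\psi\rangle:\quad
\Pr(i) \;=\; \langle\psi|P_i|\psi\rangle \;=\; |\langle i|\psi\rangle|^{2}.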



If there are 3 copies of an observer and 2 experience outcome A and 1 
experiences outcome B then the probability of the observer 
experiencing outcome B is 1/3. 


That doesn't even parse.  There is no THE observer.  The probability of 
any one of the three experiencing B is 1/3.  The Borel set is AAB, ABA, BAA.


Here we should note that the personal identity of an observer is 
determined by all the information in the brain and is therefore 
different from the different outcomes. So, we always have (slightly) 
different observers observing different things, which is not all that 
different from starting with 3 different people of whom 2 experience 
outcome A and 1 experiences outcome B.


In which case you have to say that if I choose one of the three persons with
equal probability, the probability that I choose the one who experienced B is
1/3.  Then the Borel set is the set of choices... not the set of persons.
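
(A trivial enumeration of this point, added for concreteness and assuming
nothing beyond the three outcomes stated above: the 1/3 comes from a uniform
random choice over the three copies, i.e. over the arrangement AAB / ABA / BAA,
not from anything intrinsic to the persons.)

persons = ['A', 'A', 'B']     # outcomes experienced by the three copies

# Sample space of *choices*: pick one of the three persons uniformly at random.
p_B = sum(1 for outcome in persons if outcome == 'B') / len(persons)
print(p_B)   # 0.333..., i.e. 1/3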


Brent



Saibal



Brent



RE: Is Artificial Life Conscious?

2022-05-03 Thread Philip Benjamin
[Philip Benjamin]
 The question "If simple creatures like worms or insects are conscious (because 
they have brains, and evolved), then wouldn't these artificial life forms be 
conscious for the same reasons?" is irrelevant. Simple creatures reproduce. 
Will robots reproduce? Baby robots? Do they have a desire for, and grow on, the 
pablum of metal powder and vaseline? Simple creatures trans-speciated from what? 
Worms evolve into worms? The oldest fossils found are algae and bacteria.  
Still the same type of bacteria and algae today!! 
Philip Benjamin
Nonconformist to Marxist-Socialist pagan globalism of the WAMP.  
-Original Message-
From: everything-list@googlegroups.com  On 
Behalf Of Russell Standish
Sent: Monday, May 2, 2022 4:30 AM
To: Everything List 
Subject: Re: Is Artificial Life Conscious?

On Fri, Apr 22, 2022 at 09:38:40PM -0500, Jason Resch wrote:
> Artificial Life such as these organisms: 
> Have neural networks that evolved through natural selection, can adapt 
> to a changing environment, and can learn to distinguish between "food" and 
> "poison"
> in their environment.
> 
> If simple creatures like worms or insects are conscious, (because they 
> have brains, and evolved), then wouldn't these artificial life forms 
> be conscious for the same reasons?
> 
> Why or why not?

Most insects can't be conscious (see my paper "Ants are not conscious"). Most 
ALife forms created to date are simpler than insects, and probably even worms, 
so are unlikely to be conscious either.


-- 


Dr Russell Standish                    Phone 0425 253119 (mobile)
Principal, High Performance Coders hpco...@hpcoders.com.au
  
http://www.hpcoders.com.au




Re: The Nature of Contingency: Quantum Physics as Modal Realism

2022-05-03 Thread smitra

On 03-05-2022 14:11, Bruce Kellett wrote:

On Tue, May 3, 2022 at 9:40 PM smitra  wrote:


On 28-04-2022 02:14, Brent Meeker wrote:

On 4/27/2022 2:00 PM, smitra wrote:


If you agree, and are prepared, with me, to throw out Everett, then we agree,
and there is nothing more to be argued about (at least, until you present some
different complete theory).

I'm open to the idea that QM itself may only be an approximation to a more
fundamental theory. The arguments in favor of no collapse are strong arguments,
but you then do get this issue with probability that you have discussed here.
The disagreement with you about this is that I don't see it as a fatal
inconsistency that would prove the MWI to be wrong. Probabilities for the
different branches do not have to be equal. But that doesn't mean that this
looks to be a rather unnatural feature of the theory. This suggests that a more
fundamental theory exists from which one could derive quantum mechanics with
its formalism involving amplitudes and the Born rule as an approximation.

If there are probabilities attached to the branches, then Gleason's theorem
shows that the probabilities must satisfy the Born rule. So I don't see any
inconsistency in simply saying they are probabilities of measurement results;
that's Copenhagen. But if they are probabilities of results, that implies that
some things happen and others don't... otherwise what does "probability" mean
and what use is it as an empirical concept? That brings back the original
problem of CI: where and how is this happening defined?

If there are 3 copies of an observer and 2 experience outcome A and 1
experiences outcome B, then the probability of the observer experiencing
outcome B is 1/3. Here we should note that the personal identity of an observer
is determined by all the information in the brain and is therefore different
for the different outcomes. So, we always have (slightly) different observers
observing different things, which is not all that different from starting with
3 different people of whom 2 experience outcome A and 1 experiences outcome B.


That's just branch counting, which is known not to work.



The complete physical state is not in doubt in this case. In your argument you 
apply your reasoning to QM, but you remove the information about the amplitudes 
from the wavefunction, so you replace QM by a straw-man version of QM that then 
fails to describe the real world correctly.


While the MWI is QM minus collapse, what you do is consider QM minus collapse 
minus the Born rule, then argue that this doesn't work and that therefore the 
MWI is wrong.


Saibal


Bruce



Re: Is Artificial Life Conscious?

2022-05-03 Thread John Clark
On Mon, May 2, 2022 at 7:06 PM Russell Standish 
wrote:


> *> Hi John, always a pleasure to cross swords with your brain :).*


Greetings Russell, and I feel the same way you do; or at least I'm pretty
sure I do, but there is always a bit of uncertainty when determining the
conscious state of another human being.

* > I know that I am conscious. Therefore I must be a member of the set of
> consious entities. It is true I don't know what else is in the set.*


Mathematical proofs demand absolute certainty, and if you demand absolute
certainty the possibility that the set of conscious entities contains only
one member cannot be excluded by any logical argument. But of course in our
everyday lives we never encounter absolute certainty nor do we need it,
except when we're taking a calculus examination.
>
>
* > I do assume that all humans are conscious*


I assume the same thing for 2 reasons:

1) The evidence is overwhelming that Charles Darwin was right, thus
Evolution produced me and I am conscious, but evolution can NOT directly
see consciousness any more than we can directly see consciousness in others,
because consciousness alone, regardless of how much we may value it, can
confer no reproductive advantage, and that's all Evolution cares about.
However, Evolution most certainly CAN see intelligent behavior. The only
thing that is compatible with all this is that consciousness is the
inevitable byproduct of intelligence, so it must be a brute fact that
consciousness is the way data feels when it is being processed
intelligently.  A corollary of this would be that the Turing Test works
just as well for consciousness as it does for intelligence. It's far from
perfect but the Turing Test is the only tool we have to investigate
consciousness.

2) I simply could not function unless I assumed I was not the only
conscious being in the universe.

> *(at least at some point in their lives),*


Yes, neither of us believes that our fellow human beings are conscious when
they're sleeping, or under anesthesia, or dead, and for the same reason,
when they are in those states they just don't behave very intelligently.
And that's why I'm interested in AI and intelligence research, but I'm not
interested in consciousness research. And that's also why consciousness
research has not advanced an inch, or even a nanometer, in a 1000 years.

>
> *but if you assume the opposite, then the argument is even stronger.*


Assuming the opposite would be assuming that everything is always conscious
regardless of its behavior, so even rocks are conscious, even electrons.

*> No - it is a deduction. You're reading the abstract. It is usual to
> state the conclusion in the abstract so you know whether it is worth
> digging into the paper body to see the proof.*


It takes time to carefully read a scientific paper, and so the abstract was
invented to give a reader just enough information to decide if reading the
entire paper is worth their time. Your abstract makes clear that the
conclusion that insects are not conscious is based on "*finding oneself a
member of a particular reference class of conscious beings*" with the
implicit assumption the set contains more than one member. I concede that
if one makes that assumption then it might not be unreasonable to conclude
that insects are not conscious (although I see no reason to believe that
consciousness is an all or nothing matter) , but now you admit you "*don't
know what else is in the set"* of conscious beings. And determining what
else is in that set is exactly what this entire controversy is all about.

John K Clark    See what's on my new list at Extropolis

ifq



Re: The Nature of Contingency: Quantum Physics as Modal Realism

2022-05-03 Thread smitra

On 28-04-2022 07:51, Bruce Kellett wrote:

On Thu, Apr 28, 2022 at 3:24 PM Brent Meeker 
wrote:


On 4/26/2022 5:32 PM, smitra wrote:


On 27-04-2022 01:37, Bruce Kellett wrote:

Changing the weights of the components in the superposition does not change
the conclusion of most observers that the actual probabilities are 0.5 for each
result. This is simple mathematics, and I am amazed that even after all these
years, and all the times I have spelled this out, you still seek to deny the
obvious result. Your logical and mathematical skill are on a par with those of
John Clark.

It's indeed simple mathematics. You apply that to branch counting to
arrive at the result of equal probabilities.


I have not used branch counting. Please stop accusing me of that.



You are considering each branch to have an equal probability when there 
is no logical reason to do so, and when that's also being contradicted 
by QM.


So, the conclusion has to be that one should not do branch
counting. The question is then if this disproves the MWI. If by
MWI we mean QM minus collapse then clearly not. Because in that
case we use the Born rule to compute the probability of outcomes
and assume that after a measurement we have different sectors for
observers who have observed the different outcomes with the
probabilities as given by the Born rule.


In which case the Born rule is just an additional arbitrary
assumption: it is not part of the Schrodinger equation. Your theory of
QM minus collapse is not well-defined. You simply take whatever you
want from text-book quantum mechanics, with no regard to the
consistency of your model.



QM includes the Born rule. QM minus collapse is just that: QM minus 
collapse. It's not QM minus collapse minus the Born rule.




You then want to argue against that by claiming that your argument
applies generally and would not allow one to give different
sectors unequal probabilities. But that's nonsense, because you
make the hidden assumption of equal probabilities right from the
start.


I simply assume the Schrodinger equation. Then, following Everett, we
take it to be deterministic, so that all branches occur on every
trial. Since it is deterministic, there is no concept of probability
inherent in the Schrodinger equation, and I do not assume any
definition of probability. So the branches occur as they occur, there
is no assumption of equal probability. It is just that the
construction means that  all 2^N branches occur on the same basis and
necessarily count equally in the overall branching picture.



Why do they necessarily count equally? What is the meaning of the 
wavefunction? Why don't the amplitudes matter?
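
(In symbols, added for clarity: after N binary trials, the branch labelled by a
bit string with k ones has Born weight)

w(k) \;=\; |a|^{2(N-k)}\,|b|^{2k},
\qquad
\sum_{k=0}^{N} \binom{N}{k}\, w(k) \;=\; \bigl(|a|^{2} + |b|^{2}\bigr)^{N} \;=\; 1,

(whereas uniform branch counting assigns every branch the same weight 2^{-N},
independent of a and b. The disagreement in the thread is over which of these
two assignments, if either, the bare Schrodinger evolution licenses.)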




There is nothing in QM that says that branches must count equally,
and the lottery example I gave makes it clear that you can have
branching with unequal probabilities in classical physics.


As I have said, there is no classical analogue of an interaction in
which all outcomes necessarily occur. So your lottery example is
useless. There is no concept of probability involved in any of this.



The lottery example I gave clearly is a classical example in which all 
outcomes necessarily occur. Your reasoning does not involve any QM at all; you 
just apply it to the MWI. Your argument goes through also in the case of the 
lottery example, in which case it leads to an obviously wrong conclusion. So, 
it's your reasoning that's at fault, not the MWI taken to be QM minus collapse.
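
(A minimal simulation of the lottery example as stated above, added for
illustration; the "similarity" parameter is a dummy standing in for how alike
the machine-made persons are, and the point is that it never enters the odds.)

import random

def lottery(n_persons=1000, n_draws=100_000, similarity=1.0):
    """Classical lottery with exactly one winner per draw.

    'similarity' never enters the calculation: the chance that one given
    person wins stays 1/n_persons however alike the persons are made.
    """
    me = 0                                    # one given person
    wins = sum(random.randrange(n_persons) == me for _ in range(n_draws))
    return wins / n_draws                     # ~ 1/n_persons

print(lottery(similarity=0.5))   # ~0.001
print(lottery(similarity=1.0))   # ~0.001, unchanged in the limit of identical persons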


Saibal


Bruce





Yes, there's nothing in QM that says the branches must count
equally.  But there's also nothing in the evolution of Schroedingers
equation that they must count as _a^2_ and _b^2_.  Of course IF they
are probabilities then it follows from Gleason's theorem that they
follow the Born rule.  But in that case you have reintroduced almost
all the philosophical problems of the Copenhagen interpretation.
When exactly does this splitting occur?  Can the split be into
irrational numbers of branches?  A splitting is in some particular
basis and not in other bases.  What determines the pointer basis?

Brent



Re: The Nature of Contingency: Quantum Physics as Modal Realism

2022-05-03 Thread Bruce Kellett
On Tue, May 3, 2022 at 9:40 PM smitra  wrote:

> On 28-04-2022 02:14, Brent Meeker wrote:
> > On 4/27/2022 2:00 PM, smitra wrote:
> >
> >>> If you agree, and are prepared,
> >>> with me, to throw out Everett, then we agree, and there is nothing
> >>>
> >>> more to be argued about (at least, until you present some
> >>> different complete theory).
> >> I'm open to the idea that QM itself may only be an approximation to
> >> a more fundamental theory. The arguments in favor of no collapse are
> >> strong arguments but you then do get this issue with probability
> >> that you have discussed here. The disagreement with you about this
> >> is that I  don't see it as a fatal inconsistency that would prove
> >> the MWI to be wrong. Probabilities for the different branches do not
> >> have to be equal. But that doesn't mean that this looks to be a
> >> rather unnatural feature of the theory. This suggests that a more
> >> fundamental theory exists from which one could derive quantum
> >> mechanics with its formalism involving amplitudes and the Born rule
> >> as an approximation.
> >
> > If there are probabilities attached to the branches, then Gleason's
> > theorem shows that the probabilities must satisfy the Born rule.  So I
> > don't see any inconsistency in simply saying they are probabilities
> > of measurement results,  that's Copenhagen.  But if they are
> > probabilities of results that implies that some things happen and
> > others don't... otherwise what does "probability" mean and what use is
> > it as an empirical concept?  That brings back the original problem of
> > CI, where and how is this happening defined?
> >
>
> If there are 3 copies of an observer and 2 experience outcome A and 1
> experiences outcome B then the probability of the observer experiencing
> outcome B is 1/3. Here we should note that the personal identity of an
> observer is determined by all the information in the brain and is
> therefore different from the different outcomes. So, we always have
> (slightly) different observers observing different things, which is not
> all that different from starting with 3 different people of whom 2
> experience outcome A and 1 experiences outcome B.
>

That's just branch counting, which is known not to work.

Bruce



Re: The Nature of Contingency: Quantum Physics as Modal Realism

2022-05-03 Thread smitra

On 28-04-2022 07:24, Brent Meeker wrote:

On 4/26/2022 5:32 PM, smitra wrote:


On 27-04-2022 01:37, Bruce Kellett wrote:
On Tue, Apr 26, 2022 at 10:03 AM smitra  wrote:

On 24-04-2022 03:16, Bruce Kellett wrote:

A moment's thought should make it clear to you that this is not possible. If
both possibilities are realized, it cannot be the case that one has twice the
probability of the other. In the long run, if both are realized they have equal
probabilities of 1/2.

The probabilities do not have to be 1/2. Suppose one million people participate
in a lottery such that there will be exactly one winner. The probability that
one given person will win is then one in a million. Suppose now that we create
one million people using a machine and then organize such a lottery. The
probability that one given newly created person will win is then also one in a
million. The machine can be adjusted to create any set of persons we like: it
can create one million identical persons, or almost identical persons, or
totally different persons. If we then create one million almost identical
persons, the probability is still one in a million. This means that in the
limit of identical persons, the probability will be one in a million.

Why would the probability suddenly become 1/2 if the machine is set to create
exactly identical persons, while the probability would be one in a million if
we create persons that are almost, but not quite, identical?

Your lottery example is completely beside the point.

It provides for an example of a case where your logic does not apply.

I think you should pay more attention to the mathematics of the binomial
distribution. Let me explain it once more: if every outcome is realized on
every trial of a binary process, then after the first trial, we have a branch
with result 0 and a branch with result 1. After two trials we have four
branches, with results 00, 01, 10, and 11; after 3 trials, we have branches
registering 000, 001, 011, 010, 100, 101, 110, and 111. Notice that these
branches represent all possible binary strings of length 3.

After N trials, there are 2^N distinct branches, representing all possible
binary sequences of length N. (This is just like Pascal's triangle.) As N
becomes very large, we can approximate the binomial distribution with the
normal distribution, with mean 0.5 and standard deviation that decreases as
1/sqrt(N). In other words, the majority of trials will have equal, or
approximately equal, numbers of 0s and 1s. Observers in these branches will
naturally take the probability to be approximated by the relative frequencies
of 0s and 1s. In other words, they will take the probability of each outcome
to be 0.5.

The problem with this is that you just assume that all branches are equally
probable. You don't make that explicit; it's implicitly assumed, but it's just
an assumption. You are simply doing branch counting.

But it shows why you can't use branch counting.  There's no physical mechanism
for translating the _a_ and _b_ of  _|psi> = a|0> + b|1>_ into numbers of
branches.  To implement that you have to put it in "by hand" that the branches
have weights or numerosity of _a_ and _b_.  This is possible, but it gives the
lie to the MWI mantra of "It's just the Schroedinger equation."



Yes, one has to interpret the wavefunction as giving probabilities. 
That's still better than assuming that the physical state evolves 
sometimes according to the Schrödinger equations and sometimes by 
undergoing a nondeterministic collapse without there being any evidence 
for such collapses, without even credible theoretical models for it.




Brent


The important point to notice is that this result of all possible binary
sequences for N trials is independent of the coefficients in the binary
expansion of the state: |psi> = a|0> + b|1>.

Changing the weights of the components in the superposition does not change
the conclusion of most observers that the actual probabilities are 0.5 for each
result. This is simple mathematics, and I am amazed that even after all these
years, and all the times I have spelled this out, you still seek to deny the
obvious result. Your logical and mathematical skill are on a par with those of
John Clark.


It's indeed simple mathematics. You apply that to branch counting to
arrive at the result of equal probabilities. So, the conclusion has
to be that one should not do branch counting. The question is then
if this disproves the MWI. If by MWI we mean QM minus collapse then
clearly not. Because in that case we use the Born rule to compute
the probability of outcomes and assume that after a measurement we
have different sectors for observers who have observed the different
outcomes with the probabilities as given by the Born rule.

You then want to argue against that by claiming that your argument
applies generally and would not allow one to give different sectors
unequal probabilities. But that's nonsense, because you make the
hidden assumption of equal probabilities right from the start.

Re: The Nature of Contingency: Quantum Physics as Modal Realism

2022-05-03 Thread smitra

On 28-04-2022 07:23, Brent Meeker wrote:

On 4/27/2022 10:38 AM, smitra wrote:

On 27-04-2022 04:08, Brent Meeker wrote:

On 4/26/2022 5:32 PM, smitra wrote:


On 27-04-2022 01:37, Bruce Kellett wrote:
On Tue, Apr 26, 2022 at 10:03 AM smitra  wrote:

On 24-04-2022 03:16, Bruce Kellett wrote:

A moment's thought should make it clear to you that this is not possible. If
both possibilities are realized, it cannot be the case that one has twice the
probability of the other. In the long run, if both are realized they have equal
probabilities of 1/2.

The probabilities do not have to be 1/2. Suppose one million people participate
in a lottery such that there will be exactly one winner. The probability that
one given person will win is then one in a million. Suppose now that we create
one million people using a machine and then organize such a lottery. The
probability that one given newly created person will win is then also one in a
million. The machine can be adjusted to create any set of persons we like: it
can create one million identical persons, or almost identical persons, or
totally different persons. If we then create one million almost identical
persons, the probability is still one in a million. This means that in the
limit of identical persons, the probability will be one in a million.

Why would the probability suddenly become 1/2 if the machine is set to create
exactly identical persons, while the probability would be one in a million if
we create persons that are almost, but not quite, identical?

Your lottery example is completely beside the point.

It provides for an example of a case where your logic does not apply.

I think you should pay more attention to the mathematics of the binomial
distribution. Let me explain it once more: if every outcome is realized on
every trial of a binary process, then after the first trial, we have a branch
with result 0 and a branch with result 1. After two trials we have four
branches, with results 00, 01, 10, and 11; after 3 trials, we have branches
registering 000, 001, 011, 010, 100, 101, 110, and 111. Notice that these
branches represent all possible binary strings of length 3.

After N trials, there are 2^N distinct branches, representing all possible
binary sequences of length N. (This is just like Pascal's triangle.) As N
becomes very large, we can approximate the binomial distribution with the
normal distribution, with mean 0.5 and standard deviation that decreases as
1/sqrt(N). In other words, the majority of trials will have equal, or
approximately equal, numbers of 0s and 1s. Observers in these branches will
naturally take the probability to be approximated by the relative frequencies
of 0s and 1s. In other words, they will take the probability of each outcome
to be 0.5.

The problem with this is that you just assume that all branches are equally
probable. You don't make that explicit; it's implicitly assumed, but it's just
an assumption. You are simply doing branch counting.

But it shows why you can't use branch counting.  There's no physical mechanism
for translating the _a_ and _b_ of  _|psi> = a|0> + b|1>_ into numbers of
branches.  To implement that you have to put it in "by hand" that the branches
have weights or numerosity of _a_ and _b_.  This is possible, but it gives the
lie to the MWI mantra of "It's just the Schroedinger equation."



The problem is with giving a physical interpretation to the 
mathematics here. If we take MWI to be QM without collapse, then we 
have not specified anything about branches yet. Different MWI 
advocates have published different ideas about this, and they can't 
all be right. But at heart MWI is just QM without collapse. To proceed 
in a rigorous way, one has to start with what counts as a branch. It 
seems to me that this has to involve the definition of an observer, 
and that requires a theory about what observation is. I.m.o, this has 
to be done by defining an observer as an algorithm, but many people 
think that you need to invoke environmental decoherence. People like 
e.g. Zurek using the latter definition have attempted to derive the 
Born rule based on that idea.


I.m.o., one has to start working out a theory based on rigorous 
definitions and then see where that leads to, instead of arguing based 
on vague, ill defined notions.


"Observer as an algorithm" seems pretty ill defined to me.  Which
algorithm?  applied to what input?  How does the algorithm, a Platonic
construct, interface with the physical universe? Decoherence seems
much better defined.  And so does QBism.

Any human observer is arguably implemented by an algorithm run by a 
brain. So, for any given observer at some time, there exists a precisely 
defined algorithm that defines that observer. In practice we cannot 
provide for any such definition, but from the point of view of the 
theory, it's important to take into account the way an observer should 
be rigorously defined.


Decoherence should be irrelevant to this issue.

Re: The Nature of Contingency: Quantum Physics as Modal Realism

2022-05-03 Thread smitra

On 28-04-2022 02:14, Brent Meeker wrote:

On 4/27/2022 2:00 PM, smitra wrote:


If you agree, and are prepared, with me, to throw out Everett, then we agree,
and there is nothing more to be argued about (at least, until you present some
different complete theory).

I'm open to the idea that QM itself may only be an approximation to
a more fundamental theory. The arguments in favor of no collapse are
strong arguments but you then do get this issue with probability
that you have discussed here. The disagreement with you about this
is that I  don't see it as a fatal inconsistency that would prove
the MWI to be wrong. Probabilities for the different branches do not
have to be equal. But that doesn't mean that this looks to be a
rather unnatural feature of the theory. This suggests that a more
fundamental theory exists from which one could derive quantum
mechanics with its formalism involving amplitudes and the Born rule
as an approximation.


If there are probabilities attached to the branches, then Gleason's theorem
shows that the probabilities must satisfy the Born rule.  So I don't see any
inconsistency in simply saying they are probabilities of measurement results;
that's Copenhagen.  But if they are probabilities of results, that implies that
some things happen and others don't... otherwise what does "probability" mean
and what use is it as an empirical concept?  That brings back the original
problem of CI: where and how is this happening defined?



If there are 3 copies of an observer and 2 experience outcome A and 1 
experiences outcome B then the probability of the observer experiencing 
outcome B is 1/3. Here we should note that the personal identity of an 
observer is determined by all the information in the brain and is 
therefore different from the different outcomes. So, we always have 
(slightly) different observers observing different things, which is not 
all that different from starting with 3 different people of whom 2 
experience outcome A and 1 experiences outcome B.


Saibal



Brent
