On 04-05-2022 22:24, Brent Meeker wrote:
On 5/4/2022 11:36 AM, smitra wrote:
On 03-05-2022 19:52, Brent Meeker wrote:
On 5/3/2022 5:00 AM, smitra wrote:
On 28-04-2022 07:24, Brent Meeker wrote:
On 4/26/2022 5:32 PM, smitra wrote:

On 27-04-2022 01:37, Bruce Kellett wrote:
On Tue, Apr 26, 2022 at 10:03 AM smitra <smi...@zonnet.nl> wrote:

On 24-04-2022 03:16, Bruce Kellett wrote:

A moment's thought should make it clear to you that this is not possible. If both possibilities are realized, it cannot be the case that one has twice the probability of the other. In the long run, if both are realized they have equal probabilities of 1/2.

The probabilities do not have to be 1/2. Suppose one million people participate in a lottery such that there will be exactly one winner. The probability that one given person will win is then one in a million. Suppose now that we create one million people using a machine and then organize such a lottery. The probability that one given newly created person will win is then also one in a million. The machine can be adjusted to create any set of persons we like: it can create one million identical persons, or almost identical persons, or totally different persons. If we then create one million almost identical persons, the probability is still one in a million. This means that in the limit of identical persons, the probability will be one in a million.

Why would the probability suddenly become 1/2 if the machine is set to create exactly identical persons, while the probability would be one in a million if we create persons that are almost, but not quite, identical?
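
A quick way to check the arithmetic is to simulate such a lottery directly. The sketch below is only an illustration (plain Python, with a smaller, arbitrary population size so it runs quickly); it just confirms that a fixed person wins with relative frequency close to 1/N, and that nothing in the draw depends on how similar the persons are:

    import random

    N = 1_000           # persons created by the machine (the example above uses one million)
    TRIALS = 1_000_000  # number of simulated lotteries

    # Each lottery has exactly one winner, drawn uniformly from the N persons.
    # Whether the persons are identical or different plays no role in the draw.
    wins_for_person_0 = sum(1 for _ in range(TRIALS) if random.randrange(N) == 0)

    print(wins_for_person_0 / TRIALS)  # approximately 1/N = 0.001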

Your lottery example is completely beside the point.

It provides an example of a case where your logic does not apply.

I think you should pay more attention to the mathematics of the binomial distribution. Let me explain it once more: If every outcome is realized on every trial of a binary process, then after the first trial, we have a branch with result 0 and a branch with result 1. After two trials we have four branches, with results 00, 01, 10, and 11; after 3 trials, we have branches registering 000, 001, 011, 010, 100, 101, 110, and 111. Notice that these branches represent all possible binary strings of length 3.

After N trials, there are 2^N distinct branches, representing all possible binary sequences of length N. (This is just like Pascal's triangle.) As N becomes very large, we can approximate the binomial distribution with the normal distribution, with mean 0.5 and standard deviation that decreases as 1/sqrt(N). In other words, the majority of trials will have equal, or approximately equal, numbers of 0s and 1s. Observers in these branches will naturally take the probability to be approximated by the relative frequencies of 0s and 1s. In other words, they will take the probability of each outcome to be 0.5.
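
This concentration of branches around equal counts can be checked numerically. The following is only an illustrative sketch (it uses exact binomial coefficients rather than the normal approximation, and N is an arbitrary choice): it computes the fraction of the 2^N branches whose relative frequency of 1s lies within three standard deviations of 0.5.

    from math import comb, sqrt

    N = 10_000               # number of binary trials (arbitrary choice)
    total_branches = 2 ** N  # all branches, each counted once

    # The relative frequency of 1s has mean 0.5 and standard deviation
    # 1/(2*sqrt(N)); take a window of three standard deviations around 0.5.
    half_width = 3 / (2 * sqrt(N))
    lo = int((0.5 - half_width) * N)
    hi = int((0.5 + half_width) * N)

    near_half = sum(comb(N, k) for k in range(lo, hi + 1))
    print(near_half / total_branches)  # about 0.997: nearly all branches look "fair"

Note that this is exactly branch counting: every branch is given the same weight, which is the assumption questioned below.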

The problem with this is that you just assume that all branches are equally probable. You don't make that explicit; it's implicitly assumed, but it's just an assumption. You are simply doing branch counting.

But it shows why you can't use branch counting.  There's no physical mechanism for translating the _a_ and _b_ of _|psi> = a|0> + b|1>_ into numbers of branches.  To implement that you have to put it in "by hand" that the branches have weights or numerosity of _a_ and _b_.  This is possible, but it gives the lie to the MWI mantra of "It's just the Schroedinger equation."
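
To make the contrast concrete, here is a small sketch (the notation and numbers are mine, not anything from the thread): counting the branches of an N-trial experiment gives a mean relative frequency of 1/2 for outcome 1 no matter what a and b are, while weighting each branch by its Born weight |a|^(2(N-k)) |b|^(2k) puts the mean at |b|^2.

    from math import comb

    # Example amplitudes for |psi> = a|0> + b|1>; only |a|^2 and |b|^2 matter here.
    a2, b2 = 0.9, 0.1   # |a|^2 and |b|^2, chosen arbitrarily; they must sum to 1
    N = 100             # number of repeated trials (arbitrary)

    # Mean relative frequency of outcome 1 over all 2^N branches:
    # (1) counting every branch once, (2) weighting each branch that contains
    #     k ones by its Born weight (a2)^(N-k) * (b2)^k.
    count_mean = sum(k * comb(N, k) for k in range(N + 1)) / (N * 2 ** N)
    born_mean = sum(k * comb(N, k) * a2 ** (N - k) * b2 ** k for k in range(N + 1)) / N

    print(count_mean)  # 0.5, independent of a and b
    print(born_mean)   # 0.1 = |b|^2, the Born-rule value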


Yes, one has to interpret the wavefunction as giving probabilities. That's still better than assuming that the physical state evolves sometimes according to the Schrödinger equation and sometimes by undergoing a nondeterministic collapse, without there being any evidence for such collapses and without even credible theoretical models for it.

Is there any evidence that is NOT from collapse?  How does it get recorded?  Where is it?  A credible theoretical model is one that predicts the observed result...not necessarily one that satisfies your metaphysical prejudices.  You seem to have adopted a Platonist view of physics.  But as Sean Carroll (a proponent of MWI) remarked, "But all human progress has come from studying the shadows on the wall."


A theoretical model cannot be tied to macroscopic concepts that are known to only give an effective description of nature.

But that's not "known".  It's only "known" if you assume the
theoretical model...circular reasoning.


If collapse is not merely effective but a real effect not due to decoherence, then there is as yet no experimental evidence for it.

It's just like concepts in thermodynamics that can be explained in a more fundamental way using statistical physics. No one objects to doing that on the grounds of any practical impossibility of building molecular-scale heat engines.

But the consequences of thermodynamics are confirmed by observation.  MWI puts its consequences where they are, in principle, unobservable.


Real collapse would have clear observational consequences. There is no experimental evidence for collapse. A real collapse would also violate QM, despite being part of the postulates of QM as traditionally formulated, because it would mean that the Schrödinger equation is not universally valid.

Saibal

Brent
