The meaning of probability and the origin of the Born rule have been seen as outstanding problems for Everettian quantum theory. Attempts by Carroll and Sebens, and by Zurek, to derive the Born rule have considered probability in terms of the outcome of a single experiment where the result is uncertain. This approach can be seen as misguided, because probabilities cannot be determined from a single trial. In a single trial, any result is possible, in both the single-world and many-worlds cases -- probability is not manifest in single trials.

It is not surprising, therefore, that Carroll and Zurek have concentrated on the basic idea that equal amplitudes have equal probabilities, and have been led to break the original state (with unequal amplitudes) down into a superposition of many parts having equal amplitude, essentially by looking to "borrow" ancillary degrees of freedom from the environment. The number of equal-amplitude components then gives the relative probabilities by a simple process of branch counting. As Carroll and Sebens write:

"This route to the Born rule has a simple physical interpretation. Take the wave function and write it as a sum over orthonormal basis vectors with equal amplitudes for each term in the sum (so that many terms may contribute to a single branch). Then the Born rule is simply a matter of counting -- every term in that sum contributes an equal probability." (arxiv:1405.7907 [gr-qc])

Many questions remain as to the validity of this process, particularly as it involves an implementation of the idea of self-selection: of selecting which branch one finds oneself on. This is an even more dubious process than branch counting, since it harks back to the "many-minds" ideas of Albert and Loewer, which even David Albert now finds to be "bad, silly, tasteless, hopeless, and explicitly dualist."

Simon Saunders, in his article "Chance in the Everett Interpretation" (in "Many Worlds? Everett, Quantum Theory, & Reality", edited by Saunders, Barrett, Kent and Wallace, OUP 2010), points out that probabilities can only be measured (or estimated) in a series of repeated trials, so it is only in sequences of repeated trials on an ensemble of similarly prepared states that we can see how probability emerges. This idea seemed promising, so I came up with the following argument.

If, in classical probability theory, one has a process in which the probability of success in a single Bernoulli trial is p, then the probability of any particular sequence of N independent trials containing M successes is p^M (1-p)^(N-M). Since there are many ways in which one could get M successes in N trials, to get the overall probability of M successes we have to sum over the N!/(M!(N-M)!) ways in which the M successes can be ordered. So the final probability of getting M successes in N independent trials is

      Prob of M successes = p^M (1-p)^(N-M) N!/(M!(N-M)!).

We can find the value of p for which this probability is maximized by differentiating with respect to p and finding the turning point. A simple calculation gives that p = M/N maximizes this probability (equivalently, it maximizes the probability p^M (1-p)^(N-M) of each individual sequence in the above sum, since the combinatorial factor does not depend on p). This is all elementary probability theory of the binomial distribution, and is completely uncontroversial.
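This maximization is easy to check numerically. A minimal Python sketch (the particular values of M, N and the grid resolution are arbitrary choices):

```python
from math import comb

# Probability of exactly M successes in N Bernoulli trials with success
# probability p:  p^M (1-p)^(N-M) N!/(M!(N-M)!)
def binom_prob(p, M, N):
    return comb(N, M) * p**M * (1 - p)**(N - M)

M, N = 3, 10
# Scan p over a fine grid and pick the value that maximizes the probability.
grid = [i / 1000 for i in range(1, 1000)]
p_max = max(grid, key=lambda p: binom_prob(p, M, N))
print(p_max)  # the maximum sits at p = M/N = 0.3
```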

If we now turn our attention to the quantum case, we have a measurement (or sequence of measurements) on a binary quantum state

     |psi> = a|0> + b|1>,

where |0> is to be counted as a "success", |1> represents anything else, or a "fail", and |a|^2 + |b|^2 = 1. In a single measurement, we can get either |0> or |1> (or we get both, on separate branches, in the Everettian case). Over a sequence of N similar trials, we get the set of 2^N sequences of all possible bit strings of length N. (These all exist in separate "worlds" for the Everettian, or simply represent different "possible worlds" (or possible sequences of results) in the single-world case.) This set of bit strings is independent of the coefficients 'a' and 'b' from the original state |psi>, but if we carry the amplitudes of the original superposition through the sequence of results, we find that for every zero in a bit string we get a factor of 'a', and for every one, a factor of 'b'.

Consequently, the amplitude multiplying any sequence of M zeros and (N-M) ones is a^M b^(N-M). Again, differentiating the modulus of this amplitude, |a|^M (1 - |a|^2)^((N-M)/2), with respect to |a| to find the turning point (and the value of |a| that maximizes this amplitude), we find

    |a|^2 = M/N,

where we have taken the modulus of 'a' since a is, in general, a complex number. Again, there will be more than one bit-string with exactly M zeros and (N-M) ones, and summing over these gives the additional factor of N!/M!(N-M)!, as above.
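This bookkeeping can be verified by brute-force enumeration. A short Python sketch, with hypothetical real amplitudes chosen so that |a|^2 = 0.3 (and N = 6, M = 2 as arbitrary choices):

```python
from itertools import product
from math import comb, isclose, sqrt

# Hypothetical amplitudes for |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
a = sqrt(0.3)
b = sqrt(0.7)
N = 6

# Enumerate all 2^N bit strings, carrying a factor of a for each zero
# and a factor of b for each one.
amplitudes = {}
for bits in product([0, 1], repeat=N):
    amp = 1.0
    for bit in bits:
        amp *= a if bit == 0 else b
    amplitudes[bits] = amp

# Every string with exactly M zeros carries amplitude a^M b^(N-M),
# and there are N!/(M!(N-M)!) such strings.
M = 2
strings_M = [s for s in amplitudes if s.count(0) == M]
assert len(strings_M) == comb(N, M)
assert all(isclose(amplitudes[s], a**M * b**(N - M)) for s in strings_M)

# The squared amplitudes over all 2^N strings sum to 1 (normalization).
assert isclose(sum(amp**2 for amp in amplitudes.values()), 1.0)
```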

If we now compare the quantum result over N measurements with the classical probability result for N independent Bernoulli trials, we find that the amplitude for M successes in N trials is maximized when the modulus squared of the original amplitude equals the relative number of successes, M/N, which is the classical probability. Thus, the probability for measuring 'zero' in a single trial is just given by the modulus squared of the amplitude for that component of the original state. This is the Born rule.

Furthermore, if in the quantum case we square the amplitude for the sum over bit-strings with exactly M zeros, we get

    |a|^(2M) (1 - |a|^2)^(N-M) N!/(M!(N-M)!),

which, with the identification |a|^2 = M/N, is just the probability for M successes in N Bernoulli trials with probability for success p = M/N, as above in the standard binomial case.
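With the identification p = |a|^2, these summed squared amplitudes behave exactly as a binomial distribution over the possible counts M. A short Python sketch checking normalization and the expected number of zeros (the values of N and |a|^2 are arbitrary choices):

```python
from math import comb, isclose

# Squared amplitude summed over all bit strings with exactly M zeros,
# for a state with |a|^2 = a_sq:  a_sq^M (1 - a_sq)^(N-M) N!/(M!(N-M)!)
def weight(a_sq, M, N):
    return comb(N, M) * a_sq**M * (1 - a_sq)**(N - M)

N, a_sq = 12, 0.3

# The weights over M = 0..N form a binomial distribution with p = |a|^2:
# they sum to 1,
total = sum(weight(a_sq, M, N) for M in range(N + 1))
assert isclose(total, 1.0)

# and the mean number of zeros is N * |a|^2, the Born-rule expectation.
mean_M = sum(M * weight(a_sq, M, N) for M in range(N + 1))
assert isclose(mean_M, N * a_sq)
```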

Now this result could be viewed as a derivation of the Born rule for the quantum case. Or it might be no more than a demonstration of the consistency of the Born rule, if it is already assumed. I am not sure either way. If nothing else, it demonstrates that the mod-squared amplitude plays the role of the probability in repeated measurements over an ensemble of similarly prepared quantum systems, given that one observes only one of the possible sequences of results, either as a single world or as one's particular 'relative state'.

Note also that this argument is independent of any Everettian considerations -- one can always take a modal interpretation of the bit-strings other than the one actually observed, and see them as corresponding to 'other possible worlds', or sequences of results that could (counterfactually) have been obtained, but weren't. In other words, there is no necessity for the Everettian assumption that all sequences of results obtain in some actual world or other. The modal single-world interpretation has the distinct advantage that it avoids the ugly locution that "low probability sequences certainly exist".

Bruce

--
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to everything-list+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/everything-list/0db5dbce-f8ea-d42d-b3f5-a7d2cd166661%40optusnet.com.au.
