On Mon, May 9, 2022 at 12:47 PM Brent Meeker <[email protected]> wrote:

> On 5/8/2022 5:39 PM, Bruce Kellett wrote:
>
> On Mon, May 9, 2022 at 10:32 AM Brent Meeker <[email protected]>
> wrote:
>
>> On 5/8/2022 5:25 PM, Bruce Kellett wrote:
>>
>> On Mon, May 9, 2022 at 10:17 AM Brent Meeker <[email protected]>
>> wrote:
>>
>>>
>>> I don't think that's a problem.  The number of information bits within a
>>> Hubble sphere is something like the area in Planck units, which already
>>> implies the continuum is just a convenient approximation.  If the area is
>>> N then something of order 1/N would be the smallest non-zero probability.
>>> Also there would be a cutoff for the off-diagonal terms of the density
>>> matrix.  Once all the off-diagonal terms are zero it's like a mixed
>>> state, and one could say that one of the diagonal terms has "happened".
>>>
>>
>> As I have pointed out before, a finite number of branches does not work,
>> because after a certain finite number of splits one would run out of
>> branches to partition in anything like the way required by the associated
>> probabilities. One cannot go on adding more branches at that stage without
>> rendering the whole concept meaningless. Keeping things finite has its
>> attractions, but it does not work in this case.
>>
>>
>> I think it depends on how you count splits.  If the number of degrees of
>> freedom within a Hubble volume is finite, then the number of splits
>> doesn't grow exponentially.  They get cut off when their probability
>> becomes too small.
>>
>
> You are back to your notion of a smallest possible probability. That also
> runs into problems if you run a long sequence of events where one outcome
> has a very small probability on each trial. Try tossing a coin N times. The
> probability of a sequence of N heads is 1/2^N. What happens when this gets
> smaller than the smallest allowed probability? Is the next toss somehow
> forbidden to give heads again? You are making the whole notion of
> probability problematic.
>
>
> Yes, I can see a concern.  But my back-of-the-envelope estimate is that
> the Hubble volume has the information content of ~10^96 bits.  So it would
> be very hard experimentally to flip enough coins to test that limit.
> However, it would imply that you couldn't create a pseudo-random number
> generator that could produce random numbers with that many bits.  That
> would raise the question: how would you tell?  The sequence of numbers
> from a good pseudo-random number generator looks random until you test
> high-order correlations.
>

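The coin-toss objection can be put in rough numbers. This is a minimal sketch, assuming Brent's ~10^96-bit estimate and treating 1/10^96 as the hypothetical probability floor; it just asks how long a run of heads already falls below that floor:

```python
import math

# Hypothetical probability floor, ~1/N with N ~ 10^96 bits
# (Brent's back-of-the-envelope Hubble-volume estimate).
p_min = 1e-96

# A run of n heads with a fair coin has probability 2**-n.
# Smallest run length whose probability falls below p_min:
n = math.ceil(-math.log2(p_min))
print(n)  # 319
```

So any particular sequence of a few hundred tosses is already below such a floor, which is the force of the "is the next toss forbidden?" question, even if arranging an experiment sensitive to probabilities of order 10^-96 is another matter.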
I don't think that the limited number of bits of information in the Hubble
volume is much of a concern. I suspect that if the number of branches is
finite, and there is a limit to how small a probability can be, then
everything must be discrete -- space and time along with everything else.
Otherwise you get a Zeno-like effect with radioactive decay: for a
long-lived isotope, the probability of decay in a small time interval can
be made as small as you want by taking a small enough interval. Whether
this is measurable is not really the issue. If there is a lower limit on
probability, then decays become impossible unless time and space are
discrete as well.
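The decay argument can also be sketched numerically. The isotope choice (U-238) and the 10^-96 floor are illustrative assumptions, not numbers from the thread; the point is that continuous time lets the per-interval decay probability fall below any floor, while Planck-scale discreteness bounds it from below:

```python
import math

# Illustrative inputs (assumptions for this sketch):
HALF_LIFE_S = 4.468e9 * 3.156e7   # U-238 half-life, in seconds
PLANCK_TIME_S = 5.39e-44          # Planck time, in seconds
p_min = 1e-96                     # hypothetical probability floor

lam = math.log(2) / HALF_LIFE_S   # decay constant, per second

# With continuous time, P(decay in dt) ~ lam*dt can be pushed below
# p_min by shrinking dt; the required dt is far below the Planck time:
dt_cutoff = p_min / lam

# With time discretized at the Planck scale, the smallest per-step
# decay probability is bounded below, and stays above p_min:
p_smallest = lam * PLANCK_TIME_S

print(dt_cutoff < PLANCK_TIME_S)  # True
print(p_smallest > p_min)         # True
```

On these numbers the sub-floor intervals are all shorter than the Planck time, so discretizing time at that scale is exactly what rescues the probability floor, consistent with the argument above.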

Bruce

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/everything-list/CAFxXSLQmhv_PUROcK6_daPqQf%2BjPbrzqLkisf7sgnm7EjRWPHw%40mail.gmail.com.
