On Fri, 8 Sep 2017 05:48 pm, Gregory Ewing wrote:

> Steve D'Aprano wrote:
>> A harder question is, what if you take a random number from the Integers? How
>> many digits will it have in (say) base 10? I don't have a good answer to
>> that. I think it may be ill-defined.
> 
> I think the answer is that on average it has infinitely many
> digits -- despite every actual integer only having finitely
> many digits!

I don't think that talking about the average integer is meaningful. Assuming we
are talking about the arithmetic mean, we're dealing with a divergent sum whose
value depends on the order in which you do the summation:

0 + 1 + -1 + 2 + -2 + 3 + -3 + ... = 0

0 + 1 + 2 + 3 + 4 + ... + (-1 - 2 - 3 - 4 - ...) = ∞ - ∞, which is undefined.

(Other means have similar problems, they're just harder to write or less
familiar.)
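To make that concrete, here's a quick Python sketch (the function names are
mine) showing that the partial sums of the two arrangements above behave
completely differently:

```python
from itertools import islice

def alternating():
    """Yield 0, 1, -1, 2, -2, 3, -3, ... (the interleaved arrangement)."""
    yield 0
    n = 1
    while True:
        yield n
        yield -n
        n += 1

def positives_first(k):
    """The first k positive integers followed by all their negatives."""
    return list(range(1, k + 1)) + [-i for i in range(1, k + 1)]

# Partial sums of the interleaved arrangement keep returning to zero.
sums, total = [], 0
for x in islice(alternating(), 11):
    total += x
    sums.append(total)
print(sums)    # [0, 1, 0, 2, 0, 3, 0, 4, 0, 5, 0]

# Putting all the positives first lets the partial sums climb as high as
# you like (here to 1 + 2 + ... + 1000 = 500500) before the negatives
# pull them back down to zero.
peak, total = 0, 0
for x in positives_first(1000):
    total += x
    peak = max(peak, total)
print(peak)    # 500500
```

Same terms, wildly different behaviour along the way -- which is exactly why
the "sum of all the integers" isn't a well-defined quantity.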

There's even a (legitimate!) argument to be made that the sum of all positive
integers is -1/12.

http://www.slate.com/blogs/bad_astronomy/2014/01/18/follow_up_the_infinite_series_and_the_mind_blowing_result.html

More here:

https://en.wikipedia.org/wiki/1_%2B_2_%2B_3_%2B_4_%2B_%E2%8B%AF

Not to mention the inconvenient fact that we're dividing by infinity:

(sum of all integers (whatever it is!))/∞

I don't think there is any way to select a random integer with equal
probability (a uniform distribution over a countably infinite set doesn't
exist), but even if we had one, there's no guarantee that the sample means
would converge to the population mean. This is rather like the Cauchy
distribution, where the mean is not defined, and the sample means oscillate
more and more wildly as you sample more values.
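You can watch the Cauchy behaviour happen with the standard library alone.
This sketch (seed and sample counts are arbitrary choices of mine) uses the
inverse-CDF trick -- tan(pi*(U - 1/2)) of a uniform U is standard Cauchy --
and tracks the running mean:

```python
import math
import random

rng = random.Random(12345)

def cauchy():
    # Inverse-CDF sampling: if U is uniform on [0, 1), then
    # tan(pi * (U - 0.5)) follows a standard Cauchy distribution.
    return math.tan(math.pi * (rng.random() - 0.5))

# Track the running mean: every so often an enormous sample lands and
# yanks it around, so it never settles the way a normal mean would.
running, total = [], 0.0
for n in range(1, 200001):
    total += cauchy()
    if n % 40000 == 0:
        running.append(total / n)
print(running)
```

Run it a few times with different seeds: the running means jump around
instead of homing in on any fixed value.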

So I think that any answer that requires talking about the mean or average is
ill-defined. At least unless we include significantly more rigour than I am
capable of.

If we instead say, "pick a random integer between 0 and N", and then let N
increase without limit, we see that the average number of digits also increases
without limit. But that's not the same as saying that the average number of
digits is infinite!
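That's easy to check exactly, no sampling needed. Here's a short Python
function (my naming) that computes the expected base-10 digit count of an
integer drawn uniformly from 0 to N, by counting how many d-digit numbers
there are at each length:

```python
def expected_digits(N):
    """Exact expected number of base-10 digits of an integer drawn
    uniformly from {0, 1, ..., N}."""
    total = 1            # the integer 0 contributes one digit
    d, lo = 1, 1
    while lo <= N:
        hi = min(N, 10 * lo - 1)       # d-digit numbers run from lo to hi
        total += (hi - lo + 1) * d
        lo *= 10
        d += 1
    return total / (N + 1)

# The average creeps up like k - 1/9 for N = 10**k - 1: it grows without
# bound as N does, but it is finite for every N.
for k in (1, 2, 6, 12, 24):
    print(k, expected_digits(10**k - 1))
```

For N = 999999 the answer is about 5.8889 digits, i.e. just shy of 6 -- the
average grows with N, but at no point does it become infinite.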

We *can* say that about choosing a random Real, because the irrationals
outnumber the rationals by so much that any random Real we pick is Almost
Always irrational. And no irrational has a finite expansion in any integer
base (a finite expansion would make it rational). Hence we can argue that
we're almost certain to choose an irrational number, and irrationals have
infinitely many digits in their expansion.

But we can't say the same thing for integers. As you point out, all integers
have a finite number of digits, so we're on shaky ground to say that the
average integer has infinite digits. That implies that the average is bigger
than all the elements making up the average!

Trying to make sense of divergent series is fraught with traps. Many
simple-sounding questions involving divergent series don't have an answer at
all.



-- 
Steve
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, things got worse.

-- 
https://mail.python.org/mailman/listinfo/python-list