On Fri, Dec 04, 2015 at 01:37:06PM -0500, Andrei Alexandrescu via Digitalmars-d wrote:
> On 12/04/2015 11:06 AM, Andrei Alexandrescu wrote:
[...]
> Here's the algebra.
> 
> Terms:
> 
> 1 = O(1)
> log = O(log n)
> plog = O((log n)^k)
> sqrt = O(sqrt n)
> lin = O(n)
> linlog = O(n * log n)
> linplog = O(n * (log n)^k)
> poly = O(n^k)
> exp = O(2^n)
> 
> Ordering:
> 
> Terms above are in increasing order.
> 
> Summation:
> 
> x + y = max(x, y)
> 
> Product:
> 
>        | 1       log      plog     sqrt  lin      linlog   poly  exp
> -------+------------------------------------------------------------
> 1      | 1       log      plog     sqrt  lin      linlog   poly  exp
> log    | log     plog     plog     ?     linlog   ?        poly  exp
> plog   | plog    plog     plog     ?     linplog  linplog  poly  exp
> sqrt   | sqrt    ?        ?        lin   poly     poly     poly  exp
> lin    | lin     linlog   linplog  poly  poly     poly     poly  exp
> linlog | linlog  linplog  linplog  poly  poly     poly     poly  exp
> poly   | poly    poly     poly     poly  poly     poly     poly  exp
> exp    | exp     exp      exp      exp   exp      exp      exp   exp
> 
> I've left a few question marks for the tricky cases. Ideas?
[...]

sqrt really belongs under poly, as far as asymptotic behaviour is
concerned. Basically, O(n^j) * O(n^k) for any real j, k > 0 is
asymptotically equal to O(n^(j+k)). Furthermore, O(O(n^j)^k) =
O(n^(j*k)). So you can treat polynomial complexities as real
exponents, where + on the O(...) terms behaves like max(), * behaves
like + on the exponents, and composition (i.e., raising an O(...)
term to a power) behaves like * on the exponents.
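
A quick sanity check of this mapping:

        O(n^2) + O(n^3) = O(n^3)        (max(2, 3) = 3)
        O(n^2) * O(n^3) = O(n^5)        (2 + 3 = 5)
        O((n^2)^3)      = O(n^6)        (2 * 3 = 6)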

Then exp behaves like an "infinite real", with exp > O(n^k) for all
real k>0. Its functional inverse, log, behaves like an
"infinitesimal": O((log n)^j) is less than O(n^k) for all real
j, k > 0. (Yes, even O(n^(1/1000000)) will eventually outgrow
O(log n).)
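
One way to see that last claim is to compare logarithms:

        log(n^(1/1000000)) = (log n)/1000000   vs.   log log n,

and since (log log n)/(log n) -> 0 as n -> inf, the polynomial side,
however tiny its exponent, eventually dominates.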

The log powers behave like "intermediate infinitesimals", where you have
O((log n)^j) < O((log n)^k) for all positive real j < k.

So these O-terms behave approximately like the real numbers (poly),
extended with infinite quantities (exp and its variants) and
infinitesimal quantities (log and its variants), and they follow the
usual laws of arithmetic. Therefore,

        O(n^k) * O(log n)

can be treated as the equivalent of a real number k + an infinitesimal
L, such that k < (k+L) < k+j for all real j>0. In other words,

        O(n) < O(n * log n) < O(n^(1+k))  for all k>0.

(Yes, even O(n^1.00000000001) will eventually outgrow O(n*log n). O(log
n) behaves like an infinitesimal.)

The nice thing about this is that you can simplify a lot of complicated
O(...) expressions just by applying ordinary algebra, as described
above, in the analogous {reals + infinities + infinitesimals} system.
Ordering relations are preserved (for the most part -- this only breaks
down with pathological cases like O(sin n), which are unlikely ever to
be encountered). Also, function composition between poly and non-poly
complexities is generally non-commutative, and it mostly breaks the
+ = max analogy. (But in real-world algorithms it seems unlikely that
one would ever need to worry about the intricacies of exponential-time
complexities, since such algorithms are generally to be avoided where
possible.)
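
To illustrate those last two caveats: composing exp with n^2 in the
two possible orders gives

        exp(n^2) = e^(n^2)   vs.   (exp(n))^2 = e^(2*n),

and e^(n^2) far outgrows e^(2*n), even though a naive application of
the mapping would assign 2*E to both. Likewise,

        e^(n^2 + n) = e^(n^2) * e^n,

which is not in O(e^(n^2)), so inside an exponential the lower-order
term can no longer be dropped -- i.e. + stops behaving like max there.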

So you can get a (mostly) closed algebra just by mapping poly to the
real numbers, and then adding:

        L = infinitesimal quantity corresponding to O(log n)
        E = infinite quantity corresponding to exp

Various operations inside O(...) are thus mapped:

        + inside O(...) = max(...)
        * inside O(...) = + between mapped quantities
        composition inside O(...) = * between mapped quantities
        O(1) = 0

Example: is O(n^3*log(n)) better than O(n^2*(log(n))^3)?
Answer:
        O(n^3*log(n)) maps to 3 + L, and O(n^2*(log(n))^3) maps to
        2 + L*3. Since L*3 is still infinitesimal, (2 + L*3) is very
        close to 2, whereas (3 + L) lies slightly above 3. Therefore
        O(n^3*log(n)) is definitely faster-growing than
        O(n^2*(log(n))^3) -- so no, it is not the better of the two.
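
To make the bookkeeping concrete, here is a rough sketch of the mapping
in Python (just an illustration, not D and not any proposed API; the
names BigO, expk, poly and logk are made up, and only sums, products
and comparisons of O-terms are modelled -- composition is not):

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class BigO:
            # mapped quantity:  expk*E + poly + logk*L
            expk: float = 0.0   # coefficient of E (exp growth)
            poly: float = 0.0   # real exponent k in O(n^k)
            logk: float = 0.0   # coefficient of L (log powers)

            def _key(self):
                # E dominates the poly part, which dominates the L part.
                return (self.expk, self.poly, self.logk)

            def __lt__(self, other):
                return self._key() < other._key()

            def __mul__(self, other):
                # product of O-terms = + between mapped quantities
                return BigO(self.expk + other.expk,
                            self.poly + other.poly,
                            self.logk + other.logk)

            def __add__(self, other):
                # sum of O-terms = max of the mapped quantities
                return other if self < other else self

        lin, logn = BigO(poly=1), BigO(logk=1)
        a = lin*lin*lin * logn          # O(n^3 * log n)     -> 3 + L
        b = lin*lin * logn*logn*logn    # O(n^2 * (log n)^3) -> 2 + L*3
        print(b < a)                    # True: a grows faster

The comparison is lexicographic precisely because any E part dominates
any poly part, which in turn dominates any L part.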

This algebra leads to various interesting questions like:

- Is there an intermediate complexity that lies between poly and exp?
  Yes: since exp is mapped to the infinite quantity E, it follows that
  E/2 is still an infinite quantity (even though it's strictly less than
  E). E/2 corresponds to E*1/2, which is the composition of exp with
  sqrt, so it follows that O(n^k) < O(e^sqrt(n)) < O(e^n) for all real
  k>0; a quick check of this is sketched after this list. (There are,
  of course, many other possibilities.)

- Similarly, the log powers O((log n)^k) are *always* slower-growing
  than any polynomial complexity, because L*k, being still
  infinitesimal, will always be less than j for all real j>0. So even
  O((log n)^10000) will not outgrow O(n^1.000000001). Multiplying by a
  poly factor preserves this relationship:

        O(n^j * (log n)^k) --> j + L*k < j + m, for all real j,m>0, so
        O(n^j * (log n)^k) < O(n^(j+m)) for all j,m>0.

  Basically you're just displacing the mapped values on both sides by
  the same constant j.
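
A quick check of the first point above, again by comparing logarithms:

        log(e^sqrt(n)) = sqrt(n)   vs.   log(n^k) = k*log(n),

and sqrt(n)/log(n) -> inf as n -> inf, so e^sqrt(n) eventually outgrows
n^k for any fixed k; meanwhile sqrt(n) < n = log(e^n), so it stays
below O(e^n).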


T

-- 
People who are more than casually interested in computers should have at least 
some idea of what the underlying hardware is like. Otherwise the programs they 
write will be pretty weird. -- D. Knuth
