> Now that I think about it, though, I can lift 1 lb weight 5 feet 
> in 10 seconds, and I can do the same with a 100 lb weight; but
> even though the time is the same, I wouldn't say the problems 
> are "equally difficult".  I wonder if there's an equivalent 
> measure for "computational work".

There is an analogous situation in computational complexity.
The RAM (random access machine) model has memory
registers that can hold integers of arbitrary size and 12
instructions.  It is equivalent to a Turing machine.  
(The RAM model is described in Aho, Hopcroft, and Ullman, 
"The Design and Analysis of Computer Algorithms", 
Addison-Wesley, 1974.)

"Time" for a program is the number of machine instructions
executed.  But if you stop there then you get a biased
measure, equivalent to equating lifting 1lb 5 feet in 10 seconds
and lifting 100 lbs 5 feet in 10 seconds.  Instead, "time"
on a RAM instruction includes a "logarithmic criterion",
basically the length in bits (or length in some finite base)
of the contents of a memory register.  Now you can not
use the arbitrarily large integers to encode information
and get times that are the same as doing 1+2.  You can
still do that kind of encoding but it would come with
a realistic cost.
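
To make the difference concrete, here is a small sketch in Python
(the cost functions are hypothetical, for illustration only; the
exact charging scheme in Aho, Hopcroft, and Ullman differs in its
details) contrasting the uniform and logarithmic cost criteria for
a single addition:

```python
def uniform_cost(a, b):
    # Uniform criterion: every instruction costs 1,
    # no matter how large the operands are.
    return 1

def logarithmic_cost(a, b):
    # Logarithmic criterion: an instruction is charged the
    # number of bits needed to represent its operands.
    return max(a.bit_length(), 1) + max(b.bit_length(), 1)

# Small operands: the two criteria nearly agree.
print(uniform_cost(1, 2))            # 1
print(logarithmic_cost(1, 2))        # 3 (1 bit + 2 bits)

# Huge operands: uniform cost is still 1, but the
# logarithmic cost reflects the real work involved.
print(uniform_cost(2**1000, 2**1000))      # 1
print(logarithmic_cost(2**1000, 2**1000))  # 2002
```

Under the uniform criterion, adding two 1000-bit numbers is "as
cheap" as computing 1+2; the logarithmic criterion removes that
loophole.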

As an aside, sometimes it is easy for even experts to
forget that time complexity should be based on the
length of the input.  Once, I took a seminar course at 
the U of T with Stephen Cook as the instructor. As is
usually the case several other professors sat in.
In one class we were categorizing algorithms according
to their time complexity, and someone put the "school
method" for multiplying two n-by-n matrices into the
O(n^3) category.  Cook pointed out that school method
matrix multiplication should be O(n^1.5) because the
convention is that n is the length of the input.
(The more long-winded argument is that the length of 
the input is O(m) where m=n^2, and the time complexity 
of matrixmult is O(m^1.5).  With a change of variable to 
use n as usual to indicate the length of the input, 
the complexity of matrixmult is O(n^1.5).)
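
The change of variable is easy to check numerically; a minimal
sketch (function name hypothetical):

```python
# School-method multiplication of two n-by-n matrices does about
# n^3 scalar multiplications, while the input is m = n^2 numbers
# per matrix, so the work expressed in the input length is m^1.5.

def school_multiplications(n):
    # One multiplication per (row, column, index) triple.
    return n ** 3

for n in [10, 100, 1000]:
    m = n * n                          # input length (up to a constant)
    work = school_multiplications(n)
    print(n, m, work, m ** 1.5)        # work equals m ** 1.5 exactly
```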

There are other, more practical, implications of this,
and the J interpreter exploits some of these implications.



----- Original Message -----
From: Dan Bron <[EMAIL PROTECTED]>
Date: Saturday, July 12, 2008 7:38
Subject: RE: [Jchat] Tenure
To: 'Chat forum' <[email protected]>

> Roger posted:
> >  http://www.cbi.umn.edu/oh/pdf.phtml?id=317
> 
> 
> This is an interesting article; here's a passage that really 
> highlights how the insights of the past become the common sense of
> the present:
> 
>       Frana: The idea of time being the most important complexity 
>                measure seems rather straightforward to me now 
>                because I've heard it and read it several places, but 
>                it apparently wasn't.
> 
> 
>          Cook: I think time was an important measure. It was Alan 
>                Cobham who was trying to think of some intrinsic 
>                measure like "work," but in fact his theorem was about 
>                the characterization of polynomial time, so that was
>                the thing he talked about: time. Time seemed to be 
>                the most obvious measure of complexity. Certainly space
>                memory was also considered right from the start. 
> 
> Before reading this, it never even occurred to me that there 
> could be other measures of complexity than "time to solve the
> problem".  It was "obvious".
> 
> Now that I think about it, though, I can lift 1 lb weight 5 feet 
> in 10 seconds, and I can do the same with a 100 lb weight; but
> even though the time is the same, I wouldn't say the problems 
> are "equally difficult".  I wonder if there's an equivalent 
> measure for "computational work".
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm