Terry Reedy wrote:

> If I understand correctly, you want to multiply each of m numbers by each 
> of n numbers, giving m*n products.  That is O(m*n) work.  Inserting (and 
> extracting) each of these in a constant-size-m priority queue takes, I 
> believe, O(log(m)) work, for a total of m*n*log(m).  That is faster than 
> O(m*n*log(m*n)) for sorting m*n random numbers.

Probably; I'm not very good with big-O computations. Please forget my 
earlier post and please also ignore the unfortunate subject line. What I 
want is the cartesian product of the lists, but with the paired items 
summed.
Suppose the function to combine the items is

     def f(i, j):
         return i + j

And the list we want to sort would be made like this:

LR = [f(i, j) for i in L for j in R]

That would be as in my original post. However, if the function had 
been:

     def g(i, j):
         return n*i + j

then the resulting cartesian product would sort a lot quicker, 
especially if the two lists we start with, L and R, are sorted. (Here n 
is the length of both lists.)
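
To make that concrete, here is a toy example of my own; it assumes the 
items happen to be non-negative integers smaller than n, so that n*i + j 
orders the pairs first by i and then by j:

     n = 5
     L = R = list(range(n))          # both sorted, items drawn from range(n)

     def g(i, j):
         return n*i + j

     LR = [g(i, j) for i in L for j in R]
     print(LR == sorted(LR))         # True: already in order, nothing to sort

With f(i, j) = i + j instead, the same comprehension comes out unsorted, 
which is what I am trying to avoid sorting from scratch.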

So if changing the way the items are combined influences sorting time, 
is there also a way to optimize the order in which the items are 
generated with later sorting in mind?

I mean optimized for a specific combining function, in this case 
function f.
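
For reference, the priority-queue approach you describe would, I think, 
look roughly like the sketch below (my own attempt, assuming both L and R 
are already sorted ascending and R is non-empty). The heap never holds 
more than len(L) entries, one per "row" of the cartesian product, so each 
extraction costs O(log(len(L))):

     import heapq

     def sorted_sums(L, R):
         # One row per element of L: L[i]+R[0], L[i]+R[1], ...
         # Each row is sorted because R is sorted, so the smallest
         # remaining sum is always at the head of some row.
         heap = [(L[i] + R[0], i, 0) for i in range(len(L))]
         heapq.heapify(heap)
         while heap:
             s, i, j = heapq.heappop(heap)
             yield s
             if j + 1 < len(R):
                 heapq.heappush(heap, (L[i] + R[j + 1], i, j + 1))

     print(list(sorted_sums([1, 4, 9], [2, 3, 10])))
     # [3, 4, 6, 7, 11, 11, 12, 14, 19]

heapq.merge would do much the same thing if each row were wrapped in its 
own generator, but the explicit heap makes the constant-size-m queue from 
the quote above easier to see.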

> I don't know how you would sort by hashing.

Neither do I. I have also learned that hashing is O(1), for 
non-mathematicians at least.

Probably I'm suffering from a mild outbreak of spring. I'm currently 
trying to stop myself from starting to smoke again or from writing 
critical posts about PyPy, if that explains anything.

A.
