Without knowing exactly what you are trying to do, I get the impression you
might be better off doing this via SQL rather than PHP.  You say:

> ...very large result sets, which are transferred to arrays,
> and that does some extremely heavy lifting in terms of calculations on those
> arrays. By design, it iterates through each possible combination of two
> result sets, and does some calculations on those results.

Why not do the heavy lifting in the DB?  It's very easy to get the Cartesian
product (for each row of set A, combine it with each row of set B) inside
the DB and then apply calculations that either pass a row on to the final
result set or drop it.  Databases are already optimized for this.
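
Roughly, something like this (table and column names are invented, and the
WHERE clause stands in for whatever calculation decides to keep or drop a
pair):

  -- Every combination of a row from set_a with a row from set_b,
  -- keeping only the pairs whose computed score clears a threshold.
  SELECT a.id AS a_id, b.id AS b_id, a.value * b.weight AS score
  FROM set_a AS a
  CROSS JOIN set_b AS b
  WHERE a.value * b.weight > 100;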

It may be that your calculations are too much to do inside the DB, but
remember that the join need not be "primary key = foreign key"; you have
considerably more flexibility here.
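
For example, nothing stops the join condition from being a range test or any
other expression (again, invented names):

  -- Pair each row of set_a with every row of set_b whose event_date
  -- falls inside that set_a row's date range.
  SELECT a.id AS a_id, b.id AS b_id
  FROM set_a AS a
  JOIN set_b AS b
    ON b.event_date BETWEEN a.start_date AND a.end_date;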

Beyond that you can, perhaps, do some or all of the calculations ahead of
time.  The DB has space to record the results, and you can update those
results either continuously (during idle time or the like) or whenever a
record involved in the calculations changes (either recompute on the spot or
simply mark its children invalid and come back to them later).
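
A rough sketch of the "mark its children invalid and come back later" idea,
assuming a MySQL-style multi-table UPDATE and invented table/column names:

  -- Precomputed results for each pair, with a flag for stale rows.
  CREATE TABLE pair_results (
      a_id  INT NOT NULL,
      b_id  INT NOT NULL,
      score FLOAT,
      stale TINYINT NOT NULL DEFAULT 1,  -- 1 = needs recalculating
      PRIMARY KEY (a_id, b_id)
  );

  -- When a row of set_a changes, mark the pairs that depend on it:
  UPDATE pair_results SET stale = 1 WHERE a_id = 42;

  -- Later (during idle time, or just before building a results page),
  -- refresh only the stale pairs:
  UPDATE pair_results pr, set_a a, set_b b
  SET pr.score = a.value * b.weight, pr.stale = 0
  WHERE pr.a_id = a.id AND pr.b_id = b.id AND pr.stale = 1;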

This can be particularly useful when you want to put up a results page
quickly.

Just a thought.

Frank


On 3/4/02 5:49 PM, "[EMAIL PROTECTED]"
<[EMAIL PROTECTED]> wrote:

> From: "Aron Pilhofer" <[EMAIL PROTECTED]>
> Reply-To: "Aron Pilhofer" <[EMAIL PROTECTED]>
> Date: Mon, 4 Mar 2002 11:02:18 -0500
> To: [EMAIL PROTECTED]
> Subject: optimization (another tack)
> 
> Let me try this again more generally. I am trying to optimize a function in
> PHP that handles very large result sets, which are transferred to arrays,
> and that does some extremely heavy lifting in terms of calculations on those
> arrays. By design, it iterates through each possible combination of two
> result sets, and does some calculations on those results. As you can
> imagine, the numbers get quite large, quite fast; sets of 500 by 1000
> necessitate a half-million calculations.
> 
> So, short of rewriting this function in C, which I cannot do, are there any
> suggestions for optimizing? For example:
> 
> 1) is there any advantage to caching an array as a local file?
> 2) the script pumps the results of the calculations into a new table.. would
> it be faster to dump it into a local file instead?
> 3) is there any advantage to executing the script as a CGI? (does that make
> sense? I don't know if I know the correct jargon here...)
> 
> Any other tips folks have for scripts that handle a lot of calculations
> would be greatly appreciated.
> 
> Thanks in advance.
> 


