On 3/26/07, Issac Goldstand <[EMAIL PROTECTED]> wrote:
If the images are already on the server, and needed for the response
immediately, you'd need to do it inline, but you could still make life
easier on yourself (somewhat) by caching the reduced images to avoid
reprocessing.

I could give more specific advice if you could share a bit more about
what you're trying to accomplish in general.

Issac,

It is beyond simple.  There are a bunch of full-size images (4 MB to 16
MB) on the web server that need to be presented as an index of
thumbnails (210x140) in the browser; when the user clicks on a
thumbnail, they need to get a larger (900x600) image to view.  A page
shows anywhere from 12 to 96 of the indexed images.
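
For what it's worth, the downsizing step itself is the easy part.
Something along these lines is what I had in mind (just a sketch;
Image::Magick is an assumption on my part, and the paths and sizes
below are placeholders):

#!/usr/bin/perl
use strict;
use warnings;
use Image::Magick;

# Downsize a full-size image to the requested geometry and save it.
# The $src/$dst paths and geometry strings are placeholders.
sub downsize {
    my ($src, $dst, $geometry) = @_;   # e.g. '210x140' or '900x600'
    my $img = Image::Magick->new;
    my $x = $img->Read($src);
    die "$x" if "$x";
    # Resize preserves aspect ratio when given a WxH geometry string
    $x = $img->Resize(geometry => $geometry);
    die "$x" if "$x";
    $x = $img->Write($dst);
    die "$x" if "$x";
}

downsize('/images/full/photo1.jpg', '/images/thumbs/photo1.jpg', '210x140');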

Like I said above, I fully understand the importance of caching the
results.  Rather than trying to write the caching code and save the
images myself, I am using a second instance of Apache running as a
reverse proxy to cache the results ;)  Why reinvent the wheel ;)
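
Roughly, the front-end instance looks something like this (a sketch
only; I'm assuming Apache 2.x with mod_proxy and mod_disk_cache, and
the host, port, and cache paths here are made up):

# Front-end Apache: proxy image requests to the back-end instance
# and cache the downsized results on disk.
ProxyPass        /images/ http://127.0.0.1:8080/images/
ProxyPassReverse /images/ http://127.0.0.1:8080/images/

CacheEnable disk /images/
CacheRoot   /var/cache/apache2/proxy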

One thought I had was this:

When the server gets a request for an image that still needs to be
downsized, add it to a queue and simply make the request wait for the
result.  I would actually have to create some kind of pool of worker
processes so that things don't get too backlogged.  I am thinking of
keying the queue on IP address or something so that there is only one
downsizing job per browser.  The only issue there is what happens if
two browsers request the same image; I am sure there is a way that,
once the image is downsized, it could be sent to both browsers, but
this is starting to get really complex.  I am really looking for
simplicity.  Speed is important, but this is only a very small piece
of what I am doing.
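
Actually, thinking about it more, the simplest way to get the
"only downsize once" behavior, without a real queue, might be a
per-image lock file: the first process to grab the lock does the
resize, and anyone else requesting the same image just blocks on the
lock and then serves the finished file.  A sketch (the downsize()
routine is the hypothetical one from above; paths are placeholders):

use Fcntl qw(:flock);

# Return the path to the downsized image, creating it first if needed.
# Concurrent requests for the same image serialize on a per-image lock
# file, so each image is only ever downsized once.
sub get_downsized {
    my ($src, $dst, $geometry) = @_;
    return $dst if -e $dst;                 # already cached on disk

    open my $lock, '>', "$dst.lock" or die "can't open lock: $!";
    flock($lock, LOCK_EX) or die "can't lock: $!";

    # Someone else may have finished the resize while we waited.
    downsize($src, $dst, $geometry) unless -e $dst;

    flock($lock, LOCK_UN);
    close $lock;
    return $dst;
}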

Right now, the worst (and very much extreme) case is 100 viewing
stations (browsers) having access.  Normal is going to be between 4
and 20 viewing stations.

Sam
