On 11/18/10 6:53 PM, André Warnier wrote:
>> I'd also like to avoid the last resort which would be to run a long
>> process to process each file, save them to a temporary directory, and
>> then re-read them
>
> Why is that "the last resort"?
> It seems to me to be the logical way of achieving what you want.
> Anyway, when the user is posting the three files (as 3 "file" boxes in a
> form), this is sent by the browser as one long stream

Sorry, maybe I wasn't clear: the client isn't sending the files; they are read locally on the server and served to the client in a single output stream. The client simply provides the FILENAMES in the request; the server has to do all the work, and it already has all the data.

My point was that if the client says "send file1+file2+file3", I want the server to be able to start sending the data for file1 before file2 has even been read from disk. In addition, each file has to pass through the same output filter, and that filter needs to know when it hits the end of an individual file (EOF) as opposed to the end of the whole stream (EOS).
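
To illustrate the EOF-vs-EOS part, here is a minimal sketch of the kind of
streaming output filter I mean (mod_perl 2 streaming filter API; the
My::PassThrough name and the buffer size are just placeholders, not my real
filter):

  package My::PassThrough;

  use strict;
  use warnings;

  use Apache2::Filter ();
  use Apache2::Const -compile => qw(OK);

  use constant BUFSIZE => 8192;

  sub handler {
      my $f = shift;

      # Pass through whatever data is available on this invocation.
      while ($f->read(my $buffer, BUFSIZE)) {
          $f->print($buffer);
      }

      # seen_eos() only becomes true at the end of the request this
      # filter instance is attached to, so in one big concatenated
      # response the filter sees a single EOS and has no idea where
      # file1 ends and file2 begins.
      if ($f->seen_eos) {
          # per-file cleanup would need to go here somehow
      }

      return Apache2::Const::OK;
  }

  1;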

I thought a simple solution would be to use a sub-request per file, so the filter only ever processes one file per stream; but then I have to combine the sub-requests at the top level, and I don't know how to do that. Ideally I want something like this:

 for my $file (@requested_files) {
     my $header = generate_header($file);
     my $footer = generate_footer($file);
     $r->write($header);
     my $subr = $r->lookup_uri("/fetch?name=$file", $myfilter);
     $r->write( ... );    # ...the output of $subr? how?
     $r->write($footer);
 }

It's a little more complicated than that, but hopefully you get the idea...
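
If run() does what I hope it does (sends the sub-request's response body to
the client, through the filter handed to lookup_uri(), as part of the main
response), then the missing $r->write(...) line might just turn into a
run() call, roughly like this (a sketch only; /fetch, $myfilter and the
generate_* helpers are the same placeholders as above, and I'm not sure the
filter chain gets wired up the way I'm assuming):

  use Apache2::SubRequest ();   # lookup_uri() and run()
  use Apache2::RequestIO ();    # $r->write()

  for my $file (@requested_files) {
      $r->write(generate_header($file));

      # run() sends the sub-request's output to the client directly,
      # so there would be nothing to capture and re-write here.
      my $subr = $r->lookup_uri("/fetch?name=$file", $myfilter);
      $subr->run;

      $r->write(generate_footer($file));
  }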


Brian
