At 11:16 18.03.2003, Kevin Kaiser said:
--------------------[snip]--------------------
>       I have a simple php download script that streams a
> non-web-accessible file
>to the browser using this generally accepted method:
>
>         header("Expires: 0");
>         header("Cache-Control: private");
>         header("Content-Type: application/save");
>         header("Content-Length: ".filesize($total) );
>         header("Content-Disposition: attachment; filename=".$file_Path);
>         header("Content-Transfer-Encoding: binary");
>         $fh = fopen($total, "rb");
>         fpassthru($fh);
>
>       While it works, this is fairly useless in my situation where files
> range
>from 5mb to over 100mb, because due to fpassthru() or readfile() or whatever
>method used to stream the data to the recipient, the 'save as'/'open' dialog
>doesn't open until the entire file has been downloaded. This is very
>impractical since the user has no clue how much has been downloaded / how
>much longer is left, not to mention that on very large files (~75mb) apache
>will actually freeze up and become unresponsive to all users (sending cpu
>usage to 100%) for nearly 10 minutes or more, assumedly because it is still
>reading the file (regardless of whether the 'stop' button has been clicked
>or not).
>
>       With the last 2 lines commented out (fopen() and fpassthru()), the
> save-as
>dialog opens instantly.. is there any way fopen/fpassthru() could be delayed
>until after the user chooses to open or save the file ? How would you guys
>go about handling large file downloads while keeping the files themselves
>non-web-accessible (aka not a direct link/redirector)?
--------------------[snip]-------------------- 

Disclaimer - this is just an idea; I've never dealt with downloading files
that big.

If Apache freezes because the file being read is too big to handle in one
go, I'd suggest trying a chunked approach instead of fpassthru() or a single
readfile() call. The reason is that every system I/O call gives the
operating system a chance to switch process context, allowing cooperative
multitasking.

It would go something like this:

    define('CHUNKSIZE', 8192);
    $fp = fopen($file, 'rb') or die("Cannot open $file"); // 'rb' = binary-safe

    header("Expires: 0");
    header("Cache-Control: private");
    header("Content-Type: application/octet-stream");
    header("Content-Length: " . filesize($file));
    header('Content-Disposition: attachment; filename="' . basename($file) . '"');
    header("Content-Transfer-Encoding: binary");
   // --
    while (!feof($fp)) {
        echo fread($fp, CHUNKSIZE);
        flush();    // push each chunk out to the client right away
    }
    fclose($fp);
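One caveat (an assumption about your setup, not something I can verify from
here): if PHP output buffering is active, the echoed chunks pile up in the
buffer and nothing reaches the browser until the script ends, which would
reproduce the original symptom. A sketch of flushing and disabling any
active buffers before the loop, using PHP's standard output-control
functions:

```php
<?php
// Flush and close every active output buffer so each echoed chunk
// goes straight to the client instead of accumulating in memory.
while (ob_get_level() > 0) {
    ob_end_flush();
}
```

After this, echo plus flush() should deliver data incrementally.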

You may need to experiment with the chunk size to keep up an acceptable
transfer rate. If other processes still see big holdups you might consider
a usleep() every 10th chunk or so, but use your calculator to check how
that would extend the overall transmission time. For example, a 100MB file
in 8KB chunks is 12800 chunks; going to usleep for 50 msec after every
chunk would mean your process sleeps for over ten minutes (!) in total
before the file is delivered, while sleeping only after every 10th chunk
adds about a minute.
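To make that check concrete, here is the arithmetic as a tiny script — the
file size, chunk size, and sleep interval are all just example figures:

```php
<?php
// All figures are assumptions: 100 MB file, 8 KB chunks, 50 msec sleep.
$filesize  = 100 * 1024 * 1024;                 // bytes
$chunksize = 8192;                              // CHUNKSIZE from the sketch above
$chunks    = (int)ceil($filesize / $chunksize); // number of fread() calls

$sleep_every_chunk = $chunks * 0.05;            // seconds asleep, 50ms per chunk
$sleep_every_tenth = ($chunks / 10) * 0.05;     // seconds asleep, every 10th chunk

echo "$chunks chunks; {$sleep_every_chunk}s vs {$sleep_every_tenth}s asleep\n";
```

So throttling every chunk costs 640 seconds of pure sleep; every 10th chunk
only 64 seconds.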


-- 
   >O     Ernest E. Vogelsinger
   (\)    ICQ #13394035
    ^     http://www.vogelsinger.at/



-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php