Curt Zirzow wrote:
* Thus wrote Richard Lynch:
> Sebastian wrote:
> > i'm working on a app which output files with readfile() and some headers..
> > i read a comment in the manual that says if your outputting a file
> > php will use the same amount of memory as the size of the file. so,
> > if the file is 100MB php will use 100MB of memory.. is this true?
>
> I don't know if it's STILL true (or ever was) that readfile() would
> suck the whole file into RAM before spitting it out... Seems real
> unlikely, but...


Never was and still isn't.

using either readfile or fpassthru is the best route.

All I know is that I am hosting a GForge site, and if I leave the download.php code as-is, I end up with Apache processes that are 200+ MB (the size of my download files).
http://gforge.org/plugins/scmcvs/cvsweb.php/gforge/www/frs/download.php?rev=1.6;content-type=text%2Fplain;cvsroot=cvsroot%2Fgforge


(which uses readfile)

I have tried fpassthru - same thing.

I have even tried:

$fp = fopen($sys_upload_dir.$group_name.'/'.$filename, 'rb');
while (!feof($fp)) {
    $buff = fread($fp, 4096);
    print $buff;
}
fclose($fp);
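If a chunked loop like that still balloons the process, one thing worth ruling out is output buffering: when PHP (or something in front of it) buffers the whole response, every chunk accumulates server-side no matter how small $buff is. A sketch with explicit flushing, assuming that is the cause; stream_file() and $path are my own illustrative names, not anything from GForge's download.php:

```php
<?php
// Sketch: same chunked read, but flush each chunk so nothing piles up
// in PHP's output buffer while the file is being sent.
function stream_file($path, $chunk = 4096) {
    $fp = fopen($path, 'rb');
    if (!$fp) {
        return false;
    }
    // Turn off any userland output buffering first.
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
    while (!feof($fp)) {
        print fread($fp, $chunk);
        flush();  // push this chunk out to the client before reading more
    }
    return fclose($fp);
}
```

If the process size stays flat with this version, the memory was sitting in an output buffer rather than in the read loop itself.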

and I get the same thing. The only thing that seems to work is:

Header("Location: ".$html_pointer_to_fp);

which lets apache do the downloading.

I would do an apache_child_terminate(), but the function does not seem to be available to me (see my previous question about this).

Any thoughts or suggestions? I am open to trying anything.

My next experiment is:
============================
$buff = '';
while (!feof($fp)) {
    $buff = fread($fp, 4096);
    print $buff;
}
unset($buff);
fclose($fp);
============================
(Note: my first draft used "var $buff;", but var is only valid inside a class definition, so I've replaced it with a plain initialization.) Hopefully that will ensure $buff is only created once, and that its memory is freed once the loop is done.
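One way to check whether the experiment actually helps is to measure memory around the loop with memory_get_usage(). The function exists in PHP 4, though on older 4.x builds it requires PHP compiled with --enable-memory-limit (an assumption worth verifying on your build); stream_and_measure() and $path are my own names, standing in for the real upload path:

```php
<?php
// Sketch: run the chunked loop and report how many bytes of PHP memory
// it gained, to confirm the loop itself is not what grows the process.
// $path stands in for $sys_upload_dir.$group_name.'/'.$filename.
function stream_and_measure($path) {
    $before = memory_get_usage();
    $fp = fopen($path, 'rb');
    while (!feof($fp)) {
        $buff = fread($fp, 4096);
        print $buff;
    }
    fclose($fp);
    unset($buff);
    return memory_get_usage() - $before;  // bytes gained by the loop
}
```

If the returned figure stays near zero even for a 200 MB file, the loop is not the culprit and the growth is happening elsewhere (for example, in an output buffer between PHP and Apache).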


-Robin

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
