Hi,

I have a small suggestion to improve the behavior of the readfile()
function.

First, since the function uses the PHP streams layer, PHP sometimes
crashes, especially with big files (>10 MB).

Also, if you output a file with readfile(), the memory consumption is
exactly the same as the size of the file being read.

I use readfile() to deliver files with a typical Content-Disposition
header, but since I have some 'big' files, I've changed the delivery to
this small piece of code:

<?php
$filename = 'megafile.txt';

$fh = fopen($filename, 'rb');
if ($fh !== false) {
    // Stream the file in 8 KB chunks instead of loading it at once,
    // and stop as soon as the client disconnects.
    while (!feof($fh) && !connection_aborted()) {
        echo fread($fh, 8192);
    }
    fclose($fh);
}
?>
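
For reference, the headers I send before that loop look roughly like
this (the filename is only an example):

<?php
$filename = 'megafile.txt';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
header('Content-Length: ' . filesize($filename));

// ... followed by the fread() loop from above.
?>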

On a 2 MB file, the chunked version needs 35 KB of memory (measured
with a php_memory log), while the readfile() version needs 2 MB.
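
If someone wants to reproduce the numbers without a php_memory log, a
crude way is to watch memory_get_usage() inside the loop (it needs PHP
built with --enable-memory-limit; the filename is only an example):

<?php
$fh = fopen('megafile.txt', 'rb');
$peak = 0;
while (!feof($fh) && !connection_aborted()) {
    echo fread($fh, 8192);
    // remember the highest value seen while streaming
    $peak = max($peak, memory_get_usage());
}
fclose($fh);
error_log('peak usage during loop: ' . $peak . ' bytes');
?>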

I think it is also possible to reduce the memory consumption of
readfile() itself, because it is not necessary to read the complete
file first and then output it.

The problem is in the php_stream_passthru function. We need a
sapi_flush(TSRMLS_C) or php_end_ob_buffer(1, 1 TSRMLS_CC) call.
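
Until that is changed in the engine, the same effect can be had from
userland by flushing while streaming; roughly like this (just a sketch:
flush() is the userland counterpart of sapi_flush(), and the filename
is only an example):

<?php
$filename = 'megafile.txt';

// Drop any active output buffers so nothing accumulates in memory.
while (ob_get_level() > 0) {
    ob_end_flush();
}

$fh = fopen($filename, 'rb');
if ($fh !== false) {
    while (!feof($fh) && !connection_aborted()) {
        echo fread($fh, 8192);
        flush();    // pushes the chunk down to the SAPI right away
    }
    fclose($fh);
}
?>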

Not only readfile() is affected; gzpassthru() is as well.
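
The same chunked workaround also works for compressed files as a
replacement for gzpassthru(), e.g. with the zlib functions (again, the
filename is only an example):

<?php
$gz = gzopen('megafile.txt.gz', 'rb');
if ($gz !== false) {
    // gzread() returns the uncompressed data in small chunks.
    while (!gzeof($gz) && !connection_aborted()) {
        echo gzread($gz, 8192);
    }
    gzclose($gz);
}
?>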

Any comments?

Best regards,

Steve

-- 
PHP Development Mailing List <http://www.php.net/>
To unsubscribe, visit: http://www.php.net/unsub.php