Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Curt Zirzow
* Thus wrote Robin Getz:
> Curt Zirzow [EMAIL PROTECTED] wrote:
> >> replaced:
> >>   readfile($name);
> >> with:
> >>   $fp = fopen($name, 'rb');
> >>   fpassthru($fp);
> >
> >The only difference between readfile() and fpassthru() is what parameters 
> >you pass it.
> >
> >Something else is the problem, what version of php are you running?
> 
> I am using php 4.2.2

That's an awfully old version.

> 
> OK - I lied.
> 
> The same problem exists with fpassthru (now that I have let it run a little 
> longer) I now have 5 sleeping httpd processes on my system that are 
> consuming 200Meg each.

Either upgrade to a newer version, or use this function:

  http://php.net/apache-child-terminate
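
Roughly like this (a sketch only; apache_child_terminate() is available just
under the Apache 1.x module SAPI, and the child_terminate ini directive has
to be enabled for it to do anything):

// after the file has been streamed, ask this Apache child to exit once the
// request finishes, so its inflated memory goes back to the OS instead of
// sitting in a sleeping httpd process
if (function_exists('apache_child_terminate')) {
    apache_child_terminate();
}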

Curt
-- 
Quoth the Raven, "Nevermore."

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Robin Getz
OK, I checked things out, and based on some private emails and pointers 
from Francisco M. Marzoa [EMAIL PROTECTED], I have now

replaced:
   readfile($name);
with:
   $fp = fopen($name, 'rb');
   while (!feof($fp)) {
       $buf = fread($fp, 4096);
       echo $buf;
       $bytesSent += strlen($buf);  /* we know how many bytes were sent to the user */
   }
   fclose($fp);

I restarted apache (to free all the memory), and we will see how it goes 
overnight.

-Robin
BTW - output buffering is turned OFF. ob_get_level() returns 0, but both 
readfile() and fpassthru() seem to allocate (and never release) memory the 
size of the file.
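
For reference, a fuller sketch of where this is heading, with the 
surrounding pieces filled in (the header values, the per-chunk flush() and 
the $name variable are assumptions here, not lifted from the running site):

// $name is assumed to already hold the full path of the file to send
header('Content-Disposition: attachment; filename="' . basename($name) . '"');
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($name));

$fp = fopen($name, 'rb');
$bytesSent = 0;
while (!feof($fp)) {
    $buf = fread($fp, 4096);     // 4 KB chunks keep memory use flat
    echo $buf;
    flush();                     // hand each chunk to the web server right away
    $bytesSent += strlen($buf);  // we know how many bytes were sent to the user
}
fclose($fp);

The 4096-byte chunk size is arbitrary; the point is that memory use stays at 
one buffer's worth per request instead of the whole file.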

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Klaus Reimer
Robin Getz wrote:
> The same problem exists with fpassthru (now that I have let it run a 
> little longer) I now have 5 sleeping httpd processes on my system that 
> are consuming 200Meg each.
> Any thoughts?
Ok, so much for the theory. What about the output buffering? Have you 
checked whether you have output buffering enabled? What does ob_get_level() 
return?

If it is activated but you can't find where, then this may help to 
disable it in your script:

while (ob_get_level()) ob_end_flush();
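
Or, if whatever has already been buffered should be thrown away rather than 
sent to the client, the same loop works with ob_end_clean() (a small 
variation, not something from the original post):

while (ob_get_level()) {
    ob_end_clean();   // discard the buffered output instead of sending it
}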
--
Bye, K  (FidoNet: 2:240/2188.18)
[A735 47EC D87B 1F15 C1E9  53D3 AA03 6173 A723 E391]
(Finger [EMAIL PROTECTED] to get public key)




Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Robin Getz
Curt Zirzow [EMAIL PROTECTED] wrote:
> > replaced:
> >   readfile($name);
> > with:
> >   $fp = fopen($name, 'rb');
> >   fpassthru($fp);
>
> The only difference between readfile() and fpassthru() is what parameters 
> you pass it.
>
> Something else is the problem, what version of php are you running?

I am using php 4.2.2

OK - I lied.

The same problem exists with fpassthru (now that I have let it run a little 
longer). I now have 5 sleeping httpd processes on my system that are 
consuming 200Meg each.

Any thoughts? 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Curt Zirzow
* Thus wrote Robin Getz:
> Klaus Reimer [EMAIL PROTECTED] wrote:
> >If this theory is true, you may try fpassthru().
> 
> replaced:
>   readfile($name);
> with:
>   $fp = fopen($name, 'rb');
>   fpassthru($fp);

The only difference between readfile() and fpassthru() is what
parameters you pass it.

Something else is the problem, what version of php are you running?


Curt
-- 
Quoth the Raven, "Nevermore."

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Greg Donald
On Thu, 04 Nov 2004 08:22:18 -0800, Robin Getz
<[EMAIL PROTECTED]> wrote:
> and now I don't lose 250 Meg of memory every time I download a 250Meg
> file. If someone wants to add this to the readfile() php manual - great.

Anyone can post user comments in the manual.  Give it a shot.


-- 
Greg Donald
Zend Certified Engineer
http://gdconsultants.com/
http://destiney.com/

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Robin Getz
Klaus Reimer [EMAIL PROTECTED] wrote:
> If this theory is true, you may try fpassthru().
replaced:
  readfile($name);
with:
  $fp = fopen($name, 'rb');
  fpassthru($fp);
and now I don't lose 250 Meg of memory every time I download a 250Meg 
file. If someone wants to add this to the readfile() php manual - great.

Thanks
Robin 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Klaus Reimer
Robin Getz wrote:
> The issue is that readfile writes it to the output buffer before sending 
> it to the client.
Are you sure you HAVE output buffering? What does ob_get_level() return? 
If it returns 0 then you don't have output buffering.

My theory (and it's only a theory) is that readfile may not be the 
optimal function for this task. Maybe it's implemented to read the whole 
file into memory and then output the data, I don't know.

If this theory is true, you may try fpassthru(). The name of this 
function sounds like the file is just "passed through", which sounds like 
the task you want to perform :-)

The example on php.net for this function matches your task exactly:

<?php
// open the file in a binary mode
$name = ".\public\dev\img\ok.png";
$fp = fopen($name, 'rb');
// send the right headers
header("Content-Type: image/png");
header("Content-Length: " . filesize($name));
// dump the picture and stop the script
fpassthru($fp);
exit;
?>
--
Bye, K  (FidoNet: 2:240/2188.18)
[A735 47EC D87B 1F15 C1E9  53D3 AA03 6173 A723 E391]
(Finger [EMAIL PROTECTED] to get public key)




[PHP] Downloading Large (100M+) Files

2004-11-04 Thread Robin Getz
Hi.
I have searched a few of the mailing lists, and have not found an answer.
I am working on a site that is currently running gforge ( 
http://gforge.org/ ). The process that is used to download files from the 
file repository is something like:

Header('Content-disposition: filename="'.str_replace('"', '', $filename).'"');
Header("Content-type: application/binary");
$length = filesize($sys_upload_dir.$group_name.'/'.$filename);
Header("Content-length: $length");
readfile($sys_upload_dir.$group_name.'/'.$filename);

The issue is that readfile writes the file to the output buffer before 
sending it to the client. When several people try to download large files 
at the same time (the Ant download manager, for instance, tries to download 
things by opening 20 connections), 20 x a single 250 Meg file rips through 
physical memory and swap pretty fast and crashes my machine.

Any thoughts on how to turn output buffering off? I have tried, but have 
not been able to get it working properly.


On a similar note, is there a portable way to determine available system 
memory (physical and swap)? Right now I am using something like:
=
# ensure there is enough free memory for the download
$free = shell_exec('free -b');
# collapse runs of spaces down to single spaces
$i = 0;
while ($i != strlen($free)) {
    $i = strlen($free);
    $free = str_replace('  ', ' ', $free);
}
$free = str_replace("\n", '', $free);
$freeArray = explode(' ', $free);
# free physical memory + free swap
$total_free = $freeArray[9] + $freeArray[18];
==

Calling shell_exec isn't very portable to other systems.
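
One way to at least drop the shell_exec call, though it is still 
Linux-specific rather than genuinely portable, might be to read 
/proc/meminfo directly (a rough sketch, not what the site currently runs):

# Linux-only: parse /proc/meminfo instead of shelling out to 'free'
$total_free = 0;
foreach (file('/proc/meminfo') as $line) {
    # lines look like "MemFree:   123456 kB" and "SwapFree:   789012 kB"
    if (preg_match('/^(MemFree|SwapFree):\s+(\d+)\s+kB/', $line, $m)) {
        $total_free += $m[2] * 1024;   # kB -> bytes
    }
}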
Thanks in advance.
-Robin
 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php