Re: [PHP] Slow file download
On Tue, 2008-12-02 at 11:55 -0800, Brian Dunning wrote:

> I'm using a PHP cron job to constantly download files from a remote
> server. Client and server both have abundant unused bandwidth, and the
> sysadmins have already eliminated switches, interface cards, etc. as
> the source of the slowdown. I'm looking at the script to see why file
> downloads are taking so long, and going so much slower than if I were
> to simply download them manually with a browser on the same machine.
> The script says:
>
> $ctx = stream_context_create(array('http' => array('timeout' => 1200))); // 20 minutes per file
> $contents = file_get_contents($full_url, false, $ctx);
> $fp = fopen('D:\\DocShare\\'.$filename, 'wb');
> $bytes_written = fwrite($fp, $contents);
> fclose($fp);
>
> Yes, it's on Windows. Any idea whether my PHP code might be introducing
> a slowdown? The files range from 500 KB to 50 MB. I often launch
> multiple instances of the script, but it doesn't seem to help much.

Instead of using PHP for this, why not have a look at wget for Windows? It's pretty much the standard way to grab files over the Internet from the command line on *nix machines, and if the Windows version is half as versatile as the Linux version, you'll find it has a lot of useful features too, like support for dropped connections, etc.

Ash
www.ashleysheridan.co.uk
Re: [PHP] Slow file download
I'm open to something like that, but we're in the middle of the holiday crunch and can't afford any downtime, so a significant change is out of the question. This is part of a much larger and more involved script, so it would need to be a plug-and-play replacement and also be able to return information to the calling script: bytes written, and success or failure. We're grabbing filenames and credentials out of MySQL, marking each record in progress, attempting the download, and then updating the MySQL record with the results.

On Dec 2, 2008, at 12:04 PM, Ashley Sheridan wrote:

> Instead of using PHP for this, why not have a look at wget for Windows?
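For reference, the MySQL bookkeeping described here might look roughly like the sketch below. The "downloads" table and its column names are hypothetical, invented purely for illustration, and the download step itself is left as a placeholder.

<?php
// Rough sketch of the claim / download / report cycle described above,
// using a hypothetical "downloads" table (id, url, status, bytes).
$db = new mysqli('localhost', 'user', 'pass', 'docshare');

// Claim the next pending file and mark it in progress.
$row = $db->query("SELECT id, url FROM downloads WHERE status = 'pending' LIMIT 1")->fetch_assoc();
$id  = (int)$row['id'];
$db->query("UPDATE downloads SET status = 'in_progress' WHERE id = $id");

// ... attempt the download of $row['url'] here, setting $bytes_written ...
$bytes_written = 0; // placeholder

// Report the outcome back so the next cron run knows what happened.
$status = ($bytes_written > 0) ? 'done' : 'failed';
$stmt   = $db->prepare("UPDATE downloads SET status = ?, bytes = ? WHERE id = ?");
$stmt->bind_param('sii', $status, $bytes_written, $id);
$stmt->execute();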
Re: [PHP] Slow file download
On Tue, 2008-12-02 at 12:14 -0800, Brian Dunning wrote:

> I'm open to something like that, but we're in the middle of the holiday
> crunch and can't afford any downtime, so a significant change is out of
> the question. This is part of a much larger and more involved script,
> so it would need to be a plug-and-play replacement and also be able to
> return information to the calling script: bytes written, and success or
> failure. We're grabbing filenames and credentials out of MySQL, marking
> each record in progress, attempting the download, and then updating the
> MySQL record with the results.
>
> On Dec 2, 2008, at 12:04 PM, Ashley Sheridan wrote:
>> Instead of using PHP for this, why not have a look at wget for Windows?

Well, you could always replace your file_get_contents() call with an exec() call to wget, which can report the HTTP response codes (200 for success, 404 for file not found, etc.) and other information.

Ash
www.ashleysheridan.co.uk
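A minimal sketch of that exec()-to-wget approach is below; the wget.exe path, URL, and destination are placeholder assumptions, and the MySQL bookkeeping is only indicated in comments.

<?php
// Sketch of shelling out to wget instead of file_get_contents().
// The paths and URL below are placeholders; wire in your own values.
$wget     = 'C:\\wget\\wget.exe';              // assumed install location
$full_url = 'http://www.example.com/file.zip'; // placeholder
$dest     = 'D:\\DocShare\\file.zip';          // placeholder

$cmd = sprintf(
    '%s --tries=2 --timeout=1200 --server-response -O %s %s 2>&1',
    escapeshellarg($wget),
    escapeshellarg($dest),
    escapeshellarg($full_url)
);

exec($cmd, $output, $exit_code);

// wget exits 0 on success and non-zero on failure; the HTTP status line
// (e.g. "HTTP/1.1 200 OK") shows up in $output thanks to --server-response.
if ($exit_code === 0) {
    $bytes_written = filesize($dest);
    // mark the MySQL record as done, recording $bytes_written
} else {
    // mark the MySQL record as failed, keeping $output for diagnostics
}

Because exec() blocks until wget exits, the calling script still sees a single synchronous call, which keeps it close to a drop-in replacement for the existing download step.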
Re: [PHP] Slow file download
If the files are LARGE, file_get_contents() is a Bad Idea (tm). You're trying to suck the whole file into RAM at once; when it doesn't fit, the machine swaps and thrashes the bleep out of your RAM/swap space...

Use fopen() and an fread() loop instead, and you'll probably see much better performance.

Also, consider going old school and getting rid of the stream_context stuff. It's new and untested :-) You can set the timeout with ini_set('default_socket_timeout', ...) or even fall back to fsockopen() with a timeout.

Note that those timeouts are for any given packet to arrive (or for the socket to open), not for the whole enchilada to download.
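A minimal sketch of that fopen()/fread() loop, assuming the same $full_url and $filename variables as the original script and the same D:\DocShare destination:

<?php
// Stream the remote file to disk in chunks instead of buffering it in RAM.
$timeout = 1200;                               // seconds per read, not per file
ini_set('default_socket_timeout', $timeout);

$in  = fopen($full_url, 'rb');                 // remote stream (needs allow_url_fopen)
$out = fopen('D:\\DocShare\\' . $filename, 'wb');

$bytes_written = 0;
if ($in && $out) {
    while (!feof($in)) {
        $chunk = fread($in, 8192);             // 8 KB at a time instead of the whole file
        if ($chunk === false || $chunk === '') {
            break;                             // read error or timeout
        }
        $bytes_written += fwrite($out, $chunk);
    }
}
if ($in)  { fclose($in); }
if ($out) { fclose($out); }

Reading in fixed-size chunks keeps memory use flat regardless of file size, and $bytes_written can still be reported back to MySQL exactly as before.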