On Tue, 2008-12-02 at 11:55 -0800, Brian Dunning wrote:
> I'm using a PHP cron job to constantly download files from a remote
> server. Client and server both have abundant unused bandwidth, and the
> sysads have already eliminated switches, interface cards, etc. as the
> source of the slowdown. I'm looking at the script to see why the
> downloads take so long, and why they run so much slower than a manual
> browser download of the same files on the same machine. The script
> says:
> The script says:
>
> $ctx = stream_context_create(array('http' => array('timeout' => 1200))); // 20 minutes per file
> $contents = file_get_contents($full_url, 0, $ctx);
> $fp = fopen('D:\\DocShare\\'.$filename, "w");
> $bytes_written = fwrite($fp, $contents);
> fclose($fp);
>
> Yes, it's on Windows. Any idea whether my PHP code might be
> introducing a slowdown? The files range from 500K to 50MB. I often
> launch multiple instances of the script but it doesn't seem to help
> much.
>
Instead of using PHP for this, why not have a look at wget for Windows?
It's pretty much the standard command-line tool on *nix machines for
grabbing files over the Internet, and if the Windows version is half as
versatile as the Linux one, you'll find it has a lot of useful features
too, such as resuming dropped connections.
Ash
www.ashleysheridan.co.uk