On Wed, 25 Dec 2002 06:58:04 +0200 Octavian Rasnita <[EMAIL PROTECTED]> wrote:

> I've tried the following script for downloading a file and it works
> fine if the file size is not too big.
> 
> But if the file is very large, the computer runs out of memory.
> Can you tell me how to do it?

> Here is the part of my script:
> 
> my $ua = LWP::UserAgent -> new(env_proxy => 0,
> timeout => 50,
> keep_alive => 1,
> );
> 
> my $request = HTTP::Request -> new('GET', $location);
> my $response = $ua -> request($request);

A second argument to request() tells the agent where to send the
response content.  It can be either a filename or a callback
subroutine reference.

See perldoc LWP::UserAgent and look for send_request and request.
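
For example, something like this (a rough, untested sketch; the URL and
paths are only placeholders) should stream the body straight to disk
instead of holding it all in memory:

  use strict;
  use warnings;
  use LWP::UserAgent;
  use HTTP::Request;

  my $location = 'http://www.example.com/big_file.zip';   # placeholder URL
  my $save_dir = '/tmp';                                   # placeholder dir
  my $file     = 'big_file.zip';                           # placeholder name

  my $ua = LWP::UserAgent->new(
      env_proxy  => 0,
      timeout    => 50,
      keep_alive => 1,
  );

  my $request = HTTP::Request->new('GET', $location);

  # The second argument is a filename: LWP writes the response body
  # there in chunks rather than storing it in $response->content.
  my $response = $ua->request($request, "$save_dir/$file");

  die "Download failed: " . $response->status_line . "\n"
      unless $response->is_success;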

> my $content = $response -> content;

You can also save roughly half the memory by printing $response->content
directly rather than first copying it into another variable.

> #...
> open (OUT, ">$save_dir/$file") or die "Can't write to $save_dir/$file - $!";
> binmode OUT;
> print OUT $content;
> close OUT;
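
If you prefer to keep your open()/binmode lines, the callback form does the
same job: LWP hands each chunk of data to your subroutine as it arrives, so
only one chunk is in memory at a time. A sketch, reusing $ua, $request,
$save_dir and $file from the example above:

  open (OUT, ">$save_dir/$file") or die "Can't write to $save_dir/$file - $!";
  binmode OUT;

  my $response = $ua->request(
      $request,
      sub {
          my ($chunk, $res, $protocol) = @_;   # called once per chunk received
          print OUT $chunk;
      },
      8192,                                    # optional hint: read 8 KB chunks
  );

  close OUT;
  die "Download failed: " . $response->status_line . "\n"
      unless $response->is_success;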

-- 
Mac :})
** I normally forward private questions to the appropriate mail list. **
Ask Smarter: http://www.tuxedo.org/~esr/faqs/smart-questions.html
Give a hobbit a fish and he eats fish for a day.
Give a hobbit a ring and he eats fish for an age.

