Re: Wget and large files: Solution available

2004-07-21 Thread Leonid Petrov
Maik,

  As you can see from
http://www.mail-archive.com/wget%40sunsite.dk/msg06652.html and other
postings, this problem will be fixed in the next release of the
official version.  If you need to download files larger than 2 GB now,
get the patched, unofficial version of wget from
http://software.lpetrov.net/wget-LFS/
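
If you build it from source, the usual GNU recipe should apply (the
tarball name below is only illustrative; check the download page for
the actual file):

  tar xzf wget-LFS.tar.gz
  cd wget-LFS
  ./configure
  make && make install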

Leonid


Wget and large files: Solution available

2004-07-21 Thread Maik Hassel
Hello everybody,
I am using wget for automated database updates, but my files now far
exceed 2 GB in size. As far as I can tell, wget still has problems
with this file size limit, right? At least the latest version I
downloaded does... Is a fix foreseen in the near future?

Are there any alternative programs to wget that do not suffer from
this limitation?

Please include me personally in all replies, since I am not a
subscriber of this list!

Thanks a lot for any comments
Maik



Re: wget 1.8.2: assertion failed

2004-07-21 Thread Hrvoje Niksic
Eric Domenjoud <[EMAIL PROTECTED]> writes:

> Under Mandrake Linux 9.2, the command
>
>   wget -r -k http://www.cplusplus.com/ref/iostream
>
> terminates with
>
>   get: retr.c:263: calc_rate: Assertion `msecs >= 0' failed.
>   Aborted

This problem has been fixed in 1.9.1.
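
For reference, you can check which version you have installed with:

  wget --version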



Re: Cookies with wget

2004-07-21 Thread Hrvoje Niksic
Rüdiger Cordes <[EMAIL PROTECTED]> writes:

> there is no description how to turn on cookie storing nor how to use
> the command line to tell wget to use two cookies.

Do you have access to the info manual?  It describes the options
`--load-cookies' and `--save-cookies', which handle loading and
saving cookies.
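
For example, something along these lines (the file name and URLs are
just placeholders):

  wget --save-cookies cookies.txt 'http://server/login'
  wget --load-cookies cookies.txt 'http://server/members/page.html'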



Re: wget file size overflow?

2004-07-21 Thread Hrvoje Niksic
That's a bug in all released versions of Wget, sorry.  In the next
release, downloading files larger than 2 GB might become possible.



Re: extremely slow download rate!

2004-07-21 Thread Hrvoje Niksic
"Josy P. Pullockara" <[EMAIL PROTECTED]> writes:

> I use "GNU Wget 1.8.2" for downloading and "Galeon 1.3.8" for web
> browsing on "Mandrake 9.2".
>
> We have 10Mbps line via an http proxy and I used to use wget for robust
> and fast downloading of huge files from ftp sites at almost 150 K/s.
>
> Now I find wget extremely slow: transfer rates of 10 B/s, while
> Galeon is able to download the same file from the same ftp site at
> 40 K/s.

This is atypical.  Normally Wget is as fast as, or slightly faster
than, GUI browsers.

> I do not have any custom .wgetrc file. I have used only the default
> settings, except for setting up the http_proxy and ftp_proxy
> environment variables.
>
> How do I troubleshoot this?

Please check that Galeon is really using the same proxy as Wget.  If
you understand Unix system calls, you can also use `strace' (`truss'
on some systems) to check that Wget is doing what you think it is
doing.
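
For example (assuming Linux's strace; the URL is a placeholder):

  strace -e trace=network wget http://server/file 2>&1 | less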

Also check /usr/local/etc/wgetrc and /etc/wgetrc to see whether a
sneaky sysadmin has installed a bandwidth limit behind your back.
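
A quick way to look for such settings (adjust the paths to your
installation):

  grep -Ei 'limit|rate' /etc/wgetrc /usr/local/etc/wgetrc 2>/dev/null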



Re: parallel fetching

2004-07-21 Thread Hrvoje Niksic
Dan Jacobson <[EMAIL PROTECTED]> writes:

> Phil> How about
> Phil> $ wget URI1 & wget URI2
>
> Mmm, OK, but that gets unwieldy with many URLs. I guess I'm thinking
> of something like
>   $ wget --max-parallel-fetches=11 -i url-list
> (with default=1 meaning sequential, not parallel.)

I suppose forking would not be too hard, but dealing with output from
forked processes might be tricky.  Also, people would expect `-r' to
"parallelize" as well, which would be harder yet.

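Until then, a shell-level approximation for the url-list case is
possible with GNU xargs, if your xargs supports the -P (max
processes) flag:

  xargs -n 1 -P 4 wget < url-list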


Re: parallel fetching

2004-07-21 Thread Hrvoje Niksic
Dan Jacobson <[EMAIL PROTECTED]> writes:

> Maybe add an option so e.g.,
> $ wget --parallel URI1 URI2 ...
> would get them at the same time instead of in turn.

You can always invoke Wget in parallel by using something like `wget
URI1 & wget URI2 &'.  How would a `--parallel' option be different
from that?
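
If you do that, giving each process its own log file keeps the output
readable; the log names here are arbitrary:

  wget -o log1 URI1 & wget -o log2 URI2 & wait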



wget 1.8.2: assertion failed

2004-07-21 Thread Eric Domenjoud
Under Mandrake Linux 9.2, the command

  wget -r -k http://www.cplusplus.com/ref/iostream

terminates with

  get: retr.c:263: calc_rate: Assertion `msecs >= 0' failed.
  Aborted

Regards
Eric Domenjoud