Re: 1.9.1 large file fetch support
Unfortunately, wget 1.9.1 does not have large file support. This feature has been added recently to wget 1.10, which I am going to release tonight or tomorrow morning at the latest.

Thanks for the prompt response! I checked out the beta from CVS and managed to build it; it works. BTW, it would be nice to add build support for Cygwin. I had to run "make -f Makefile.CVS" and then copy the tree over to a Cygwin installation to build.
Re: Files over 2GB
On Tuesday 07 June 2005 01:13 am, Sübülünk Smürf wrote:
> It can't download a 4 GB file. Over 2 GB (2,147,483,647 bytes) it said:
> "File size limit exceeded"

Large file support has been introduced in wget 1.10, which I am uploading to gnu.org right now (it might take a while to comply with all the requirements of the FSF automated upload procedure, though).

-- 
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi                           http://www.tortonesi.com
University of Ferrara - Dept. of Eng.     http://www.ing.unife.it
Institute for Human & Machine Cognition   http://www.ihmc.us
GNU Wget - HTTP/FTP file retrieval tool   http://www.gnu.org/software/wget
Deep Space 6 - IPv6 for Linux             http://www.deepspace6.net
Ferrara Linux User Group                  http://www.ferrara.linux.it
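The 2,147,483,647-byte ceiling in the quoted report is exactly the maximum value of a 32-bit signed integer, which is presumably the type used for file sizes before large file support arrived in 1.10. A quick Python sketch of the arithmetic (illustrative only, not wget code):

```python
# The "File size limit exceeded" threshold from the report above:
# 2**31 - 1 is the largest value a 32-bit signed integer can hold.
limit = 2**31 - 1
print(limit)            # 2147483647

four_gb = 4 * 1024**3   # the 4 GB file from the report
print(four_gb > limit)  # True: that size cannot fit in 32 signed bits
```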
Re: restrict-file-names mode and colons in URLs
Herb Schilling nasa.gov> writes:
> When I set the restrict-file-names mode to windows, the filenames and
> the links to these files look like this:
>
> photoalbum_photo_view b_start%3Aint=0

That's a bug: the link should have "%25" in place of "%". This is fixed in Wget 1.10, which has just been released but hasn't reached the FTP sites yet. Until then, you can get it from:

ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-rc1.tar.bz2
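The bug above is a double-encoding subtlety: --restrict-file-names=windows rewrites characters that are illegal in Windows filenames, such as ':', as "%3A" in the local filename, so a rewritten link pointing at that file must escape the now-literal '%' as "%25". A hedged Python sketch of the idea (the names and the safe set here are illustrative, not wget's actual code):

```python
from urllib.parse import quote

remote_name = "b_start:int=0"

# Windows forbids ':' in filenames, so the file is stored locally as:
local_name = remote_name.replace(":", "%3A")
print(local_name)   # b_start%3Aint=0

# A rewritten link to that local file must escape the literal '%'
# itself -- the fix applied in Wget 1.10:
link = quote(local_name, safe="=_")
print(link)         # b_start%253Aint=0
```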
Feature request: download failure HTTP documents
I would be happy to see a feature making wget proceed to download and store the document sent by the HTTP server even when the status code isn't 20x. Sometimes, even when an error status is received, the response body contains valuable information. I'm especially interested in getting "500 Internal Server Error" stack traces from Tomcat, but this could also be useful for, e.g., fetching fancy "404 Not Found" pages. This behavior, if implemented, would obviously have to be optional (default: disabled).

-- 
Paweł Sakowski <[EMAIL PROTECTED]>
Never trust a man who can count up to 1023 on his fingers.
Files over 2GB
It can't download a 4 GB file. Over 2 GB (2,147,483,647 bytes) it said: "File size limit exceeded".