Re: WGET bug...

2008-07-11 Thread Micah Cowan
HARPREET SAWHNEY wrote: > Hi, > > Thanks for the prompt response. > > I am using > > GNU Wget 1.10.2 > > I tried a few things on your suggestion but the problem remains. > > 1. I exported the cookies file in Internet Explorer and specified > that

Re: WGET bug...

2008-07-11 Thread Micah Cowan
HARPREET SAWHNEY wrote: > Hi, > > I am getting a strange bug when I use wget to download a binary file > from a URL versus when I manually download. > > The attached ZIP file contains two files: > > 05.upc --- manually downloaded > dum.upc--

Re: [fwd] Wget Bug: recursive get from ftp with a port in the url fails

2007-09-17 Thread Micah Cowan
Hrvoje Niksic wrote: > Subject: > Re: Wget Bug: recursive get from ftp with a port in the url fails > From: > baalchina <[EMAIL PROTECTED]> > Date: > Mon, 17 Sep 2007 19:56:20 +0800 > To: > [EMAIL PROTECTED] > > To: > [EMAIL PROTECTED] > > Message-ID

[fwd] Wget Bug: recursive get from ftp with a port in the url fails

2007-09-17 Thread Hrvoje Niksic
--- Begin Message --- Hi,I am using wget 1.10.2 in Windows 2003.And the same problem like Cantara. The file system is NTFS. Well I find my problem is, I wrote the command in schedule tasks like this: wget -N -i D:\virus.update\scripts\kavurl.txt -r -nH -P d:\virus.update\kaspersky well, after "w

Re: wget bug?

2007-07-09 Thread Matthias Vill
Mauro Tortonesi wrote: On Mon, 9 Jul 2007 15:06:52 +1200 [EMAIL PROTECTED] wrote: wget under win2000/win XP I get "No such file or directory" error messages when using the following command line. wget -s --save-headers "http://www.nndc.bnl.gov/ensdf/browseds.jsp?nuc=%1&class=Arc"; %1 = 21

Re: wget bug?

2007-07-09 Thread Mauro Tortonesi
On Mon, 9 Jul 2007 15:06:52 +1200 [EMAIL PROTECTED] wrote: > wget under win2000/win XP > I get "No such file or directory" error messages when using the following > command line. > > wget -s --save-headers > "http://www.nndc.bnl.gov/ensdf/browseds.jsp?nuc=%1&class=Arc"; > > %1 = 212BI > Any ide

wget bug?

2007-07-08 Thread Nikolaus_Hermanspahn
wget under win2000/win XP I get "No such file or directory" error messages when using the following command line. wget -s --save-headers "http://www.nndc.bnl.gov/ensdf/browseds.jsp?nuc=%1&class=Arc"; %1 = 212BI Any ideas? thank you Dr Nikolaus Hermanspahn Advisor (Science) National Radiation L

RE: wget bug

2007-05-24 Thread Tony Lewis
Highlord Ares wrote: > it tries to download web pages named similar to > http://site.com?variable=yes&mode=awesome Since "&" is a reserved character in many command shells, you need to quote the URL on the command line: wget "
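The quoting point above can be demonstrated without any network access; a minimal sketch using the URL from the thread:

```shell
# Unquoted, the shell treats '&' as a command separator: it would run
# "wget http://site.com?variable=yes" in the background and then try to
# execute "mode=awesome" as a command. Quoting delivers the whole query
# string to the program as a single argument.
url='http://site.com?variable=yes&mode=awesome'
printf '%s\n' "$url"
```

The same applies to ';' and '?' in some shells, which is why quoting the entire URL is the safe habit.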

RE: wget bug

2007-05-23 Thread Willener, Pat
EMAIL PROTECTED] On Behalf Of Highlord Ares Sent: Thursday, May 24, 2007 11:41 To: [EMAIL PROTECTED] Subject: wget bug when I run wget on certain sites, it tries to download web pages named similar to http://site.com?variable=yes&mode=awesome. However, wget isn't saving any of these f

wget bug

2007-05-23 Thread Highlord Ares
when I run wget on certain sites, it tries to download web pages named similar to http://site.com?variable=yes&mode=awesome. However, wget isn't saving any of these files, no doubt because of some file naming issue? this problem exists in both the Windows & unix versions. hope this helps

WGet Bug: Local URLs containing colons do not work

2006-12-10 Thread Peter Fletcher
Hi, I am trying to download a Wiki category for off-line browsing, and am using a command-line like this: wget http://wiki/Category:Fish -r -l 1 -k Wiki categories contain colons in their filenames, for example: Category:Fish If I request that wget convert absolute paths to relative links, th

Re: wget bug in finding files after disconnect

2006-11-18 Thread Georg Schulte Althoff
Paul Bickerstaff <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]: > I'm using wget version "GNU Wget 1.10.2 (Red Hat modified)" on a fedora > core5 x86_64 system (standard wget rpm). I'm also using version 1.10.2b > on a WinXP laptop. Both display the same faulty behaviour which I don't > bel

wget bug in finding files after disconnect

2006-11-15 Thread Paul Bickerstaff
I'm using wget version "GNU Wget 1.10.2 (Red Hat modified)" on a fedora core5 x86_64 system (standard wget rpm). I'm also using version 1.10.2b on a WinXP laptop. Both display the same faulty behaviour which I don't believe was present in earlier versions of wget that I've used. When the internet

wget bug

2006-11-01 Thread lord maximus
well this really isn't a bug per se... but whenever you set -q for no output, it still makes a wget log file on the desktop.

Re: new wget bug when doing incremental backup of very large site

2006-10-21 Thread Steven M. Schweda
>From dev: > I checked and the .wgetrc file has "continue=on". Is there any way to > suppress the sending of getting by byte range? I will read through the > email and see if I can gather some more information that may be needed. Remove "continue=on" from ".wgetrc"? Consider: -N, --tim
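A sketch of the suggested `.wgetrc` change, assuming the goal is timestamp-based mirroring rather than resuming partial downloads (`continue` and `timestamping` are the wgetrc equivalents of `-c` and `-N`):

```shell
# ~/.wgetrc -- delete "continue = on" so wget stops sending HTTP Range
# requests for files that already exist locally, and let timestamping
# decide what needs re-fetching instead.
# continue = on      <- remove this line
timestamping = on
```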

Re: new wget bug when doing incremental backup of very large site

2006-10-18 Thread dev
I checked and the .wgetrc file has "continue=on". Is there any way to suppress the sending of getting by byte range? I will read through the email and see if I can gather some more information that may be needed. thanks Steven M. Schweda wrote: 1. It would help to know the wget version

Re: new wget bug when doing incremental backup of very large site

2006-10-15 Thread Steven M. Schweda
1. It would help to know the wget version ("wget -V"). 2. It might help to see some output when you add "-d" to the wget command line. (One existing file should be enough.) It's not immediately clear whose fault the 416 error is. It might also help to know which Web server is running on t

new wget bug when doing incremental backup of very large site

2006-10-15 Thread dev
I was running wget to test mirroring an internal development site, and using large database dumps (binary format) as part of the content to provide me with a large number of binary files for the test. For the test I wanted to see if wget would run and download a quantity of 500K files with 100

Re: [WGET BUG] - Can not retrieve image from cacti

2006-06-19 Thread Steven M. Schweda
>From Thomas GRIMONET: > [...] > File is created but it is empty. That's normal with "-O" if Wget fails for some reason. It might help the diagnosis to see the actual Wget command instead of the code which generates the Wget command. If that doesn't show you anything, then adding "-d" to

[WGET BUG] - Can not retrieve image from cacti

2006-06-19 Thread Thomas GRIMONET
Hello, We are using version 1.10.2 of wget under Ubuntu and Debian. So we have many scripts that get some images from a cacti site. These scripts ran perfectly with version 1.9 of wget but they can not get image with version 1.10.2 of wget. Here you can find an example of our scripts:

Re: Wget Bug: recursive get from ftp with a port in the url fails

2006-04-13 Thread Hrvoje Niksic
"Jesse Cantara" <[EMAIL PROTECTED]> writes: > A quick resolution to the problem is to use the "-nH" command line > argument, so that wget doesn't attempt to create that particular > directory. It appears as if the problem is with the creation of a > directory with a ':' in the name, which I cannot
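A sketch of why `-nH` helps, using the host and port from the thread (credentials illustrative): the top-level directory wget derives from the URL contains a colon, which Windows file systems reject.

```shell
url='ftp://user:pass@ftp.somesite.com:4321/Directory/'
# The host-prefix directory wget would create without -nH:
hostdir=${url#ftp://*@}       # strip scheme and credentials
hostdir=${hostdir%%/*}        # keep only the host:port component
echo "$hostdir"               # ftp.somesite.com:4321 -- ':' is illegal on NTFS
# wget -r -nH "$url"          # the workaround; commented out (needs a live server)
```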

Wget Bug: recursive get from ftp with a port in the url fails

2006-04-12 Thread Jesse Cantara
I've encountered a bug when trying to do a recursive get from an ftp site with a non-standard port defined in the url, such as ftp.somesite.com:1234. An example of the command I am typing is: "wget -r ftp://user:[EMAIL PROTECTED]:4321/Directory/*" Where "Directory" contains multiple subdirectories, a

wget bug: doesn't CWD after ftp failure

2006-03-05 Thread Nate Eldredge
Hi folks, I think I have found a bug in wget where it fails to change the working directory when retrying a failed ftp transaction. This is wget 1.10.2 on FreeBSD-6.0/amd64. I was trying to use wget to get files from a broken ftp server which occasionally sends garbled responses, causing wg

Re: wget BUG: ftp file retrieval

2005-11-26 Thread Steven M. Schweda
From: Hrvoje Niksic > [...] On Unix-like FTP servers, the two methods would > be equivalent. Right. So I resisted temptation, and kept the two-step CWD method in my code for only a VMS FTP server. My hope was that some one would look at the method, say "That's a good idea", and change the "

Re: wget BUG: ftp file retrieval

2005-11-26 Thread Hrvoje Niksic
[EMAIL PROTECTED] (Steven M. Schweda) writes: >> and adding it fixed many problems with FTP servers that log you in >> a non-/ working directory. > > Which of those problems would _not_ be fixed by my two-step CWD for > a relative path? That is: [...] That should work too. On Unix-like FTP serv

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Steven M. Schweda
From: Hrvoje Niksic > Prepending is already there, Yes, it certainly is, which is why I had to disable it in my code for VMS FTP servers. > and adding it fixed many problems with > FTP servers that log you in a non-/ working directory. Which of those problems would _not_ be fixed by my t

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Daniel Stenberg <[EMAIL PROTECTED]> writes: > On Fri, 25 Nov 2005, Steven M. Schweda wrote: > >> Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to >> those paths. > > I agree. What good would prepending do? Prepending is already there, and adding it fixed many problems with FTP

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Daniel Stenberg
On Fri, 25 Nov 2005, Steven M. Schweda wrote: Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to those paths. I agree. What good would prepending do? It will most definitely add problems such as those Steven describes. -- -=- Daniel Stenberg -=- http://daniel.haxx

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Steven M. Schweda
From: Hrvoje Niksic > Also don't [forget to] prepend the necessary [...] $CWD > to those paths. Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to those paths. As you might recall from my changes for VMS FTP servers (if you had ever looked at them), this scheme causes no en

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Hrvoje Niksic <[EMAIL PROTECTED]> writes: > That might work. Also don't prepend the necessary prepending of $CWD > to those paths. Oops, I meant "don't forget to prepend ...".

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Mauro Tortonesi <[EMAIL PROTECTED]> writes: > Hrvoje Niksic wrote: >> Arne Caspari <[EMAIL PROTECTED]> writes: >> >> I believe that CWD is mandated by the FTP specification, but you're >> also right that Wget should try both variants. > > i agree. perhaps when retrieving file A/B/F.X we should try

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Arne Caspari
Thank you all for your very fast response. As a further note: When this error occurs, wget bails out with the following error message: "No such directory foo/bar". I think it should instead be "Could not access foo/bar: Permission denied" or similar in such a situation. /Arne Mauro Tortones

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Mauro Tortonesi
Hrvoje Niksic wrote: Arne Caspari <[EMAIL PROTECTED]> writes: I believe that CWD is mandated by the FTP specification, but you're also right that Wget should try both variants. i agree. perhaps when retrieving file A/B/F.X we should try to use: GET A/B/F.X first, then: CWD A/B GET F.X if t

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Arne Caspari <[EMAIL PROTECTED]> writes: > When called like: > wget user:[EMAIL PROTECTED]/foo/bar/file.tgz > > and foo or bar is a read/execute protected directory while file.tgz is > user-readable, wget fails to retrieve the file because it tries to CWD > into the directory first. > > I think th

wget BUG: ftp file retrieval

2005-11-25 Thread Arne Caspari
Hello, current wget seems to have the following bug in the ftp retrieval code: When called like: wget user:[EMAIL PROTECTED]/foo/bar/file.tgz and foo or bar is a read/execute protected directory while file.tgz is user-readable, wget fails to retrieve the file because it tries to CWD into the

wget bug

2005-10-03 Thread Michael C. Haller
Begin forwarded message: From: [EMAIL PROTECTED] Date: October 4, 2005 4:36:09 AM GMT+02:00 To: [EMAIL PROTECTED] Subject: failure notice Hi. This is the qmail-send program at sunsite.dk. I'm afraid I wasn't able to deliver your message to the following addresses. This is a permanent erro

wget bug when using proxy, https, & digest authentication

2005-07-21 Thread Corey Wright
all patches are against wget 1.10. please cc me on all responses as i am not subscribed to this list. FIRST BUG there is a bug in http.c. when connecting by way of proxy & https, if digest authentication is necessary, then the first connection attempt fails and we go to retry_with_auth. that m

Re: wget bug report

2005-06-24 Thread Hrvoje Niksic
<[EMAIL PROTECTED]> writes: > Sorry for the crosspost, but the wget Web site is a little confusing > on the point of where to send bug reports/patches. Sorry about that. In this case, either address is fine, and we don't mind the crosspost. > After taking a look at it, i implemented the followi

wget bug report

2005-06-12 Thread A.Jones
Sorry for the crosspost, but the wget Web site is a little confusing on the point of where to send bug reports/patches. Just installed wget 1.10 on Friday. Over the weekend, my scripts failed with the following error (once for each wget run): Assertion failed: wget_cookie_jar != NULL, file http

Wget Bug

2005-04-26 Thread Arndt Humpert
Hello, wget, win32 rel. crashes with huge files. regards [EMAIL PROTECTED] ==> Command Line wget -m ftp://f

Re: Wget Bug

2005-04-26 Thread Hrvoje Niksic
Arndt Humpert <[EMAIL PROTECTED]> writes: > wget, win32 rel. crashes with huge files. Thanks for the report. This problem has been fixed in the latest version, available at http://xoomer.virgilio.it/hherold/ .

Re: WGET Bug?

2005-04-04 Thread Hrvoje Niksic
"Nijs, J. de" <[EMAIL PROTECTED]> writes: > # > C:\Grabtest\wget.exe -r --tries=3 http://www.xs4all.nl/~npo/ -o > C:/Grabtest/Results/log > # > --16:23:02-- http://www.x

WGET Bug?

2005-04-04 Thread Nijs, J. de
# C:\Grabtest\wget.exe -r --tries=3 http://www.xs4all.nl/~npo/ -o C:/Grabtest/Results/log # --16:23:02-- http://www.xs4all.nl/%7Enpo

Wget bug

2005-02-02 Thread Vitor Almeida
OS = Solaris 8 Platform = Sparc Test command = /usr/local/bin/wget -r -t0 -m ftp://root:[EMAIL PROTECTED]/usr/openv/var The directory will count some sub-directories and files to synchronize. Example : # ls -la /usr/openv/total 68462drwxr-xr-x 14 root bin 512 set

Re: wget bug: spaces in directories mapped to %20

2005-01-17 Thread Jochen Roderburg
Quoting Tony O'Hagan <[EMAIL PROTECTED]>: > Original path: abc def/xyz pqr.gif > After wget mirroring: abc%20def/xyz pqr.gif (broken link) > > wget --version is GNU Wget 1.8.2 > This was a "well-known error" in the 1.8 versions of wget, which is already corrected in th

wget bug: spaces in directories mapped to %20

2005-01-16 Thread Tony O'Hagan
Recently I used the following wget command under a hosted linux account: $ wget -mirror -o mirror.log The web site contained files and virtual directories that contained spaces in the names. URL encoding translated these spaces to %20. wget correctly URL decoded the file names (creating file na
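The mapping at issue can be reproduced offline. This sketch applies the `%20`-to-space decoding to the path quoted in the reply above; the 1.8 bug was that only part of the path got decoded, producing the mixed `abc%20def/xyz pqr.gif` form.

```shell
encoded='abc%20def/xyz%20pqr.gif'
decoded=${encoded//%20/ }     # URL-decode just the space escapes
echo "$decoded"               # abc def/xyz pqr.gif -- the fully decoded path
```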

wget bug

2005-01-15 Thread Matthew F. Dennis
It seems that wget uses a signed 32 bit value for the content-length in HTTP. I haven't looked at the code, but it appears that this is what is happening. The problem is that when a file larger than about 2GB is downloaded, wget reports negative numbers for its size and quits the download right

wget bug with large files

2004-12-10 Thread Roberto Sebastiano
I got a crash in wget downloading a large iso file (2.4 GB) newdeal:/pub/isos# wget -c ftp://ftp.belnet.be/linux/fedora/linux/core/3/i386/iso/FC3-i386-DVD.iso --09:22:17-- ftp://ftp.belnet.be/linux/fedora/linux/core/3/i386/iso/FC3-i386-DVD.iso => `FC3-i386-DVD.iso' Resolving ftp.belnet

I want to report a wget bug

2004-11-24 Thread jiaming
Hello! I am very pleased to use wget to crawl pages. It is an excellent tool. Recently I found a bug in using wget, although I am not sure whether it's a bug or an incorrect usage. I just want to report it here. When I use wget to mirror or recursively download a web site with the -O option, I m

wget -- bug / feature request (not sure)

2004-09-04 Thread Vlad Kudelin
Hello, Probably I am just too lazy, haven't spent enough time to read the man, and wget can actually do exactly what I want. If so -- I do apologize for taking your time. Otherwise: THANKS for your time!..:-). My problem is: redirects. I am trying to catch them by using, say, netcat ...

Re: wget bug with ftp/passive

2004-08-12 Thread Jeff Connelly
On Wed, 21 Jan 2004 23:07:30 -0800, you wrote: >Hello, >I think I've come across a little bug in wget when using it to get a file >via ftp. > >I did not specify the "passive" option, yet it appears to have been used >anyway Here's a short transcript: Passive FTP can be specified in /etc/wgetrc

wget bug: directory overwrite

2004-04-05 Thread Juhana Sadeharju
Hello. Problem: When downloading all in http://udn.epicgames.com/Technical/MyFirstHUD wget overwrites the downloaded MyFirstHUD file with MyFirstHUD directory (which comes later). GNU Wget 1.9.1 wget -k --proxy=off -e robots=off --passive-ftp -q -r -l 0 -np -U Mozilla $@ Solution: Use of -E o

wget bug report

2004-03-26 Thread Corey Henderson
I sent this message to [EMAIL PROTECTED] as directed in the wget man page, but it bounced and said to try this email address. This bug report is for GNU Wget 1.8.2 tested on both RedHat Linux 7.3 and 9 rpm -q wget wget-1.8.2-9 When I use a wget with the -S to show the http headers, and I use th

wget bug in retrieving large files > 2 gig

2004-03-09 Thread Eduard Boer
Hi, While downloading a file of about 3,234,550,172 bytes with "wget http://foo/foo.mpg"; I get an error: HTTP request sent, awaiting response... 200 OK Length: unspecified [video/mpeg] [ <=> ] -1,060,417,124 13.10
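The negative length is consistent with a signed 32-bit counter wrapping around. This sketch reinterprets the reported file size as a 32-bit signed integer and reproduces the exact figure from the progress output:

```shell
size=3234550172                 # file size from the report, just over 3 GB
# Take the low 32 bits and apply the sign bit, as a 32-bit signed
# counter would:
wrapped=$(( (size & 0xFFFFFFFF) - (((size >> 31) & 1) * 0x100000000) ))
echo "$wrapped"                 # -1060417124, matching the wget output above
```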

Re: wget bug with ftp/passive

2004-01-22 Thread Hrvoje Niksic
don <[EMAIL PROTECTED]> writes: > I did not specify the "passive" option, yet it appears to have been used > anyway Here's a short transcript: > > [EMAIL PROTECTED] sim390]$ wget ftp://musicm.mcgill.ca/sim390/sim390dm.zip > --21:05:21-- ftp://musicm.mcgill.ca/sim390/sim390dm.zip >

wget bug with ftp/passive

2004-01-21 Thread don
Hello, I think I've come across a little bug in wget when using it to get a file via ftp. I did not specify the "passive" option, yet it appears to have been used anyway Here's a short transcript: [EMAIL PROTECTED] sim390]$ wget ftp://musicm.mcgill.ca/sim390/sim390dm.zip --21:05:21-- ftp://m

Re: wget bug

2004-01-12 Thread Hrvoje Niksic
Kairos <[EMAIL PROTECTED]> writes: > $ cat wget.exe.stackdump [...] What were you doing with Wget when it crashed? Which version of Wget are you running? Was it compiled for Cygwin or natively for Windows?

wget bug

2004-01-06 Thread Kairos
$ cat wget.exe.stackdump Exception: STATUS_ACCESS_VIOLATION at eip=77F51BAA eax= ebx= ecx=0700 edx=610CFE18 esi=610CFE08 edi= ebp=0022F7C0 esp=0022F74C program=C:\nonspc\cygwin\bin\wget.exe cs=001B ds=0023 es=0023 fs=0038 gs= ss=0023 Stack trace: Frame Function

Re: Wget Bug

2003-11-10 Thread Hrvoje Niksic
"Kempston" <[EMAIL PROTECTED]> writes: > Yeah, i understand that, but lftp handles it fine even without > specifying any additional option ;) But then lftp is hammering servers when real unauthorized entry occurs, no? > I`m sure you can work something out Well, I'm satisfied with what Wget does

Re: Wget Bug

2003-11-10 Thread Hrvoje Niksic
The problem is that the server replies with "login incorrect", which normally means that authorization has failed and that further retries would be pointless. Other than having a natural language parser built-in, Wget cannot know that the authorization is in fact correct, but that the server happe

Wget Bug

2003-11-10 Thread Kempston
Here is debug output :/FTPD# wget ftp://ftp.dcn-asu.ru/pub/windows/update/winxp/xpsp2-1224.exe -d DEBUG output created by Wget 1.8.1 on linux-gnu. --13:25:55-- ftp://ftp.dcn-asu.ru/pub/windows/upd

Re: dificulty with Debian wget bug 137989 patch

2003-09-30 Thread Hrvoje Niksic
"jayme" <[EMAIL PROTECTED]> writes: [...] Before anything else, note that the patch originally written for 1.8.2 will need change for 1.9. The change is not hard to make, but it's still needed. The patch didn't make it to canonical sources because it assumes `long long', which is not available o

dificulty with Debian wget bug 137989 patch

2003-09-29 Thread jayme
I tried the patch from Debian bug report 137989 and it didn't work. Can anybody explain: 1 - why I have to make two directories for the patch to work: one wget-1.8.2.orig and one wget-1.8.2 ? 2 - why after compilation wget still can't download the file > 2GB ? note : I cut the patch for debian use ( the first d

Re: wget bug

2003-09-26 Thread Hrvoje Niksic
Jack Pavlovsky <[EMAIL PROTECTED]> writes: > It's probably a bug: bug: when downloading wget -mirror > ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is, > but when downloading wget ftp://somehost.org/somepath/3*, wget saves > the files as 3acv14%7Eanivcd.mpg Thanks for the repor

Re: wget bug

2003-09-26 Thread DervishD
Hi Jack :) * Jack Pavlovsky <[EMAIL PROTECTED]> dixit: > It's probably a bug: > bug: when downloading > wget -mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg, > wget saves it as-is, but when downloading > wget ftp://somehost.org/somepath/3*, wget saves the files as > 3acv14%7Eanivcd.

wget bug

2003-09-26 Thread Jack Pavlovsky
It's probably a bug: bug: when downloading wget -mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is, but when downloading wget ftp://somehost.org/somepath/3*, wget saves the files as 3acv14%7Eanivcd.mpg -- The human knowledge belongs to the world

Re: wget bug: mirror doesn't delete files deleted at the source

2003-08-01 Thread Aaron S. Hawley
On Fri, 1 Aug 2003, Mordechai T. Abzug wrote: > I'd like to use wget in mirror mode, but I notice that it doesn't > delete files that have been deleted at the source site. Ie.: > > First run: the source site contains "foo" and "bar", so the mirror now > contains "foo" and "bar". > > Before

wget bug: mirror doesn't delete files deleted at the source

2003-07-31 Thread Mordechai T. Abzug
I'd like to use wget in mirror mode, but I notice that it doesn't delete files that have been deleted at the source site. Ie.: First run: the source site contains "foo" and "bar", so the mirror now contains "foo" and "bar". Before second run: the source site deletes "bar" and replaces it

wget bug?

2003-02-22 Thread Marian Förster
Hello! I use wget to transfer www pages :-) but I found the following bug: if there is a directory, for instance .../anwendungen/CLIC, wget transfers the structure to .../anwendungen/clic, but the links in the www pages stay incorrect; that means the link href="anwendungen/CLIC" is wrong after transfer or sho

Re: wget bug

2002-11-05 Thread Jeremy Hetzler
At 09:20 AM 11/5/2002 -0700, Jing Ping Ye wrote: Dear Sir: I tried to use "wget" to download data from an ftp site but got an error message as follows: > wget ftp://ftp.ngdc.noaa.gov/pub/incoming/RGON/anc_1m.OCT Screen show: -

wget bug

2002-11-05 Thread Jing Ping Ye
Dear Sir: I tried to use "wget" to download data from an ftp site but got an error message as follows: > wget ftp://ftp.ngdc.noaa.gov/pub/incoming/RGON/anc_1m.OCT Screen show: ---

wget bug (?): --page-requisites should supercede robots.txt

2002-09-22 Thread Jamie Flournoy
Using wget 1.8.2: $ wget --page-requisites http://news.com.com ...fails to retrieve most of the files that are required to properly render the HTML document, because they are forbidden by http://news.com.com/robots.txt . I think that use of --page-requisites implies that wget is being used as
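A sketch of the behavior the poster wants, as a `.wgetrc` fragment: wget already supports turning robots processing off explicitly, so the feature request amounts to `--page-requisites` implying it.

```shell
# ~/.wgetrc -- page captures need the inlined images/CSS even when
# robots.txt forbids crawlers:
robots = off
page_requisites = on
```

The same override is available per invocation as `-e robots=off`.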

Wget Bug: Re: not downloading everything with --mirror

2002-08-15 Thread Max Bowsher
Funk Gabor wrote: >> HTTP does not provide a dirlist command, so wget parses html to find >> other files it should download. Note: HTML not XML. I suspect that >> is the problem. > > If wget wouldn't download the rest, I'd say that too. But 1st the dir > gets created, the xml is dloaded (in some o

Wget bug: 32 bit int for "bytes downloaded".

2002-08-04 Thread Rogier Wolff
It seems wget uses a 32 bit integer for the "bytes downloaded": [...] FINISHED --17:11:26-- Downloaded: 1,047,520,341 bytes in 5830 files cave /home/suse8.0# du -s 5230588 . cave /home/suse8.0# As it's a "once per download" variable I'd say it's not that performance critical...

WGET BUG

2002-07-07 Thread Kempston
Downloaded: 150,000,000 bytes in 10 files - Original Message - From: Kempston To: [EMAIL PROTECTED] Sent: Monday, July 08, 2002 12:50 AM Subject: WGET BUG Hi, i have a problem and would really like you to help me. i`m using wget for downloading list of file ur

WGET BUG

2002-07-07 Thread Kempston
Hi, i have a problem and would really like you to help me. i`m using wget for downloading a list of file urls via an http proxy. When the proxy server goes offline, wget doesn`t retry downloading of files. Can you fix that or can you tell me how can i fix that ?

Re: wget bug (overflow)

2002-04-15 Thread Hrvoje Niksic
I'm afraid that downloading files larger than 2G is not supported by Wget at the moment.

wget bug (overflow)

2002-02-26 Thread Vasil Dimov
fbsd1 --- http wget eshop.tar (3.3G) ---> fbsd2 command was: # wget http://kamenica/eshop.tar at the second G i got the following: 2097050K .. .. .. .. .. 431.03 KB/s 2097100K .. .. .. .. ..8.14 MB/s 2097150K

Re: wget bug?!

2002-02-19 Thread TD - Sales International Holland B.V.
On Monday 18 February 2002 17:52, you wrote: That would be great. The prob is that I'm using it to retrieve files mostly on servers that are having too much users. No I don't want to hammer the server but I do want to keep on trying with reasonable intervals until I get the file. I think the

Re: wget bug?!

2002-02-18 Thread Ian Abbott
[The message I'm replying to was sent to <[EMAIL PROTECTED]>. I'm continuing the thread on <[EMAIL PROTECTED]> as there is no bug and I'm turning it into a discussion about features.] On 18 Feb 2002 at 15:14, TD - Sales International Holland B.V. wrote: > I've tried -w 30 > --waitretry=30 > --wa

wget bug?!

2002-02-18 Thread TD - Sales International Holland B.V.
Hey there, I wanna download a file at mustek's ftp site in america. This site has a 20 users limit. Have a look at this: bash-2.05# wget --wait=30 --waitretry=30 -t 0 ftp://128.121.112.104/pub/1200UBXP/Web.EXE --15:10:37-- ftp://128.121.112.104/pub/1200UBXP/Web.EXE => `Web.EXE' Con
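For reference, a `.wgetrc` sketch equivalent to the command above; note that `waitretry` sets the *cap* of a linearly growing pause between retries (1 s, 2 s, ... up to the cap), while `wait` is a fixed pause between successive retrievals.

```shell
# ~/.wgetrc -- keep retrying a busy (20-user-limit) server indefinitely
tries = 0        # unlimited retries
wait = 30        # 30 s between successive retrievals
waitretry = 30   # back off up to 30 s between retries of one file
```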

Re: [Wget]: Bug submission

2001-12-29 Thread Hrvoje Niksic
[ Please mail bug reports to <[EMAIL PROTECTED]>, not to me directly. ] Nuno Ponte <[EMAIL PROTECTED]> writes: > I get a segmentation fault when invoking: > > wget -r > http://java.sun.com/docs/books/performance/1st_edition/html/JPTOC.fm.html > > My Wget version is 1.7-3, the one w

Re: wget bug (?)

2001-11-15 Thread Ian Abbott
On 14 Nov 2001, at 13:20, Bernard, Shawn wrote: > I'm not sure if this is a bug or not, but when I ran this line: > wget -r -l2 http://www.turnerclassicmovies.com/NowPlaying/Index > I get this result: (snip) > `www.turnerclassicmovies.com/Home/Index/0,3436,,00.html' saved [27179] > >

wget bug (?)

2001-11-14 Thread Bernard, Shawn
I'm not sure if this is a bug or not, but when I ran this line: wget -r -l2 http://www.turnerclassicmovies.com/NowPlaying/Index I get this result: === 13:11:02 (5.62 MB/s) - `www.turnerclassicmovies.com/NowPlaying/

wget bug

2001-10-10 Thread Muthu Swamy
HI, When I try to send a page to Nextel mobile using the following command from unix box, "wget http://www.nextel.com/cgi-bin/sendPage.cgi?to01=4157160856%26message=hellothere%26action=send" The wget returns the following message but the page is not reaching the phone. "--15:59:16-- http://www.n

wget bug

2001-10-08 Thread Dmitry . Karpov
Dear sir. When I put the line http://find.infoart.ru/cgi-bin/yhs.pl?hidden=http%3A%2F%2F194.67.26.82&word=FreeBSD into my browser (NN 3), it works correctly. When I put this line to wget, wget changes the line: the argument hidden becomes "http:/194.67.26.82&word", and the argument word is empty. Where am I wrong?

Re: wget bug?

2001-06-14 Thread Jan Prikryl
"Story, Ian" wrote: > > I have been a very happy user of wget for a long time. However, today I > > noticed that some sites, that don't run on port 80, don't work well with > > wget. For instance, when I tell wget to go get http://www.yahoo.com, it > > automatically puts :80 at the end, like th

wget bug?

2001-06-13 Thread Story, Ian
> Hello, > I have been a very happy user of wget for a long time. However, today I > noticed that some sites, that don't run on port 80, don't work well with > wget. For instance, when I tell wget to go get http://www.yahoo.com, it > automatically puts :80 at the end, like this: http://www.yahoo

Re: maybe wget bug

2001-04-23 Thread Hrvoje Niksic
Hack Kampbjørn <[EMAIL PROTECTED]> writes: > You have hit one of Wget "features", it is overzealous in converting > URLs into canonical form. As you have discovered Wget first converts > all encoded characters back to their real value and then encodes all > those that are unsafe sending in URLs.

Re: maybe wget bug

2001-04-18 Thread Hack Kampbjørn
David Christopher Asher wrote: > > Hello, > > I am using wget to invoke a CGI script call, while passing it several > variables. For example: > > wget -O myfile.txt > "http://user:[EMAIL PROTECTED]/myscript.cgi?COLOR=blue&SHAPE=circle" > > where myscript.cgi say, makes an image based on the p

maybe wget bug

2001-04-04 Thread David Christopher Asher
Hello, I am using wget to invoke a CGI script call, while passing it several variables. For example: wget -O myfile.txt "http://user:[EMAIL PROTECTED]/myscript.cgi?COLOR=blue&SHAPE=circle" where myscript.cgi say, makes an image based on the parameters "COLOR" and "SHAPE". The problem I am hav

Re: wget bug - after closing control connection

2001-03-08 Thread csaba . raduly
Which version of wget do you use ? Are you aware that wget 1.6 has been released and 1.7 is in development (and they contain a workaround for the "Lying FTP server syndrome" you are seeing) ? -- Csaba Ráduly, Software Engineer Sophos Anti-Virus email: [EMAIL PROTECTED]

wget bug - after closing control connection

2001-03-08 Thread Cezary Sobaniec
Hello, I've found a (less important) bug in wget. I've been downloading a file from an FTP server and the control connection of the FTP service was closed by the server. After that wget started to print incorrect progress information (beyond 100%). The log follows:

Re: wget bug with following to a new location

2001-01-23 Thread Hack Kampbjørn
Volker Kuhlmann wrote: > > I came across this bug in wget where it gives an error instead of > following, as it should. > > Volker > > > wget --version > GNU Wget 1.5.3 Hmm that's quite old ... > > Copyright (C) 1995, 1996, 1997, 1998 Free Software Foundation, Inc. > This program is distribute

wget bug with following to a new location

2001-01-22 Thread Volker Kuhlmann
I came across this bug in wget where it gives an error instead of following, as it should. Volker > wget --version GNU Wget 1.5.3 Copyright (C) 1995, 1996, 1997, 1998 Free Software Foundation, Inc. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; withou