Possible bug when downloading gzipped content

2004-12-30 Thread Christoph Anton Mitterer
Hello. I've tried to download some files from joecartoon.com and perhaps I've found a bug in wget. I did the following: [EMAIL PROTECTED]:~/test$ wget -S http://joecartoon.atomfilms.com/media/swf/0/public/1/joebutton.swf deb --22:57:11-- http://joecartoon.atomfilms.com/media/swf

Re: Possible bug when downloading gzipped content

2004-12-30 Thread Christoph Anton Mitterer
the correct swf-file. So I think that's not a bug of wget. But I urgently suggest that in such a case wget should come up with a very big warning message or something like this ;) Regards, Christoph.

bug

2004-12-26 Thread szulevzs
WGET cannot download the following link: Wget --tries=5 http://extremetracking.com/free-2/scripts/reports/display/edit?server=clogin=flashani I tested it with another downloader and it was working.

Bug: Proxy and Content-Len 0 stall on accept()

2004-12-26 Thread Antonio Zerbinati
Hi, I've experienced this bug while retrieving a 0 byte document via http proxy.. wget localhost/~antonio/ stalls after displaying the string Length: 0 [text/html] (gdb said it was on accept() ) wget --use-proxy off localhost/~antonio/ exits correctly wget is the HEAD version of the CVS repository

bug while handling big files

2004-12-24 Thread Leonid
Hi, Simone, Santa put a patch for you in http://software.lpetrov.net/wget-LFS/ Unwrap carefully and enjoy. Merry Christmas, Leonid 24-DEC-2004 21:02:03

bug while handling big files

2004-12-23 Thread Simone Bastianello
Hello. I was retrieving this iso: ftp://ftp.slackware.no/pub/linux/ISO-images/Slackware/Current-ISO-build/slackware-10.0-DVD.iso I killed wget and then I resumed it with wget -c (file was downloaded for 2285260288 bytes) here's the output: --19:31:47--

wget bug with large files

2004-12-10 Thread Roberto Sebastiano
tracking this bug Thanks, -- Roberto Sebastiano [EMAIL PROTECTED]

-O- and -c bug

2004-12-02 Thread Bob Arctor
hello. recently i tried to download a link directly to a CD by using wget -O- -c http://site/file.iso | cdrecord dev=[device] - everything was fine to the moment when the connection died, wget started downloading the file _from byte 0_ it seems -c doesn't recognize that when one uses -O-

Re: new bug tracking system for GNU wget

2004-12-01 Thread Noèl Köthe
of the development effort behind GNU wget. Great. I will forward reported bugs from the Debian users (http://bugs.debian.org/wget) to your bug system. if i don't find any major problem, i am planning to release wget 1.9.2 with LFS support and a long list of bugfixes before the end of the year

Re: new bug tracking system for GNU wget

2004-12-01 Thread Mauro Tortonesi
. hi noèl, very simple: because i don't have the root password on savannah ;-) BTW, i also find that roundup is a pretty cool bug tracking system. very simple to use and to maintain (bugzilla and RT would have been overkill for wget), extremely flexible (can be used with almost any open source db

RE: new bug tracking system for GNU wget

2004-12-01 Thread David M. Bennett
if i don't find any major problem, i am planning to release wget 1.9.2 with LFS support and a long list of bugfixes before the end of the year. Are you planning to fix session cookies? In the current release version they don't work. In the tip build they nearly work, but I got problems

new bug tracking system for GNU wget

2004-11-30 Thread Mauro Tortonesi
hi to everybody, thanks to the kind hosting provided by the ferrara linux user group, i have finally been able to set up a bug tracking system for GNU wget: http://wget-bugs.ferrara.linux.it it will definitely be an invaluable tool for the coordination of the development effort

not a bug but a useful function I didn't see : the filename extracted from http server headers...

2004-11-27 Thread Anti-MicroSoft Trusts
Hello, On a few websites, the filename of a file is extracted from the http server response headers. I did a bash script to get the real filename from a wget -sS command, cause I didn't find that in the options... I wonder if that option would be useful directly in wget options to obtain the filename

I want to report a wget bug

2004-11-24 Thread jiaming
Hello! I am very pleased to use wget to crawl pages. It is an excellent tool. Recently I found a bug in using wget, although I am not sure whether it's a bug or an incorrect usage. I just want to report it here. When I use wget to mirror or recursively download a web site with the -O option, I

[bug] Absence of newline char after PORT with ftp and --spider

2004-11-10 Thread Adam Wysocki
Hi, When wget 1.9.1, issued with the --spider option, is given an ftp link (or an http link redirecting it to an ftp site), it doesn't put a newline char after doing the PORT command. gophi (not subscribed to list). -- Adam Wysocki * http://www.gophi.apcoh.org/ * GG 1234 * GSM 508878856

on tilde bug

2004-11-01 Thread Juhana Sadeharju
the url with ~ and the url with %7E downloaded files differently! I also added new log outputs and while testing them with the problem sites, surprise, there seemed to be no problems. So, the fact that urls are not downloaded, could be just some code bug in wget. But why this problem appears when

bug regarding recursive retrieval of login/pass protected websites - bad referer

2004-10-31 Thread Martin Vogel
Hi, i'm using wget 1.9.1 and got a problem: when using wget -r -d --referer='http://domain.invalid/login.htm' 'http://user:[EMAIL PROTECTED]://domain.invalid/members/' the first request is proper, but the third and following (second is the one for robots.txt) send an incorrect referer:

Tilde bug again

2004-10-16 Thread Juhana Sadeharju
Hello. Has the ~ / %7E bug always been in wget? When was it added to wget? Who wrote the code? I would like to suggest that the person who made this severe bug should immediately fix it back. It does not make sense that we waste time in trying to fix this bug if the person did not use any moment

bug with -k option and unquoted href params

2004-10-12 Thread Chris Petersen
the -k option seems to ignore unquoted href parameters. Bad form though they are, there are a LOT of pages out there that have these: <a href=foo.html>foo</a> -Chris
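The unquoted-value case the report describes could be tolerated with a small lenient tokenizer: a quoted value runs to the matching quote, an unquoted one to the next whitespace or `>`. A minimal sketch (the helper name `href_value` is made up here; this is not wget's actual html-parse code):

```c
#include <ctype.h>
#include <stddef.h>
#include <string.h>

/* Extract an href attribute value starting at p (just past "href=").
   Handles "quoted", 'quoted' and unquoted values; an unquoted value
   ends at whitespace or '>'.  Copies at most n-1 bytes into out.
   Illustrative sketch only, not wget's parser. */
static void href_value(const char *p, char *out, size_t n)
{
  size_t i = 0;
  if (*p == '"' || *p == '\'')
    {
      char q = *p++;                      /* remember which quote opened */
      while (*p && *p != q && i + 1 < n)
        out[i++] = *p++;
    }
  else
    {
      while (*p && !isspace((unsigned char) *p) && *p != '>' && i + 1 < n)
        out[i++] = *p++;
    }
  out[i] = '\0';
}
```

So the markup from the report, `<a href=foo.html>`, would yield `foo.html` rather than being skipped.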

Report Bug Wget

2004-10-06 Thread christian . faeh
instead of two folders def\ghi. Maybe it's not a bug, but I think it could be a feature to convert this char in next version. with kind regards chris faeh

Bug in -np option

2004-09-16 Thread Heiko Selber
There is a bug in the -np option (don't ascend to the parent directory) of wget 1.9.1: When the URL ends in a slash (/), it works OK, but when the slash is missing, wget apparently doesn't care about the option and happily continues above the parent directory. Compare these two lines (only
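The missing-slash case could be handled by normalizing the URL path before the no-parent comparison: a path not ending in `/` has its last component treated as a file and stripped. A sketch under that assumption (`np_base_dir` is a hypothetical helper, not wget's actual code):

```c
#include <stddef.h>
#include <string.h>

/* Compute the directory prefix that --no-parent should compare against.
   "/a/b/" is already a directory; for "/a/b" the trailing component is
   a file and must be dropped, otherwise the descendant test runs
   against the wrong prefix.  Illustrative sketch only. */
static void np_base_dir(const char *path, char *dir, size_t n)
{
  const char *slash = strrchr(path, '/');
  size_t len = slash ? (size_t) (slash - path) + 1 : 0;  /* keep the '/' */
  if (len >= n)
    len = n - 1;
  memcpy(dir, path, len);
  dir[len] = '\0';
}
```

With this, `/dir/sub` and `/dir/sub/` would both anchor the no-parent check at a directory prefix instead of the raw URL string.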

bug in log_close()

2004-09-15 Thread Nicolas Deves
wget-1.9.1 in file log.c, in function log_close() : - if (logfp) fclose (logfp); - closes the logfile file descriptor even if it is stderr ! i think we should test like this : if (logfp && logfp != stderr)
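The suggested guard can be sketched as a tiny predicate plus a safe close wrapper (function names are made up here for illustration; the report only proposes the `logfp && logfp != stderr` condition):

```c
#include <stdio.h>

/* Close the log stream only when it is a stream we opened ourselves;
   never close stderr (or a NULL stream).  Mirrors the fix proposed in
   the report: if (logfp && logfp != stderr).  Sketch only. */
static int log_should_close(FILE *logfp)
{
  return logfp != NULL && logfp != stderr;
}

static void log_close_safe(FILE *logfp)
{
  if (log_should_close(logfp))
    fclose(logfp);
}
```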

progress reporting bug

2004-09-07 Thread Phillip Nordwall
When using wget on a large file e.g. (wget -c ftp://ftp.tu-chemnitz.de/pub/linux/knoppix-remastered/knoppix-34-dvd-by-iso-top.info.iso) around 2.5gigs I get the following progress bar [ = ] -1,852,766,472 343.26K/s and later [ = ] -1,842,437,408 351.40K/s I noticed

wget -- bug / feature request (not sure)

2004-09-04 Thread Vlad Kudelin
not behave as documented, it's a bug. -- according to man, -- I am taking the liberty to 'file a bug'. (The expected behavior I'm talking about is this: if I use --spider, I expect wget to do nothing after finding the server -- like sending GET to the server and getting HTML back). That's my bug

bug / overflow issue

2004-09-02 Thread Leonid
Patrik, Patch for wget with large file support (2Gb) under Unix can be found at http://software.lpetrov.net/wget-LFS/ Leonid

bug / overflow issue

2004-09-01 Thread Patrik Sjöberg
hi I've found the following bug / issue with wget. due to limitations wget fails on files larger than an unsigned long and displays an incorrect size and also acts incorrectly when trying to download one of these files. //patrik

Re: Bug#261755: Control sequences injection patch

2004-08-23 Thread Jan Minar
') { + /* +* I've spotted wget printing CRLF line terminators +* while communicating with ftp://ftp.debian.org. This +* is a bug: wget should print whatever the platform +* line terminator is (CR on Mac

Re: Bug#261755: Control sequences injection patch

2004-08-22 Thread Jan Minar
tags 261755 +patch thanks On Sun, Aug 22, 2004 at 11:39:07AM +0200, Thomas Hood wrote: The changes contemplated look very invasive. How quickly can this bug be fixed? Here we go: Hacky, non-portable, but pretty slick non-invasive, whatever that means. Now I'm going to check whether

Re: wget bug with ftp/passive

2004-08-12 Thread Jeff Connelly
On Wed, 21 Jan 2004 23:07:30 -0800, you wrote: Hello, I think I've come across a little bug in wget when using it to get a file via ftp. I did not specify the passive option, yet it appears to have been used anyway Here's a short transcript: Passive FTP can be specified in /etc/wgetrc or /usr

[Fwd: Bug#197916: wget: Mutual incompatibility between arguments -k and -O]

2004-08-11 Thread Noèl Köthe
Hello, here is a bugreport: (http://bugs.debian.org/197916) -- Forwarded message -- From: Antoni Bella Perez [EMAIL PROTECTED] To: [EMAIL PROTECTED] Subject: Bug#197916: wget: Mutual incompatibility between arguments -k and -O Date: Wed, 18 Jun 2003 16:49:22 +0200 Package: wget

[Fwd: Bug#182957: wget: manual page doesn't document type of patterns for --rejlist, --acclist]

2004-08-11 Thread Noèl Köthe
Hello, maybe someone can document this (http://bugs.debian.org/182957) in one or two sentences in wget.texi. thx. -- Forwarded message -- From: Daniel B. dsb smart.net ... The wget manual page doesn't document the format of the comma-separated values for the --rejlist and

Re: Bug in wget 1.9.1 documentation

2004-07-12 Thread Hrvoje Niksic
Tristan Miller [EMAIL PROTECTED] writes: There appears to be a bug in the documentation (man page, etc.) for wget 1.9.1. I think this is a bug in the man page generation process.

May bee report of the bug

2004-07-12 Thread Valdas Kondrotas
Hello, I think wget cannot store more than one cookie at a time. Is this a bug? Installed from wget-cvs_1.9.1-20040319_i386.deb Some log entries follow: Best regards, Valdas DEBUG output created by Wget 1.9+cvs-dev on linux-gnu. Created socket 8. Releasing 0x8090868 (new

Bug in wget 1.9.1 documentation

2004-07-11 Thread Tristan Miller
Greetings. There appears to be a bug in the documentation (man page, etc.) for wget 1.9.1. Specifically, the section about the command-line option for proxies ends abruptly: -Y on/off --proxy=on/off Turn proxy support on or off. The proxy is on by default

wget overwriting even with -c (bug?)

2004-06-12 Thread Petr Kadlec
Hi folks! Sometimes I experience very unpleasant behavior of wget (using some not-really-recent CVS version of wget 1.9, under W98SE). I have a partially downloaded file (usually a big one, there is not so big probability of interrupted download of a small file), so I want to finish the

Re: wget overwriting even with -c (bug?)

2004-06-12 Thread Petr Kadlec
Hm, sorry, I have just discovered that it was reported about a week ago (http://www.mail-archive.com/wget%40sunsite.dk/msg06527.html). I really did try to search for some overwrite, etc. in the archive, honestly. :-) But that e-mail does not use the word overwrite at all... Regards,

Re: [BUG] wget 1.9.1 and below can't download =2G file on 32bits system

2004-05-27 Thread Hrvoje Niksic
Yup; 1.9.1 cannot download large files. I hope to fix this by the next release.

Re: Maybe a bug or something else for wget

2004-05-24 Thread Jens Rösner
Hi Ben! Not a bug as far as I can see. Use -A to accept only certain files. Furthermore, the pdf and ppt files are located across various servers, you need to allow wget to parse other servers than the original one by -H and then restrict it to only certain ones by -D. wget -nc -x -r -l2 -p

[BUG] wget 1.9.1 and below can't download =2G file on 32bits system

2004-05-24 Thread Zhu, Yi
Hi, I use wget on a i386 redhat 9 box to download 4G DVD from a ftp site. The process stops at: $ wget -c --proxy=off ftp://redhat.com/pub/fedora/linux/core/2/i386/iso/FC2-i386-DVD.iso --12:47:24-- ftp://redhat.com/pub/fedora/linux/core/2/i386/iso/FC2-i386-DVD.iso =

Maybe a bug or something else for wget

2004-05-23 Thread Gao, Ruidong
Hi, How can I download all pdf and ppt file by the following url with command line of: wget -k -r -l 1 http://devresource.hp.com/drc/topics/utility_comp.jsp I am on windows 2000 server sp4 with latest update. E:\Releasewget -V GNU Wget 1.9.1 Copyright (C) 2003 Free Software Foundation,

Bug report: two spaces between filesize and Month

2004-05-03 Thread Iztok Saje
Hello! I just found a feature in an embedded system (no source) with an ftp server. In the listing, there are two spaces between filesize and month. As a consequence, wget always thinks the size is 0. In procedure ftp_parse_unix_ls it just steps back one blank before cur.size is calculated. My quick hack is
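The robust version of what the report's quick hack aims at is to skip back over the whole run of blanks before the month instead of exactly one. A sketch (the function name `size_before_month` is invented here; this is not wget's actual ftp-ls.c):

```c
#include <ctype.h>
#include <stdlib.h>
#include <string.h>

/* Given a pointer positioned at the month name in a Unix-style listing
   line, find the file size that precedes it.  Skipping back over ALL
   blanks (instead of exactly one, as in the reported bug) tolerates
   servers that emit two or more spaces between size and month.
   Illustrative sketch only. */
static long size_before_month(const char *line, const char *month)
{
  const char *p = month;
  while (p > line && isspace((unsigned char) p[-1]))   /* skip blank run */
    p--;
  while (p > line && !isspace((unsigned char) p[-1]))  /* back over digits */
    p--;
  return strtol(p, NULL, 10);
}
```

For a listing line with two spaces, e.g. `... 4096  May  3 ...`, the size still parses as 4096 instead of 0.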

May not be a Bug more a nice-2-have

2004-04-07 Thread Alexander Joerg Herrmann
Dear Reader, some may not really consider it a Bug so it is maybe more a nice-2-have. When I try to mirror the Internet pages I develop http://www.nachttraum.de http://www.felixfrisch.de wget complains that linux complains that the file name is too long. It is not exactly a Bug as I use cgi

wget bug: directory overwrite

2004-04-05 Thread Juhana Sadeharju
Hello. Problem: When downloading all in http://udn.epicgames.com/Technical/MyFirstHUD wget overwrites the downloaded MyFirstHUD file with MyFirstHUD directory (which comes later). GNU Wget 1.9.1 wget -k --proxy=off -e robots=off --passive-ftp -q -r -l 0 -np -U Mozilla $@ Solution: Use of -E

[BUG?] --include option does not use an exact match for directories

2004-03-28 Thread William Bresler
, something which I cannot control. So, again, I say this is a bug. I see that frontcmp() is also called by (recur.c)download_child_p which is an HTTP function, so any possible patch would probably need to just create a new function in utils.c solely for use in FTP directory matching. It's only

wget bug report

2004-03-26 Thread Corey Henderson
I sent this message to [EMAIL PROTECTED] as directed in the wget man page, but it bounced and said to try this email address. This bug report is for GNU Wget 1.8.2 tested on both RedHat Linux 7.3 and 9 rpm -q wget wget-1.8.2-9 When I use wget with -S to show the http headers, and I use

Bug report

2004-03-24 Thread Juhana Sadeharju
Hello. This is a report on some wget bugs. My wgetdir command looks like the following (wget 1.9.1): wget -k --proxy=off -e robots=off --passive-ftp -q -r -l 0 -np -U Mozilla $@ Bugs: Command: wgetdir "http://www.directfb.org". Problem: In file www.directfb.org/index.html the hrefs of type

Re: Bug report

2004-03-24 Thread Hrvoje Niksic
Juhana Sadeharju [EMAIL PROTECTED] writes: Command: wgetdir "http://liarliar.sourceforge.net". Problem: Files are named as content.php?content.2 content.php?content.3 content.php?content.4 which are interpreted, e.g., by Nautilus as manual pages and are displayed as plain texts. Could

wget bug in retrieving large files 2 gig

2004-03-09 Thread Eduard Boer
Hi, While downloading a file of about 3,234,550,172 bytes with wget "http://foo/foo.mpg" I get an error: HTTP request sent, awaiting response... 200 OK Length: unspecified [video/mpeg] [ = ] -1,060,417,124 13.10M/s

Re: Bug in wget: cannot request urls with double-slash in the query string

2004-03-05 Thread Hrvoje Niksic
; the bug only happens when the url is passed in with: cat <<EOF | wget -i - http://... EOF But I cannot repeat that, either. As long as the consecutive slashes are in the query string, they're not stripped. Using this method is necessary since it is the ONLY secure way I know of to do

bug in use index.html

2004-03-04 Thread Василевский Сергей
Good day! I use wget 1.9.1. By default wget converts all links to the root of a site / or somedomain.com/ into /index.html or somedomain.com/index.html. But some sites don't use index.html as the default page, and if you use timestamping and continue downloading a site in more than 1 session 1. wget first downloads index.html

Re: bug in use index.html

2004-03-04 Thread Hrvoje Niksic
The whole matter of conversion of / to /index.html on the file system is a hack. But I really don't know how to better represent empty trailing file name on the file system.

Re: bug in use index.html

2004-03-04 Thread Dražen Kačar
Hrvoje Niksic wrote: The whole matter of conversion of / to /index.html on the file system is a hack. But I really don't know how to better represent empty trailing file name on the file system. Another, for now rather limited, hack: on file systems which support some sort of file attributes

Re: Bug in wget: cannot request urls with double-slash in the query string

2004-03-04 Thread D Richard Felker III
. Then I remembered that I was using -i. Wget seems to work fine with the url on the command line; the bug only happens when the url is passed in with: cat <<EOF | wget -i - http://... EOF Using this method is necessary since it is the ONLY secure way I know of to do a password-protected http request

Re: Bug in wget: cannot request urls with double-slash in the query string

2004-03-01 Thread Hrvoje Niksic
D Richard Felker III [EMAIL PROTECTED] writes: The following code in url.c makes it impossible to request urls that contain multiple slashes in a row in their query string: [...] That code is removed in CVS, so multiple slashes now work correctly. Think of something like

Re: Bug in wget: cannot request urls with double-slash in the query string

2004-03-01 Thread D Richard Felker III
On Mon, Mar 01, 2004 at 03:36:55PM +0100, Hrvoje Niksic wrote: D Richard Felker III [EMAIL PROTECTED] writes: The following code in url.c makes it impossible to request urls that contain multiple slashes in a row in their query string: [...] That code is removed in CVS, so multiple

Re: Bug in wget: cannot request urls with double-slash in the query string

2004-03-01 Thread Hrvoje Niksic
D Richard Felker III [EMAIL PROTECTED] writes: Think of something like http://foo/bar/redirect.cgi?http://... wget translates this into: [...] Which version of Wget are you using? I think even Wget 1.8.2 didn't collapse multiple slashes in query strings, only in paths. I was using

Bug in wget: cannot request urls with double-slash in the query string

2004-02-29 Thread D Richard Felker III
The following code in url.c makes it impossible to request urls that contain multiple slashes in a row in their query string: else if (*h == '/') { /* Ignore empty path elements. Supporting them well is hard (where do you save "http://x.com///y.html"?), and
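The fix the thread converges on (collapse duplicate slashes in the path, but leave the query string alone) could look like this in-place filter. A sketch only; `collapse_path_slashes` is a hypothetical helper, not the actual url.c code:

```c
#include <string.h>

/* Collapse runs of '/' in the path portion of an URL suffix, but leave
   the query string (everything from '?' on) untouched -- matching the
   behaviour the CVS fix settled on.  Rewrites s in place.  Sketch only. */
static void collapse_path_slashes(char *s)
{
  char *w = s;
  int in_query = 0;
  for (char *r = s; *r; r++)
    {
      if (*r == '?')
        in_query = 1;
      if (!in_query && *r == '/' && w > s && w[-1] == '/')
        continue;               /* drop duplicate slash in the path */
      *w++ = *r;
    }
  *w = '\0';
}
```

So a request like `redirect.cgi?http://...` keeps its double slash, while `/a//b/` in the path still normalizes to `/a/b/`.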

Re: bug in connect.c

2004-02-06 Thread Manfred Schwarb
Interesting. Is it really necessary to zero out sockaddr/sockaddr_in before using it? I see that some sources do it, and some don't. I was always under the impression that, as long as you fill the relevant members (sin_family, sin_addr, sin_port), other initialization is not necessary. Was I

Re: bug in connect.c

2004-02-06 Thread Hrvoje Niksic
Manfred Schwarb [EMAIL PROTECTED] writes: Interesting. Is it really necessary to zero out sockaddr/sockaddr_in before using it? I see that some sources do it, and some don't. I was always under the impression that, as long as you fill the relevant members (sin_family, sin_addr, sin_port),

Re: bug in connect.c

2004-02-04 Thread Hrvoje Niksic
francois eric [EMAIL PROTECTED] writes: after some tests: bug is when: ftp, with username and password, with bind address specified bug is not when: http, ftp without username and password looks like memory leaks. so i made some modification before bind: src/connect.c

bug in connect.c

2004-02-03 Thread francois eric
) ready. ... -- after some tests: bug is when: ftp, with username and password, with bind address specified bug is not when: http, ftp without username and password looks like memory leaks. so i made some modification before bind: src/connect.c: -- ... /* Bind the client side

BUG : problem of date with wget

2004-01-27 Thread Olivier RAMIARAMANANA (Ste Thales IS)
** High Priority ** Hi On my AIX server I use wget with this command /usr/local/bin/wget http://www.???.?? -O /exploit/log/test.log but when I read my file test.log its date is January 30 2003 ??? that's incredible What's the problem please Regards olivier

Re: wget bug with ftp/passive

2004-01-22 Thread Hrvoje Niksic
don [EMAIL PROTECTED] writes: I did not specify the passive option, yet it appears to have been used anyway Here's a short transcript: [EMAIL PROTECTED] sim390]$ wget ftp://musicm.mcgill.ca/sim390/sim390dm.zip --21:05:21-- ftp://musicm.mcgill.ca/sim390/sim390dm.zip =

Re: wget bug

2004-01-12 Thread Hrvoje Niksic
Kairos [EMAIL PROTECTED] writes: $ cat wget.exe.stackdump [...] What were you doing with Wget when it crashed? Which version of Wget are you running? Was it compiled for Cygwin or natively for Windows?

wget bug

2004-01-06 Thread Kairos
$ cat wget.exe.stackdump Exception: STATUS_ACCESS_VIOLATION at eip=77F51BAA eax= ebx= ecx=0700 edx=610CFE18 esi=610CFE08 edi= ebp=0022F7C0 esp=0022F74C program=C:\nonspc\cygwin\bin\wget.exe cs=001B ds=0023 es=0023 fs=0038 gs= ss=0023 Stack trace: Frame Function

bug report

2003-12-30 Thread Vlada Macek
Hi again, I found something that can be called a bug. The command line and the output (shortened): $ wget -k www.seznam.cz --14:14:28-- http://www.seznam.cz/ = `index.html' Resolving www.seznam.cz... done. Connecting to www.seznam.cz[212.80.76.18]:80... connected. HTTP request sent

Maybe a bug?

2003-12-28 Thread James Li-Chung Chen
I'm playing around with the wget tool and I ran into this website that I don't believe the -e robots=off works. http://www.quickmba.com/ any idea why? I've tried a few combinations and I keep on getting this message in the response. We're sorry, but the way that you have attempted to

isn't it a little bug?

2003-12-23 Thread piotrek
Hi, I've just noticed a weird behavior of wget 1.8.2 while downloading a partial file with the command: wget http://ardownload.adobe.com/pub/adobe/acrobatreader/unix/5.x/linux-508.tar.gz -c The connection was very unstable, so it had to reconnect many times. What i noticed is not a big thing, just

bug? different behavior of wget and lwp-request (GET)

2003-12-17 Thread Diego Puppin
/group/sammydavisjr/message/56 retrieves a standard page (HTTP 200). Is this a bug (of GET, wget?) or a feature? I realized this problem when testing two different Java program to download pages from a URL. One uses a Java socket, the other uses Java URLConnection. Well, **even if the request

Bug in 1.9.1? ftp not following symlinks

2003-12-09 Thread Manfred Schwarb
hi i tried to download the following: wget ftp://ftp.suse.com/pub/suse/i386/7.3/full-names/src/traceroute-nanog_6.1.1-94.src.rpm this is a symbolic link. downloading just this single file, wget should follow the link, but it creates only a symbolic link. excerpt from man wget, section

Re: non-subscribers have to confirm each message to bug-wget

2003-11-18 Thread Hrvoje Niksic
Dan Jacobson [EMAIL PROTECTED] writes: And stop making me have to confirm each and every mail to this list. Hrvoje Currently the only way to avoid confirmations is to Hrvoje subscribe to the list. I'll try to contact the list owners Hrvoje to see if the mechanism can be improved.

Re: non-subscribers have to confirm each message to bug-wget

2003-11-17 Thread Dan Jacobson
And stop making me have to confirm each and every mail to this list. Hrvoje Currently the only way to avoid confirmations is to subscribe to the Hrvoje list. I'll try to contact the list owners to see if the mechanism can Hrvoje be improved. subscribe me with the nomail option, if it can't be

Wget Bug

2003-11-10 Thread Kempston
Here is debug output :/FTPD# wget ftp://ftp.dcn-asu.ru/pub/windows/update/winxp/xpsp2-1224.exe -d DEBUG output created by Wget 1.8.1 on linux-gnu. --13:25:55--

Re: Wget Bug

2003-11-10 Thread Hrvoje Niksic
The problem is that the server replies with login incorrect, which normally means that authorization has failed and that further retries would be pointless. Other than having a natural language parser built-in, Wget cannot know that the authorization is in fact correct, but that the server

Re: Wget Bug

2003-11-10 Thread Hrvoje Niksic
Kempston [EMAIL PROTECTED] writes: Yeah, i understand that, but lftp handles it fine even without specifying any additional option ;) But then lftp is hammering servers when a real unauthorized entry occurs, no? I'm sure you can work something out Well, I'm satisfied with what Wget does now.

Bug: Support of characters like '\', '?', '*', ':' in URLs

2003-10-21 Thread Frank Klemm
Wget doesn't work properly when the URL contains characters which are not allowed in file names on the file system which is currently used. These are often '\', '?', '*' and ':'. Affected are at least: - Windows and related OS - Linux when using FAT or Samba as the file system Possibility to solve: On
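One way to solve it, along the lines the report hints at, is to percent-escape the offending characters when forming the local file name. A sketch (the helper `sanitize_filename` and its exact character set are assumptions for illustration, not wget's actual quoting code):

```c
#include <stdio.h>
#include <string.h>

/* Map characters that are illegal in file names on Windows/FAT/SMB
   ('\\', '?', '*', ':', plus '"', '<', '>', '|') to %XX escapes.
   The caller's out buffer must hold the worst case (3x input + 1).
   Illustrative sketch only. */
static void sanitize_filename(const char *in, char *out)
{
  const char *bad = "\\?*:\"<>|";
  for (; *in; in++)
    {
      if (strchr(bad, *in))
        out += sprintf(out, "%%%02X", (unsigned char) *in);  /* e.g. ':' -> %3A */
      else
        *out++ = *in;
    }
  *out = '\0';
}
```

So a URL tail like `shop.cgi?id=1` would be stored as `shop.cgi%3Fid=1`, which is legal on FAT and NTFS alike.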

Re: Bug: Support of characters like '\', '?', '*', ':' in URLs

2003-10-21 Thread Hrvoje Niksic
Frank Klemm [EMAIL PROTECTED] writes: Wget doesn't work properly when the URL contains characters which are not allowed in file names on the file system which is currently used. These are often '\', '?', '*' and ':'. Affected are at least: - Windows and related OS - Linux when using FAT or

RE: Wget 1.8.2 bug

2003-10-20 Thread Sergey Vasilevsky
PROTECTED] Sent: Friday, October 17, 2003 7:18 PM To: Tony Lewis Cc: Wget List Subject: Re: Wget 1.8.2 bug Tony Lewis [EMAIL PROTECTED] writes: Hrvoje Niksic wrote: Incidentally, Wget is not the only browser that has a problem with that. For me, Mozilla is simply showing the source

Re: Wget 1.8.2 bug

2003-10-17 Thread Hrvoje Niksic
Sergey Vasilevsky [EMAIL PROTECTED] writes: I've seen pages that do that kind of redirections, but Wget seems to follow them, for me. Do you have an example I could try? [EMAIL PROTECTED]:~/ /usr/local/bin/wget -U All.by -np -r -N -nH --header=Accept-Charset: cp1251, windows-1251, win,

Re: Wget 1.8.2 bug

2003-10-17 Thread Tony Lewis
Hrvoje Niksic wrote: Incidentally, Wget is not the only browser that has a problem with that. For me, Mozilla is simply showing the source of http://www.minskshop.by/cgi-bin/shop.cgi?id=1cookie=set, because the returned content-type is text/plain. On the other hand, Internet Explorer will

Re: Wget 1.8.2 bug

2003-10-17 Thread Hrvoje Niksic
Tony Lewis [EMAIL PROTECTED] writes: Hrvoje Niksic wrote: Incidentally, Wget is not the only browser that has a problem with that. For me, Mozilla is simply showing the source of http://www.minskshop.by/cgi-bin/shop.cgi?id=1cookie=set, because the returned content-type is text/plain. On

Wget 1.8.2 bug

2003-10-14 Thread Sergey Vasilevsky
I use wget 1.8.2. When I try to recursively download site.com, where the first page site.com/ redirects to site.com/xxx.html, which has as its first link site.com/, then Wget downloads only xxx.html and stops. Other links from xxx.html are not followed!

Re: Wget 1.8.2 bug

2003-10-14 Thread Hrvoje Niksic
Sergey Vasilevsky [EMAIL PROTECTED] writes: I use wget 1.8.2. When I try to recursively download site.com, where the first page site.com/ redirects to site.com/xxx.html, which has as its first link site.com/, then Wget downloads only xxx.html and stops. Other links from xxx.html are not followed!

bug in 1.8.2 with

2003-10-14 Thread Noèl Köthe
Hello, with this download you will get a segfault. wget --passive-ftp --limit-rate 32k -r -nc -l 50 \ -X */binary-alpha,*/binary-powerpc,*/source,*/incoming \ -R alpha.deb,powerpc.deb,diff.gz,.dsc,.orig.tar.gz \ ftp://ftp.gwdg.de/pub/x11/kde/stable/3.1.4/Debian Philip Stadermann [EMAIL

Re: bug in 1.8.2 with

2003-10-14 Thread Hrvoje Niksic
You're right -- that code was broken. Thanks for the patch; I've now applied it to CVS with the following ChangeLog entry: 2003-10-15 Philip Stadermann [EMAIL PROTECTED] * ftp.c (ftp_retrieve_glob): Correctly loop through the list whose elements might have been deleted.

Re: subtle bug? or opportunity of avoiding multiple nested directories

2003-10-10 Thread Hrvoje Niksic
Stephen Hewitt [EMAIL PROTECTED] writes: Attempting to mirror a particular web site, with wget 1.8.1, I got many nested directories like .../images/images/images/images etc For example the log file ended like this: [...] Thanks for the detailed report and for taking the time to find the

RE: Bug in Windows binary?

2003-10-06 Thread Herold Heiko
From: Gisle Vanem [mailto:[EMAIL PROTECTED] Jens Rösner [EMAIL PROTECTED] said: ... I assume Heiko didn't notice it because he doesn't have that function in his kernel32.dll. Heiko and Hrvoje, will you correct this ASAP? --gv Probably. Currently I'm compiling and testing on NT 4.0

Re: Bug in Windows binary?

2003-10-05 Thread Gisle Vanem
and the output was exactly the same. I then tested wget 1.9 beta 2003/09/18 (earlier build!) from the same place and it works smoothly. Can anyone reproduce this bug? Yes, but the MSVC version crashed on my machine. But I've found the cause caused by my recent change :( A simple case of wrong

Re: Bug in Windows binary?

2003-10-05 Thread Hrvoje Niksic
Gisle Vanem [EMAIL PROTECTED] writes: --- mswindows.c.org Mon Sep 29 11:46:06 2003 +++ mswindows.c Sun Oct 05 17:34:48 2003 @@ -306,7 +306,7 @@ DWORD set_sleep_mode (DWORD mode) { HMODULE mod = LoadLibrary (kernel32.dll); - DWORD (*_SetThreadExecutionState) (DWORD) = NULL; +

BUG in --timeout (exit status)

2003-10-02 Thread Manfred Schwarb
Hi, doing the following: # /tmp/wget-1.9-beta3/src/wget -r --timeout=5 --tries=1 http://weather.cod.edu/digatmos/syn/ --11:33:16-- http://weather.cod.edu/digatmos/syn/ = `weather.cod.edu/digatmos/syn/index.html' Resolving weather.cod.edu... 192.203.136.228 Connecting to

Re: BUG in --timeout (exit status)

2003-10-02 Thread Hrvoje Niksic
This problem is not specific to timeouts, but to recursive download (-r). When downloading recursively, Wget expects some of the specified downloads to fail and does not propagate that failure to the code that sets the exit status. This unfortunately includes the first download, which should

Re: BUG in --timeout (exit status)

2003-10-02 Thread Manfred Schwarb
OK, I see. But I do not agree. And I don't think it is a good idea to treat the first download special. In my opinion, exit status 0 means everything during the whole retrieval went OK. My prefered solution would be to set the final exit status to the highest exit status of all individual

Re: dificulty with Debian wget bug 137989 patch

2003-09-30 Thread Hrvoje Niksic
on many platforms that Wget supports. The issue will likely be addressed in 1.10. Having said that: I tried the patch Debian bug report 137989 and didnt work. Can anybody explain: 1 - why I have to make to directories for patch work: one wget-1.8.2.orig and one wget-1.8.2 ? You don't. Just enter

dificulty with Debian wget bug 137989 patch

2003-09-29 Thread jayme
I tried the patch from Debian bug report 137989 and it didn't work. Can anybody explain: 1 - why I have to make two directories for the patch to work: one wget-1.8.2.orig and one wget-1.8.2? 2 - why after compilation wget still can't download a file over 2GB? note: I cut the patch for debian use (the first

wget bug

2003-09-26 Thread Jack Pavlovsky
It's probably a bug: when downloading with wget --mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is, but when downloading wget ftp://somehost.org/somepath/3*, wget saves the files as 3acv14%7Eanivcd.mpg -- The human knowledge belongs to the world

Re: wget bug

2003-09-26 Thread DervishD
Hi Jack :) * Jack Pavlovsky [EMAIL PROTECTED] dixit: It's probably a bug: bug: when downloading wget -mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is, but when downloading wget ftp://somehost.org/somepath/3*, wget saves the files as 3acv14%7Eanivcd.mpg

Re: wget bug

2003-09-26 Thread Hrvoje Niksic
Jack Pavlovsky [EMAIL PROTECTED] writes: It's probably a bug: bug: when downloading wget -mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is, but when downloading wget ftp://somehost.org/somepath/3*, wget saves the files as 3acv14%7Eanivcd.mpg Thanks for the report

bug maybe?

2003-09-23 Thread Randy Paries
Not sure if this is a bug or not. i cannot get a file over 2GB (i get a MAX file Exceeded error message) this is on a redhat 9 box. GNU Wget 1.8.2, Thanks Randy

Re: bug maybe?

2003-09-23 Thread Hrvoje Niksic
Randy Paries [EMAIL PROTECTED] writes: Not sure if this is a bug or not. I guess it could be called a bug, although it's no simple oversight. Wget currently doesn't support large files.

RE: bug maybe?

2003-09-23 Thread Matt Pease
how do I get off this list? I tried a few times before but got no response from the server. thank you- Matt -Original Message- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] Sent: Tuesday, September 23, 2003 8:53 PM To: Randy Paries Cc: [EMAIL PROTECTED] Subject: Re: bug maybe
