Re: Wget 1.10.2 bug

2006-03-08 Thread Steven M. Schweda
> It seems to me that the -O option has wget touching the file which wget then detects. Close enough. With "-O", Wget opens the output file before it does any transfers, so when the program gets serious about the transfer, the file will exist, and that will confuse the "-nc" processing.

Wget 1.10.2 bug

2006-03-08 Thread cerise
Running this command: rm *.jpg ; wget -O usscole_90.jpg -nc --random-wait --referer=http://www.pianoladynancy.com/recovery_usscole.htm -- http://www.pianoladynancy.com/images/usscole_90.jpg generates the error: File `usscole_90.jpg' already there; not retrieving. However: rm *.jp

Possible bug

2006-03-08 Thread Lawrence E Schwartz
Hello, Sometimes passwords contain @'s. When they do, it seems to cause wget problems if the URL has the password encoded in it (for example, ftp://username:[EMAIL PROTECTED]@/directory). The same sort of URL encoding works fine in wput. Thank you for the fine software, Lar

Re: Bug in TOLOWER macro when STANDALONE (?)

2006-03-06 Thread Hrvoje Niksic
"Beni Serfaty" <[EMAIL PROTECTED]> writes: > I Think I found a bug when STANDALONE is defined on hash.c > I hope I'm not missing something here... Good catch, thanks. I've applied a slightly different fix, appended below. By the way, are you using hash.c in

Bug in TOLOWER macro when STANDALONE (?)

2006-03-06 Thread Beni Serfaty
I think I found a bug when STANDALONE is defined on hash.c. I hope I'm not missing something here... (Please cc me the replies) @@ -63,7 +63,7 @@ if not enough memory */ # define xfree free # define countof(x) (sizeof (x) / sizeof ((x)[0])) -# define TOLOWER(x

wget bug: doesn't CWD after ftp failure

2006-03-05 Thread Nate Eldredge
Hi folks, I think I have found a bug in wget where it fails to change the working directory when retrying a failed ftp transaction. This is wget 1.10.2 on FreeBSD-6.0/amd64. I was trying to use wget to get files from a broken ftp server which occasionally sends garbled responses, causing

Re: Bug? -k not compatible with -O

2006-03-02 Thread Greg McCann
Steven M. Schweda antinode.org> writes: > > [...] wget version 1.9.1 > >You might try it with the current version (1.10.2). > > http://www.gnu.org/software/wget/wget.html > Oh, man - I can't believe I missed that. All better now! Thank you. Greg

Re: Bug? -k not compatible with -O

2006-03-02 Thread Steven M. Schweda
> [...] wget version 1.9.1 You might try it with the current version (1.10.2). http://www.gnu.org/software/wget/wget.html Steven M. Schweda (+1) 651-699-9818 382 South Warwick Street[EM

Bug? -k not compatible with -O

2006-03-02 Thread Greg McCann
option. However, when it goes to convert the links, as specified by the -k option, it looks for the default output filename "processcandquicksearch" rather than the filename that I specified with the -O option. This seems to be a bug, though I can work around it with... wget -k mv (Note: I am using wget version 1.9.1) Best regards, Greg McCann

BUG: timestamping and output file do not work together

2006-02-16 Thread Martin Kos
hi, I've just posted my comments on the mailing list [1]. wget doesn't behave the right way if I use the --output-document and --timestamping options together. wget tries to compare the url-file with the original file instead of with the --output-document file. why i got to this problem was be

Buffer overflow bug in base64_encode

2006-01-06 Thread rick
Hello all, I discovered a buffer overflow bug in the base64_encode() function, located at line 1905 in file src\utils.c. Note that this bug is in the latest version of the program (version 1.10.2). The bug appears to be that the function is assuming that the input data is a size that is an even

Re: Bug: -x with -O

2005-12-15 Thread Frank McCown
wget -x -O images/logo.gif http://www.google.co.uk/intl/en_uk/images/logo.gif It worked for me. Try it after "rm -rf images". That was why it worked... I had an images directory already created. Should have deleted it before I tried. Frank

Re: Bug: -x with -O

2005-12-14 Thread Steven M. Schweda
From Frank McCown: > wget -x -O images/logo.gif > http://www.google.co.uk/intl/en_uk/images/logo.gif > > It worked for me. Try it after "rm -rf images". alp $ wget -x http://alp/test.html -O testxxx/test.html testxxx/test.html: no such file or directory alp $ wget -x -O testxxx/test.html h

Re: Bug: -x with -O

2005-12-14 Thread Steven M. Schweda
I wouldn't call it a bug. While it may not be well documented (which would not be unusual), "-x" affects URL-derived directories, not user-specified directories. Presumably Wget could be modified to handle this, but my initial reaction is that it's not unreasonabl

Re: Bug: -x with -O

2005-12-14 Thread Frank McCown
Chris, I think the problem is you don't have the URL last. Try this: wget -x -O images/logo.gif http://www.google.co.uk/intl/en_uk/images/logo.gif It worked for me. Frank Chris Hills wrote: Hi Using wget-1.10.2. Example command:- $ wget -x http://www.google.co.uk/intl/en_uk/images/log

Bug: -x with -O

2005-12-14 Thread Chris Hills
Hi Using wget-1.10.2. Example command:- $ wget -x http://www.google.co.uk/intl/en_uk/images/logo.gif -O images/logo.gif images/logo.gif: No such file or directory wget should create the directory images/. wget --help shows:- -x, --force-directories force creation of directories.

Re: wget BUG: ftp file retrieval

2005-11-26 Thread Steven M. Schweda
From: Hrvoje Niksic > [...] On Unix-like FTP servers, the two methods would > be equivalent. Right. So I resisted temptation, and kept the two-step CWD method in my code for only a VMS FTP server. My hope was that some one would look at the method, say "That's a good idea", and change the "

Re: wget BUG: ftp file retrieval

2005-11-26 Thread Hrvoje Niksic
[EMAIL PROTECTED] (Steven M. Schweda) writes: >> and adding it fixed many problems with FTP servers that log you in >> a non-/ working directory. > > Which of those problems would _not_ be fixed by my two-step CWD for > a relative path? That is: [...] That should work too. On Unix-like FTP serv

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Steven M. Schweda
From: Hrvoje Niksic > Prepending is already there, Yes, it certainly is, which is why I had to disable it in my code for VMS FTP servers. > and adding it fixed many problems with > FTP servers that log you in a non-/ working directory. Which of those problems would _not_ be fixed by my t

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Daniel Stenberg <[EMAIL PROTECTED]> writes: > On Fri, 25 Nov 2005, Steven M. Schweda wrote: > >> Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to >> those paths. > > I agree. What good would prepending do? Prepending is already there, and adding it fixed many problems with FTP

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Daniel Stenberg
On Fri, 25 Nov 2005, Steven M. Schweda wrote: Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to those paths. I agree. What good would prepending do? It will most definitely add problems such as those Steven describes. -- -=- Daniel Stenberg -=- http://daniel.haxx

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Steven M. Schweda
From: Hrvoje Niksic > Also don't [forget to] prepend the necessary [...] $CWD > to those paths. Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to those paths. As you might recall from my changes for VMS FTP servers (if you had ever looked at them), this scheme causes no en

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Hrvoje Niksic <[EMAIL PROTECTED]> writes: > That might work. Also don't prepend the necessary prepending of $CWD > to those paths. Oops, I meant "don't forget to prepend ...".

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Mauro Tortonesi <[EMAIL PROTECTED]> writes: > Hrvoje Niksic wrote: >> Arne Caspari <[EMAIL PROTECTED]> writes: >> >> I believe that CWD is mandated by the FTP specification, but you're >> also right that Wget should try both variants. > > i agree. perhaps when retrieving file A/B/F.X we should try

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Arne Caspari
Thank you all for your very fast response. As a further note: When this error occurs, wget bails out with the following error message: "No such directory foo/bar". I think it should instead be "Could not access foo/bar: Permission denied" or similar in such a situation. /Arne Mauro Tortones

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Mauro Tortonesi
Hrvoje Niksic wrote: Arne Caspari <[EMAIL PROTECTED]> writes: I believe that CWD is mandated by the FTP specification, but you're also right that Wget should try both variants. i agree. perhaps when retrieving file A/B/F.X we should try to use: GET A/B/F.X first, then: CWD A/B GET F.X if t

Re: wget BUG: ftp file retrieval

2005-11-25 Thread Hrvoje Niksic
Arne Caspari <[EMAIL PROTECTED]> writes: > When called like: > wget user:[EMAIL PROTECTED]/foo/bar/file.tgz > > and foo or bar is a read/execute protected directory while file.tgz is > user-readable, wget fails to retrieve the file because it tries to CWD > into the directory first. > > I think th

wget BUG: ftp file retrieval

2005-11-25 Thread Arne Caspari
Hello, current wget seems to have the following bug in the ftp retrieval code: When called like: wget user:[EMAIL PROTECTED]/foo/bar/file.tgz and foo or bar is a read/execute protected directory while file.tgz is user-readable, wget fails to retrieve the file because it tries to CWD into the

Bug? page requisites are not downloaded when output to stdout

2005-11-20 Thread Mark Pors
://www.kpn.com/ Looks like a bug? Cheers, Mark -- Pors BV Internet Projects

Re: FW: WGET SSL/TLS bug not fixed?

2005-11-15 Thread Hrvoje Niksic
"Schatzman, James (Mission Systems)" <[EMAIL PROTECTED]> writes: > I have double checked the wget documentation. There is no mention of > the "https_proxy" parameter. The manual and sample wgetrc that are > provided list http_proxy and ftp_proxy - that is all.

RE: FW: WGET SSL/TLS bug not fixed?

2005-11-15 Thread Schatzman, James \(Mission Systems\)
This is indeed the solution. I have double checked the wget documentation. There is no mention of the "https_proxy" parameter. The manual and sample wgetrc that are provided list http_proxy and ftp_proxy - that is all. Apparently, the bug is with the documentation, not the applicat
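For anyone hitting the same documentation gap: in addition to `http_proxy` and `ftp_proxy`, wget honours an `https_proxy` setting, both in `.wgetrc` and as an environment variable. A sketch (the proxy host and port below are placeholders):

```
# in ~/.wgetrc
https_proxy = http://proxy.example.com:8080/

# or as an environment variable before invoking wget
https_proxy=http://proxy.example.com:8080/
```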

Re: FW: WGET SSL/TLS bug not fixed?

2005-11-15 Thread Hrvoje Niksic
> It appears that the fix reported in the 1.10 release did not take. Any suggestions? The bug referred to in the release notes manifested itself differently: Wget would connect to the proxy server, and request the https URL using GET. The proxies (correctly) refused to obey this order, as it would pretty much defeat the purpose of using SSL.

WGET SSL/TLS bug not fixed?

2005-11-15 Thread Schatzman, James \(Mission Systems\)
According to the wget release notes for 1.10 "*** Talking to SSL/TLS servers over proxies now actually works. Previous versions of Wget erroneously sent GET requests for https URLs. Wget 1.10 utilizes the CONNECT method designed for this purpose." However, I have tried versions 1.10, 1.10.1, and

FW: WGET SSL/TLS bug not fixed?

2005-11-15 Thread Schatzman, James \(Mission Systems\)
According to the wget release notes for 1.10 "*** Talking to SSL/TLS servers over proxies now actually works. Previous versions of Wget erroneously sent GET requests for https URLs. Wget 1.10 utilizes the CONNECT method designed for this purpose." However, I have tried versions 1.10, 1.10.1, and 1

Re: bug retrieving embedded images with --page-requisites

2005-11-09 Thread Hrvoje Niksic
"Jean-Marc MOLINA" <[EMAIL PROTECTED]> writes: > Hrvoje Niksic wrote: >> More precisely, it doesn't use the file name advertised by the >> Content-Disposition header. That is because Wget decides on the file >> name it will use based on the URL used, *before* the headers are >> downloaded. This

Re: bug retrieving embedded images with --page-requisites

2005-11-09 Thread Jean-Marc MOLINA
Tony Lewis wrote: > The --convert-links option changes the website path to a local file > system path. That is, it changes the directory, not the file name. Thanks I didn't understand it that way. > IMO, your suggestion has merit, but it would require wget to maintain > a list of MIME types and c

Re: bug retrieving embedded images with --page-requisites

2005-11-09 Thread Jean-Marc MOLINA
Hrvoje Niksic wrote: > More precisely, it doesn't use the file name advertised by the > Content-Disposition header. That is because Wget decides on the file > name it will use based on the URL used, *before* the headers are > downloaded. This unfortunate design decision is the cause of all > thes

RE: bug retrieving embedded images with --page-requisites

2005-11-09 Thread Tony Lewis
Jean-Marc MOLINA wrote: > For example if a PNG image is generated using a "gen_png_image.php" PHP > script, I think wget should be able to download it if the option > "--page-requisites" is used, because it's part of the page and it's not > an external resource, get its MIME type, "image/png", and

Re: bug retrieving embedded images with --page-requisites

2005-11-09 Thread Hrvoje Niksic
"Jean-Marc MOLINA" <[EMAIL PROTECTED]> writes: > As I don't know anything about wget sources, I can't tell how it > innerworks but I guess it doesn't check the MIME types of resources > linked from the "src" attribute of a "img" element

Re: bug retrieving embedded images with --page-requisites

2005-11-09 Thread Jean-Marc MOLINA
ext/MIME mappings. So I removed the ".php to text/html" and got a nice PNG image instead. I don't really know how to force it not to rename the script but it doesn't really matter. As I don't know anything about wget sources, I can't tell how it innerworks but I gues

Bug? Failure of PC version

2005-10-23 Thread joseph blough
I am running a PC version of wget.
===
C:\> wget --version
GNU Wget 1.9
Copyright (C) 2003 Free Software Foundation, Inc.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANT

A bug or suggestion

2005-10-14 Thread Conrado Miranda
I saw that the option "-k, --convert-links" makes the links point to the root directory, not to the directory where you downloaded the pages. For example: if I download a page whose url is www.pageexample.com, the pages I download go in there. But if I use that option, in the pages the links will link to the r

Re: bug in wget windows

2005-10-14 Thread Mauro Tortonesi
Tobias Koeck wrote:
done. ==> PORT ... done. ==> RETR SUSE-10.0-EvalDVD-i386-GM.iso ... done.
[ <=> ] -673,009,664 113,23K/s
Assertion failed: bytes >= 0, file retr.c, line 292
This application has requested the Runtime to terminate it in an unusual way.

bug in wget windows

2005-10-14 Thread Tobias Koeck
done. ==> PORT ... done. ==> RETR SUSE-10.0-EvalDVD-i386-GM.iso ... done.
[ <=> ] -673,009,664 113,23K/s
Assertion failed: bytes >= 0, file retr.c, line 292
This application has requested the Runtime to terminate it in an unusual way. Please contact the

bug retrieving embedded images with --page-requisites

2005-10-05 Thread Gavin Sherlock
Hi, The following seems to not be expected behavior: wget --page-requisites --no-clobber --no-directories --no-host-directories --convert-links http://www.candidagenome.org/cgi-bin/locus.pl?locus=HWP1 Two of the images on that page do not get downloaded, and then the links within the pag

a bug about wget

2005-10-04 Thread baidu baidu
That is, there is HTML like this: Click the following to go to the http://www.something.com/junk.asp?thepageIwant=2 next page. What I need is for wget to understand that stuff following an "?" in a URL indicates that it's a distinctly different page, and it should go recursively retrieve

wget bug

2005-10-03 Thread Michael C. Haller
Begin forwarded message: From: [EMAIL PROTECTED] Date: October 4, 2005 4:36:09 AM GMT+02:00 To: [EMAIL PROTECTED] Subject: failure notice Hi. This is the qmail-send program at sunsite.dk. I'm afraid I wasn't able to deliver your message to the following addresses. This is a permanent erro

RE: possible bug/addition to wget

2005-10-03 Thread Tony Lewis
's probably not what you're looking for. If you want to have a go at this, look for opt.timestamping in ftp.c. Hope that helps! -Original Message- From: bob stephens [contr] [mailto:[EMAIL PROTECTED] Sent: Friday, September 30, 2005 10:06 AM To: [EMAIL PROTECTED] Subject: possible

possible bug/addition to wget

2005-10-03 Thread bob stephens [contr]
Hi WGet folks, This isn't really a bug I found in the operation of wget, but I think it is a functionality problem. I wonder if you can help me. I would like to use wget to mirror an ftp site - this step seems easy. BUT, I would like to set it up so that the files on my end are un-gzipped

Re: Bug rpt

2005-09-20 Thread Alain Bench
Hello Hrvoje! On Tuesday, September 20, 2005 at 12:50:41 AM +0200, Hrvoje Niksic wrote: > "HonzaCh" <[EMAIL PROTECTED]> writes: >> the thousand separator (space according to my local settings) >> displays as "á" (character code 0xA0, see attch.) > Wget obtains the thousand separator from the ope

Re: Bug rpt

2005-09-20 Thread Hrvoje Niksic
"HonzaCh" <[EMAIL PROTECTED]> writes: >>> My localeconv()->thousands_sep (as well as many other struct >>> members) reveals to empty string ("") (MSVC6.0). >> >> How do you know? I mean, what program did you use to check this? > > My quick'n'dirty one. See the source below. Your source neglects

Re: Bug rpt

2005-09-19 Thread Hrvoje Niksic
"HonzaCh" <[EMAIL PROTECTED]> writes: > Latest version (1.10.1) turns up a UI bug: the thousand separator > (space according to my local settings) displays as "á" (character > code 0xA0, see attch.) > > Although it does not affect the primary functio

Bug rpt

2005-09-19 Thread HonzaCh
Latest version (1.10.1) turns up a UI bug: the thousand separator (space according to my local settings) displays as "á" (character code 0xA0, see attch.) Although it does not affect the primary function of WGET, it looks quite ugly. Env.: Win2k Pro/Czech (CP852 for console apps,

Bug with -N used with -O in wget 1.10.1

2005-09-05 Thread John Frear
Hello! I'm writing because I've found a bug in the current version of wget (1.10.1). I've tried to fix it but it has proven to be too much for me! The bug is this: If you use -N and -O together, wget does not behave properly. Wget will always decide to download the r

Re: openssl server renogiation bug in wget

2005-08-26 Thread Hrvoje Niksic
Daniel Stenberg <[EMAIL PROTECTED]> writes: > On Fri, 26 Aug 2005, Hrvoje Niksic wrote: > >> + /* The OpenSSL library can handle renegotiations automatically, so >> + tell it to do so. */ >> + SSL_CTX_set_mode (ssl_ctx, SSL_MODE_AUTO_RETRY); >> + > > Just wanted to make sure that you are aw

Re: openssl server renogiation bug in wget

2005-08-26 Thread Daniel Stenberg
On Fri, 26 Aug 2005, Hrvoje Niksic wrote: + /* The OpenSSL library can handle renegotiations automatically, so + tell it to do so. */ + SSL_CTX_set_mode (ssl_ctx, SSL_MODE_AUTO_RETRY); + Just wanted to make sure that you are aware that this option is only available in OpenSSL 0.9.6 or

Re: openssl server renogiation bug in wget

2005-08-26 Thread Hrvoje Niksic
Thanks for the report; I've applied this patch: 2005-08-26 Jeremy Shapiro <[EMAIL PROTECTED]> * openssl.c (ssl_init): Set SSL_MODE_AUTO_RETRY. Index: openssl.c === --- openssl.c (revision 2063) +++ openssl.c (working c

openssl server renogiation bug in wget

2005-08-18 Thread Jeremy Shapiro
I believe I've encountered a bug in wget. When using https, if the server does a renegotiation handshake wget fails trying to peek for the application data. This occurs because wget does not set the openssl context mode SSL_MODE_AUTO_RETRY. When I added the line: SSL_CTX_set_mode (ss

bug tracker DB reloaded

2005-08-08 Thread Mauro Tortonesi
A few comments about the bug tracker saga... roundup is a really cool piece of software, but it seems that its developers don't really give a damn about backward compatibility and painless upgrades: http://roundup.sourceforge.net/doc-0.8/upgrading.html (well, I can't really blame

Malformed Command Line or Bug?

2005-08-03 Thread Jens
Hi wget list! Is it intended that wget -P"d:\goog" "http://www.google.com/" works, whereas wget -P"d:\goog\" "http://www.google.com/" gives the error message "wget: missing URL"? Running wget 1.10 on Windows XP. Cheers Jens

wget bug when using proxy, https, & digest authentication

2005-07-21 Thread Corey Wright
all patches are against wget 1.10. please cc me on all responses as i am not subscribed to this list. FIRST BUG there is a bug in http.c. when connecting by way of proxy & https, if digest authentication is necessary, then the first connection attempt fails and we go to retry_with_auth.

[Fwd: Bug#319088: wget: don't rely on exactly one blank char between size and month]

2005-07-20 Thread Noèl Köthe
Hello, giuseppe wrote a patch for 1.10.1.beta1. Full report can be viewed here: http://bugs.debian.org/319088 Forwarded message > From: giuseppe bonacci <[EMAIL PROTECTED]> > Reply to: giuseppe bonacci <[EMAIL PROTECTED]>, > [EMAIL PROTECTED] >

bug: downloading index.html twice

2005-07-07 Thread Frank McCown
Using Wget 1.10: wget -np -p -w 2 -r -l 0 http://www.cs.odu.edu/~mln/lazy/ results in http://www.cs.odu.edu/~mln/lazy/index.html being downloaded and saved twice. The final number of downloaded files is 1 file too many. From the wget output: -

Re: Bug? gettin file > 2 GB fails

2005-07-07 Thread Hrvoje Niksic
Jogchum Reitsma <[EMAIL PROTECTED]> writes: > I'm not sure it's a bug, but the behaviour described below seems strange > to me, so I thought it was wise to report it: Upgrade to Wget 1.10 and the problem should go away. Earlier versions don't handle files larger than 2GB properly.

Bug? gettin file > 2 GB fails

2005-07-07 Thread Jogchum Reitsma
Hello, I'm not sure it's a bug, but the behaviour described below seems strange to me, so I thought it was wise to report it: I'm trying to get a Suse 9.3 ISO from sunsite.informatik.rwth-aachen.de, a file that is 4383158 KB according to the FTP listing. wget gets about 2.4 GB, th

Re: Bug

2005-07-07 Thread Hrvoje Niksic
> => `SUSE-9.3-Eval-DVD.iso' > Resolving chuck.ucs.indiana.edu... 156.56.247.193 > Connecting to chuck.ucs.indiana.edu[156.56.247.193]:21... connected. Please upgrade to Wget 1.10, which has this bug fixed.

Bug

2005-07-07 Thread Rodrigo Botafogo
[EMAIL PROTECTED]:~/Download/Linux> wget -c ftp://chuck.ucs.indiana.edu/linux/suse/suse/i386/9.3/iso/SUSE-9.3-Eval-DVD.iso --09:55:03--  ftp://chuck.ucs.indiana.edu/linux/suse/suse/i386/9.3/iso/SUSE-9.3-Eval-DVD.iso    => `SUSE-9.3-Eval-DVD.iso' Resolving chuck.ucs.indiana.edu... 156.56

Re: YNT: YNT: Mingw bug ?

2005-07-02 Thread Hrvoje Niksic
Abdurrahman ÇARKACIOĞLU <[EMAIL PROTECTED]> writes: >>It's already in the repository. > > I think you forget to put -DHAVE_SELECT statement > into makefile.src.mingw at > http://svn.dotsrc.org/repo/wget/branches/1.10/windows/. > > Am I right ? That was published in a separate patch -- specificall

YNT: YNT: Mingw bug ?

2005-07-02 Thread Abdurrahman ÇARKACIOĞLU
-Original Message- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]] Sent: Sat 02.07.2005 16:00 To: Abdurrahman ÇARKACIOĞLU Cc: wget@sunsite.dk Subject: Re: YNT: Mingw bug ? >> Will you consider the patch for future release of Wget. >It&

Re: YNT: Mingw bug ?

2005-07-02 Thread Hrvoje Niksic
Abdurrahman ÇARKACIOĞLU <[EMAIL PROTECTED]> writes: > Now, it works. Thanks a lot. > > But I want to understand what is going on? Was it a bug? It was a combination of two Wget bugs, one in the actual code and the other in the MinGW configuration. Wget 1.9.1 and earlier used to close con

YNT: Mingw bug ?

2005-07-02 Thread Abdurrahman ÇARKACIOĞLU
Now, it works. Thanks a lot. But I want to understand what is going on? Was it a bug? Will you consider the patch for a future release of Wget? -Original Message- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]] Sent: Sat 02.07.2005 14:06 To

Re: Mingw bug ?

2005-07-02 Thread Hrvoje Niksic
I believe this patch should fix the problem. Could you apply it and let me know if it fixes things for you? 2005-07-02 Hrvoje Niksic <[EMAIL PROTECTED]> * http.c (gethttp): Except for head_only, use skip_short_body to skip the non-20x error message before leaving gethttp. Ind

Re: Mingw bug ?

2005-07-02 Thread Hrvoje Niksic
Abdurrahman ÇARKACIOĞLU <[EMAIL PROTECTED]> writes: > Here are the results.. > ---request begin--- > GET /images/spk.ico HTTP/1.0 > Referer: http://www.spk.gov.tr/ > User-Agent: Wget/1.10 > Accept: */* > Host: www.spk.gov.tr > Connection: Keep-Alive > > ---request end--- > HTTP request sent, await

RE: Mingw bug ?

2005-07-02 Thread Abdurrahman ÇARKACIOĞLU
ndmailto:[EMAIL PROTECTED] Sent: Saturday, July 02, 2005 1:04 AM To: Abdurrahman ÇARKACIOĞLU Cc: wget@sunsite.dk Subject: Re: Mingw bug ? "A. Carkaci" <[EMAIL PROTECTED]> writes: > ---request begin--- > GET /images/spk.ico HTTP/1.0 > Referer: http://www.spk.gov.tr/

Re: Mingw bug ?

2005-07-02 Thread A . Carkaci
Hrvoje Niksic xemacs.org> writes: > > "A. Carkaci" spk.gov.tr> writes: > > > ---request begin--- > > GET /images/spk.ico HTTP/1.0 > > Referer: http://www.spk.gov.tr/ > > User-Agent: Wget/1.10 > > Accept: */* > > Host: www.spk.gov.tr > > Connection: Keep-Alive > > ---request end--- > > HTTP req

Re: Mingw bug ?

2005-07-01 Thread Hrvoje Niksic
"A. Carkaci" <[EMAIL PROTECTED]> writes: > ---request begin--- > GET /images/spk.ico HTTP/1.0 > Referer: http://www.spk.gov.tr/ > User-Agent: Wget/1.10 > Accept: */* > Host: www.spk.gov.tr > Connection: Keep-Alive > ---request end--- > HTTP request sent, awaiting response... > ---response begin--

Re: Mingw bug ?

2005-07-01 Thread A . Carkaci
Abdurrahman ÇARKACIOĞLU spk.gov.tr> writes: > > I succesfully compiled Wget 1.10 using mingw. Although Heiko Herold's wget 1.10 (original wget.exe I mean) > (from http://space.tin.it/computer/hherold/) succesfully download the following site, > my compiled wget (produced by mingw32-make) hangs

Re: Mingw bug ?

2005-07-01 Thread Hrvoje Niksic
Abdurrahman ÇARKACIOĞLU <[EMAIL PROTECTED]> writes: > I succesfully compiled Wget 1.10 using mingw. Although Heiko > Herold's wget 1.10 (original wget.exe I mean) (from > http://space.tin.it/computer/hherold/) succesfully download the > following site, my compiled wget (produced by mingw32-make) h

Mingw bug ?

2005-07-01 Thread Abdurrahman ÇARKACIOĞLU
I successfully compiled Wget 1.10 using mingw. Although Heiko Herold's wget 1.10 (the original wget.exe, I mean) (from http://space.tin.it/computer/hherold/) successfully downloads the following site, my compiled wget (produced by mingw32-make) hangs immediately, forever. Any idea? wget www.spk.gov.tr

Re: Bug report: option -nr

2005-06-30 Thread Hrvoje Niksic
Marc Niederwieser <[EMAIL PROTECTED]> writes: > option --mirror is described as > shortcut option equivalent to -r -N -l inf -nr. > but option "-nr" is not implemented. > I think you mean "--no-remove-listing". Thanks for the report, I've now fixed the --help text. 2005-07-01 Hrvoje Niksic <

Bug report: option -nr

2005-06-30 Thread Marc Niederwieser
Hi, option --mirror is described as a shortcut option equivalent to -r -N -l inf -nr, but option "-nr" is not implemented. I think you mean "--no-remove-listing". greetings Marc

Is there a bug in Wget 1.10 ?

2005-06-29 Thread Abdurrahman ÇARKACIOĞLU
Although wget 1.9.1 downloaded the following address, wget 1.10 fails (it hangs immediately, forever). Is there a bug? Address: www.spk.gov.tr

RE: ftp bug in 1.10

2005-06-27 Thread Herold Heiko
> From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] > the 64-bit "download sum", doesn't work for you. What does this > program print? > > #include <stdio.h> > int > main (void) > { > __int64 n = 10000000000I64; // ten billion, doesn't fit in 32 bits > printf("%I64\n", n); > return 0; > } > > It shou

Re: ftp bug in 1.10

2005-06-25 Thread Hrvoje Niksic
David Fritz <[EMAIL PROTECTED]> writes: > "I64" is a size prefix akin to "ll". One still needs to specify the > argument type as in "%I64d" as with "%lld". That makes sense, thanks for the explanation!

Re: ftp bug in 1.10

2005-06-25 Thread David Fritz
"I64" is a size prefix akin to "ll". One still needs to specify the argument type as in "%I64d" as with "%lld".

Re: ftp bug in 1.10

2005-06-25 Thread Hrvoje Niksic
Gisle Vanem <[EMAIL PROTECTED]> writes: > "Hrvoje Niksic" <[EMAIL PROTECTED]> wrote: > >> It should print a line containing "10000000000". If it does, it means >> we're applying the wrong format. If it doesn't, then we must find >> another way of printing LARGE_INT quantities on Windows. > > I d

Re: ftp bug in 1.10

2005-06-25 Thread Gisle Vanem
"Hrvoje Niksic" <[EMAIL PROTECTED]> wrote: It should print a line containing "10000000000". If it does, it means we're applying the wrong format. If it doesn't, then we must find another way of printing LARGE_INT quantities on Windows. I don't know what compiler OP used, but Wget only uses "

Re: ftp bug in 1.10

2005-06-25 Thread Hrvoje Niksic
Hrvoje Niksic <[EMAIL PROTECTED]> writes: > This would indicate that the "%I64" format, which Wget uses to print > the 64-bit "download sum", doesn't work for you. For what it's worth, MSDN documents it: http://tinyurl.com/ysrh/. Could you be compiling Wget with an older C runtime that doesn't su

Re: ftp bug in 1.10

2005-06-25 Thread Hrvoje Niksic
Herold Heiko <[EMAIL PROTECTED]> writes: > Downloaded: bytes in 2 files > > Note missing number of bytes. This would indicate that the "%I64" format, which Wget uses to print the 64-bit "download sum", doesn't work for you. What does this program print? #include <stdio.h> int main (void) { __int64 n

Re: wget bug report

2005-06-24 Thread Hrvoje Niksic
<[EMAIL PROTECTED]> writes: > Sorry for the crosspost, but the wget Web site is a little confusing > on the point of where to send bug reports/patches. Sorry about that. In this case, either address is fine, and we don't mind the crosspost. > After taking a look at i

Re: Bug handling session cookies

2005-06-24 Thread Hrvoje Niksic
"Mark Street" <[EMAIL PROTECTED]> writes: > Many thanks for the explanation and the patch. Yes, this patch > successfully resolves the problem for my particular test case. Thanks for testing it. It has been applied to the code and will be in Wget 1.10.1 and later.

Re: Bug handling session cookies

2005-06-24 Thread Mark Street
Hrvoje, Many thanks for the explanation and the patch. Yes, this patch successfully resolves the problem for my particular test case. Best regards, Mark Street.

Re: Bug handling session cookies

2005-06-24 Thread Hrvoje Niksic
es the problem by: * Making sure that path consistently gets prepended in all entry points to cookie code; * Removing the special logic from path_match. With that change your test case seems to work, and so do all the other tests I could think of. Please let me know if it works for you, and than

Bug handling session cookies

2005-06-24 Thread Mark Street
Hello folks, I'm running wget v1.10 compiled from source (tested on HP-UX and Linux). I am having problems handling session cookies. The idea is to request a web page which returns an ID number in a session cookie. All subsequent requests from the site must contain this session cookie. I'm usi

Re: Bug: wget cannot handle quote

2005-06-21 Thread Hrvoje Niksic
Will Kuhn <[EMAIL PROTECTED]> writes: > Apparentl wget does not handle single quote or double quote very well. > wget with the following arguments give error. > > wget > --user-agent='Mozilla/5.0' --cookies=off --header > 'Cookie: testbounce="testing"; > ih="b'!!!0T#8G(5A!!#c`#8HWs

BUG? using -O effectively disables -N

2005-06-21 Thread Dennis Kaarsemaker
-to-date wget will not re-download the page. Because this behaviour is unexpected and undocumented, I consider it a bug. -- Sincerely, Dennis Kaarsemaker

Re: Small bug in Wget manual page

2005-06-18 Thread Mauro Tortonesi
On Wednesday 15 June 2005 05:14 pm, Ulf Harnhammar wrote: > On Wed, Jun 15, 2005 at 11:57:42PM +0200, Ulf Harnhammar wrote: > > * faq.html > > ** "3.1 [..] > > Yes, starting from version 1.10, GNU Wget support files larger than 2GB." > > (should be "supports") > > ** "2.0 How I compile GNU Wget?" >

Re: Small bug in Wget manual page

2005-06-18 Thread Mauro Tortonesi
On Wednesday 15 June 2005 04:57 pm, Ulf Harnhammar wrote: > On Wed, Jun 15, 2005 at 03:53:40PM -0500, Mauro Tortonesi wrote: > > the web pages (including the documentation) on gnu.org have just been > > updated. > > Nice! I have found some broken links and strange grammar, though: > > * index.html:

Re: Small bug in Wget manual page

2005-06-15 Thread Ulf Harnhammar
On Wed, Jun 15, 2005 at 11:57:42PM +0200, Ulf Harnhammar wrote: > * faq.html > ** "3.1 [..] > Yes, starting from version 1.10, GNU Wget support files larger than 2GB." > (should be "supports") ** "2.0 How I compile GNU Wget?" (should be "How do I") // Ulf

Re: Small bug in Wget manual page

2005-06-15 Thread Ulf Harnhammar
On Wed, Jun 15, 2005 at 03:53:40PM -0500, Mauro Tortonesi wrote: > the web pages (including the documentation) on gnu.org have just been updated. Nice! I have found some broken links and strange grammar, though: * index.html: There are archives of the main GNU Wget list at ** fly.cc.fer.hr ** www
