Hi!
When using the -P or --directory-prefix command-line switches in v1.11 Beta 1
and the later v1.11 Beta 1 (with spider patch), wget pays attention to
neither of them; it saves files in the current directory. Wget v1.10.2
worked correctly.
Hope this bug won't live long :).
Such incorrect behaviour appears only if server http
character
using --restrict-file-names=windows, but unfortunately this
does not fix the problem because the browser will un-escape
the URL and will still continue to look for a file with a colon
in it.
I am not sure of the best way to address this bug, because I
am not sure if it is possible to escape
, then
it will create a link like this:
<a href="Category:Fish" title="Category:Fish">Fish</a>
Unfortunately, this is not a valid URL, because the browser
interprets 'Category:' as the protocol 'Category', not as part of
the local filename 'Category:'
Hello,
I was wondering what the status of this report is: has it even been received?
I've gotten no acknowledgement in two weeks or so.
Thanks,
Eugene
Thus spake Eugene Y. Vasserman on Mon, 20 Nov 2006:
From: Eugene Y. Vasserman [EMAIL PROTECTED]
Subject: Wget auth-md5 bug
Date: Mon, 20 Nov
Hello. Wget 1.10.2 has the following bug compared to version 1.9.1.
First, bin/wgetdir is defined as

wget -p -E -k --proxy=off -e robots=off --passive-ftp \
    -o zlogwget`date +%Y%m%d%H%M%S` -r -l 0 -np -U Mozilla --tries=50 \
    --waitretry=10 "$@"

The download command is

wgetdir http
Paul Bickerstaff [EMAIL PROTECTED] wrote in
news:[EMAIL PROTECTED]:
I'm using wget version GNU Wget 1.10.2 (Red Hat modified) on a Fedora
Core 5 x86_64 system (standard wget rpm). I'm also using version 1.10.2b
on a WinXP laptop. Both display the same faulty behaviour, which I don't
believe
Well, this really isn't a bug per se... but whenever you set -q for no output, it still makes a wget log file on the desktop.
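If the stray log is the standard wget-log fallback, explicitly redirecting the log may work around it (a sketch; NUL is the Windows null device, and whether this build honours -o that way is an assumption):

    wget -q -o NUL http://example.com/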
Hi,
I am using Wget 1.10.2 under Windows XP.
If I run (sensible data is replaced by ##):
wget --dont-remove-listing -b -o %temp%\0.log -P %temp%\result\
ftp://##.##.##.##/result/*0.*
Everything works fine. If I execute the same command again I get the following
error:
##/result/.listing
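A guess at the cause: the .listing file kept by --dont-remove-listing from the first run collides with the second run. Deleting it before re-running may help (cmd syntax; the path mirrors the -P prefix above):

    del "%temp%\result\.listing"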
Hello,
At night I wanted to download the new Fedora Core 6 DVD, and wget downloaded it
all, closed the connection, and then tried to retry the download. See below:
~ $ wget
ftp://ftp.uninett.no/pub/linux/Fedora/core/6/i386//iso/FC-6-i386-DVD.iso
--00:06:42--
From: Sebastian
Doctor, it hurts when I do this.
Don't do that.
Steven M. Schweda [EMAIL PROTECTED]
382 South Warwick Street(+1) 651-699-9818
Saint Paul MN 55105-2547
From dev:
I checked and the .wgetrc file has continue=on. Is there any way to
suppress the sending of the byte-range request? I will read through the
email and see if I can gather some more information that may be needed.
Remove continue=on from .wgetrc?
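That is, assuming a standard ~/.wgetrc, the line to delete is:

    continue = on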
Consider:
-N, --timestamping
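For reference, -N on its own re-fetches a file only when the remote copy is newer than the local one, e.g. (the URL is a placeholder):

    wget -N http://example.com/file.iso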
Hello,
this is what happened when I tried to cache the www.nytimes.com web page
with the command wget -nd -p --delete-after http://www.nytimes.com
wget 1.10.2
With regards
Jan Pankiewicz
write(2, "Connecting to 192.168.0.4:3128."..., 34Connecting to 192.168.0.4:3128... ) = 34
socket(PF_INET, SOCK_STREAM, IPPROTO_IP) = 3
Hello
I'm running --reject expecting that the files are skipped from downloading, but
instead they're downloaded and then deleted. The whole point of using the option
was to avoid downloading a database which runs to over 12,000 files (before I
terminated wget!).
Is this correct behaviour? If it
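For what it's worth, wget's documented recursive behaviour is that files it takes to be HTML are still downloaded even when they match -R (it needs them to find further links) and are deleted afterwards. A sketch of such an invocation (site and pattern are placeholders):

    wget -r -R '*.asp' http://example.com/database/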
when wget
already finds a local file with the same name and sends a range
request. Maybe there is some data structure that keeps getting added to
so it exhausts the memory on my test box which has 2GB. There were no
other programs running on the test box.
This may be a bug. To get around
be a bug, but finding it could be
difficult.
Steven M. Schweda [EMAIL PROTECTED]
382 South Warwick Street(+1) 651-699-9818
Saint Paul MN 55105-2547
Hello :)
I'm not sure if this is a bug or a feature...
I can't download files bigger than 2 GB using wget.
Timofey.
P.S. My log:
wget -c
http://uk1x1.fileplanet.com/%5E1530224706/ftp1/052006
What operating system are you using? It may be a feature of your operating
system.
From: Tima Dronenko
I'm not sure if this is a bug or a feature...
wget -V
If your wget version is before 1.10, it's a feature. At or after
1.10, it's a bug. (In some cases, the bug is in the server.)
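For example (the output line is illustrative):

    $ wget -V | head -n 1
    GNU Wget 1.10.2

Large-file support (downloads over 2 GB) arrived in 1.10.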
Steven M
I'm using Wget 1.10.2 (on Windows) with the following parameters:
wget -r -l 1 -nd -p -T 10 --delete-after http://www.google.com/
wget crashes each time when trying to delete the files with the --delete-after parameter.
But, with the same parameter for other websites like,
Reece wrote:
Found a bug (sort of).
When trying to get all the images in the directory below:
http://www.netstate.com/states/maps/images/
It gives 403 Forbidden errors for most of the images even after
setting the agent string to firefox's, and setting -e robots=off
After a packet
I meanwhile found, however, another new problem with time-stamping, which mainly
occurs in connection with a proxy cache; I will report that in a new thread.
The same goes for a small problem with the SSL configuration.
thank you very much for the useful bug reports you keep sending us ;-)
--
Aequam
Found a bug (sort of).
When trying to get all the images in the directory below:
http://www.netstate.com/states/maps/images/
It gives 403 Forbidden errors for most of the images even after
setting the agent string to firefox's, and setting -e robots=off
After a packet capture, it appears
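If the capture shows the server rejecting requests without a Referer header (a guess; the finding above is cut off), something along these lines may get past the 403:

    wget -r -np -e robots=off -U 'Mozilla/5.0' \
         --referer='http://www.netstate.com/states/maps/images/' \
         http://www.netstate.com/states/maps/images/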
Jochen Roderburg wrote:
Quoting Jochen Roderburg [EMAIL PROTECTED]:
Quoting Hrvoje Niksic [EMAIL PROTECTED]:
Mauro, you will need to look at this one. Part of the problem is that
Wget decides to save to index.html.1 although -c is in use. That is
solved with the patch attached
@@ -1,3 +1,9 @@
+2006-08-16 Mauro Tortonesi [EMAIL PROTECTED]
+
+ * http.c: Fixed bug which broke --continue feature. Now if -c is
+ given, http_loop sends a HEAD request to find out the destination
+ filename before resuming download.
+
2006-08-08 Hrvoje Niksic
Mauro Tortonesi [EMAIL PROTECTED] writes:
you're right, of course. the patch included in the attachment should fix
the problem. since the new HTTP code supports Content-Disposition
and delays the decision on the destination filename until it
receives the response header, the best solution i could
Sent: Monday, 10 July 2006 07:04
To: Tony Lewis
Cc: 'Junior + Suporte'; [EMAIL PROTECTED]
Subject: Re: BUG
Tony Lewis wrote:
Run the command with -d and post the output here.
In this case, -S can provide more useful information than -d. Be careful to
obfuscate passwords, though!
hi
See attached files for details. The script calling wget is WGET. There was no
.wgetrc file.
You probably know the bug described at:
http://www.mail-archive.com/wget@sunsite.dk/msg07686.html
Remove the two ./CLEAN commands in the script to test recursive re-download
with it.
I cannot
Hrvoje Niksic wrote:
Noèl Köthe [EMAIL PROTECTED] writes:
a wget -c problem report with the 1.11 alpha 1 version
(http://bugs.debian.org/378691):
I can reproduce the problem. If I have already downloaded 1 MB, wget -c
doesn't continue; instead it starts the download again:
Mauro, you will
Hello,
a wget -c problem report with the 1.11 alpha 1 version
(http://bugs.debian.org/378691):
I can reproduce the problem. If I have already downloaded 1 MB, wget -c
doesn't continue; instead it starts the download again:
Forwarded message
[EMAIL PROTECTED]:~$ strace
Quoting Hrvoje Niksic [EMAIL PROTECTED]:
Mauro, you will need to look at this one. Part of the problem is that
Wget decides to save to index.html.1 although -c is in use. That is
solved with the patch attached below. But the other part is that
hstat.local_file is a NULL pointer when
Subject: bug/feature request
To: [EMAIL PROTECTED]
Hi,
I'm not sure if this is a feature request or a bug.
Wget does not collect all page requisites of a given URL.
Many sites reference components of their pages in cascading style
sheets, but wget does not collect these components
Hi,
I'm not sure if this is a feature request or a bug.
Wget does not collect all page requisites of a given URL.
Many sites reference components of their pages in cascading style sheets,
but wget does not collect these components as page requisites.
An example:
---
$ wget -q -p -k -nc -x
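For instance (file names hypothetical): if a page pulls in a stylesheet that itself references an image, wget -p of that era fetches the stylesheet but not the image:

    $ cat style.css
    body { background-image: url(images/bg.png); }  /* bg.png is never fetched */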
Daniel Richard G. wrote:
Hello,
The MAKEDEFS value in the top-level Makefile.in also needs to include
DESTDIR='$(DESTDIR)'.
fixed, thanks.
--
Aequam memento rebus in arduis servare mentem...
Mauro Tortonesi http://www.tortonesi.com
University of Ferrara -
Linda Walsh wrote:
FYI:
On the manpage, where it talks about no-proxy, the manpage
says:
--no-proxy
Don't use proxies, even if the appropriate *_proxy environment
variable is defined.
For more information about the use of proxies with Wget,
one bug on Mac OS X
Dear Sir/Madam,
while I was trying to download using the command:
wget -k -np -r -l inf -E http://dasher.wustl.edu/bio5476/
I got most of the files, but lost some of them.
I think I know where the problem is:
if the link is broken into two lines in the index.html:
<P>Lecture
of an A tag.
Unrelated to this particular bug, please note that rfc1866 is not the
place to look for an up-to-date HTML specification. HTML has been
maintained by W3C for many years, so it's best to look there,
e.g. HTML 4.01 spec, or possibly XHTML.
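For illustration, the problematic markup looks something like this (the file name is hypothetical; note the line break inside the HREF value):

    <P>Lecture 1 (Jan 17): <A HREF="http://dasher.wustl.edu/bio5476/
    notes-1.pdf">Exploring ...</A>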
Dear Sir/Madam, while I was trying to download using the command: wget -k -np -r -l inf -E http://dasher.wustl.edu/bio5476/ I got most of the files, but lost some of them. I think I know where the problem is: if the link is broken into two lines in the index.html: <P>Lecture 1 (Jan 17): Exploring
[mailto:[EMAIL PROTECTED]]
Sent: Tuesday, July 11, 2006 7:48 AM
To: [EMAIL PROTECTED]
Subject: I got one bug on Mac OS X
Dear Sir/Madam,
while I was trying to download using the command:
wget -k -np -r -l inf -E http://dasher.wustl.edu/bio5476/
I got most of the files, but lost some of them
of line within the HREF attribute
of an A tag.
Tony
_
From: HUAZHANG GUO [mailto:[EMAIL PROTECTED]
Sent: Tuesday, July 11, 2006 7:48 AM
To: [EMAIL PROTECTED]
Subject: I got one bug on Mac OS X
Dear Sir/Madam,
while I was trying to download using the command:
wget -k -np
Hello,
The MAKEDEFS value in the top-level Makefile.in also needs to include
DESTDIR='$(DESTDIR)'.
(build log excerpt)
+ make install DESTDIR=/tmp/wget--1.10.2.build/__dest__
cd src && make CC='cc' CPPFLAGS='-D__EXTENSIONS__ -D_REENTRANT -Dsparc' ...
install.bin
Dear,
I am using wget to send a login request to a site; when wget is saving the
cookies, the following error message appears:
Error in Set-Cookie, field `Path':
Syntax error in Set-Cookie: tu=661541|802400391@TERRA.COM.BR;
Expires=Thu, 14-Oct-2055 20:52:46 GMT; Path= at position 78.
Location:
Run the command with -d and post the output here.
Tony
_
From: Junior + Suporte [mailto:[EMAIL PROTECTED]]
Sent: Monday, July 03, 2006 2:00 PM
To: [EMAIL PROTECTED]
Subject: BUG
Dear,
I am using wget to send a login request
, con, prn). Maybe it is possible to query
the OS about the currently active device names and rename the output files
if necessary?
I reproduced the bug with Win32 versions 1.5.(don't remember),
1.10.1 and 1.10.2. I did also test version 1.6 on Linux, but it
was not affected.
That is since
Hello there,
I have to say that Wget is one of the most useful tools
out there(from my point of view of course). I'm using the
Win32 version of it to make life with XP a little more bearable.
(faking Internet Explorer like mad, all over the place)
Well, on to the thing I call a bug. The bug only
Hello,
We are using version 1.10.2 of wget under Ubuntu and Debian, and we have many
scripts that get some images from a cacti site. These scripts ran perfectly
with version 1.9 of wget, but they cannot get images with version 1.10.2 of
wget.
Here you can find an example of our
[EMAIL PROTECTED] writes:
I discovered a buffer overflow bug in the base64_encode() function,
located at line 1905 in file src\utils.c. Note that this bug is in the
latest version of the program (version 1.10.2). The bug appears to be that
the function is assuming that the input data
Hi,
I have tried out the wget alpha under Linux and found that the timestamping
option (which I usually have defined) does not work correctly.
The first thing I saw was that on *every* download I got the line
Remote file is newer, retrieving.
in the output, even when there was no local file.
That
Hello, I'm using wget 1.10.2 on Windows, the Windows binary version, and it has
a bug when downloading with -c and an input file. If the first file of the list
is the one to be continued, wget does it fine; if not, wget tries to download
the files from the beginning, and it says
I'm using wget version 1.10.2.
If I try to download a nonexistent file with a command like this
wget http://www.somehost.com/nonexistant.html
and the file does not exist, wget reports a 404 error and no file is
created.
However, if I specify the file where to place the output,
From: Eduardo M KALINOWSKI
wget http://www.somehost.com/nonexistant.html -O localfile.html
then file localfile.html will always be created, and will have length
of zero even if the remote file does not exist.
Because with -O, Wget opens the output file before it does any
network
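A quick way to see the effect (using the reporter's placeholder URL):

    $ wget http://www.somehost.com/nonexistant.html -O localfile.html
    $ wc -c localfile.html
    0 localfile.html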
Noèl Köthe wrote:
Hello,
a forwarded report from http://bugs.debian.org/366434
could this behaviour be added to the doc/manpage?
i wonder if it makes sense to add generic support for multiple headers
in wget, for instance by extending the --header option like this:
wget --header=Pragma:
headers. According to HTTP, a
duplicate header field is equivalent to a single header field with
multiple values joined using the ',' separator -- which the bug report
mentions.
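That is, per the rule the report cites (header values illustrative):

    Pragma: no-cache
    Pragma: xxx

is treated the same as:

    Pragma: no-cache, xxx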
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
i wonder if it makes sense to add generic support for
multiple headers
in wget, for instance by extending the --header option like this:
wget --header="Pragma: xxx" --header="dontoverride,Pragma: xxx2" someurl
That could be a problem if you
Hi,
I ran "wget -P /tmp/.test http://192.168.1.10" in SUSE system (SLES 9) and found that it saved the file in /tmp/_test.
This command works fine inRedHat, is it a bug?
wget version: wget-1.9.1-45.12
Thanks,
VanessaGet your ringtones, operator logos and picture messages from MSN Mobile.
yy :) [EMAIL PROTECTED] writes:
I ran "wget -P /tmp/.test http://192.168.1.10" on a SUSE system (SLES 9)
and found that it saved the file in /tmp/_test.
This command works fine in Red Hat; is it a bug?
I believe the bug is introduced by SuSE in an attempt to protect the
user. Try reporting
Hello,
a forwarded report from http://bugs.debian.org/366434
could this behaviour be added to the doc/manpage?
thx.
Package: wget
Version: 1.10.2-1
It's meaningful to have multiple 'Pragma:' headers within an http
request, but wget will silently issue only a single one of them if
they
Hello,
great program but I am having a problem with it.
The debug says:
The sizes do not match (local 16668160) -- retrieving.
ftp://ftp.invetech.com.au/Project%20Simon/bMX_Project_S_Invetech_Capability_%20Summary_a2.ppt
Windows:
15.8 MB (16,668,160 bytes)
cmd:
16,668,160
Jesse Cantara [EMAIL PROTECTED] writes:
A quick resolution to the problem is to use the -nH command line
argument, so that wget doesn't attempt to create that particular
directory. It appears as if the problem is with the creation of a
directory with a ':' in the name, which I cannot do
I've encountered a bug when trying to do a recursive get from an ftp site with a non-standard port defined in the URL, such as ftp.somesite.com:1234. An example of the command I am typing is:
wget -r ftp://user:[EMAIL PROTECTED]:4321/Directory/*
Where Directory contains multiple subdirectories, all
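A sketch of the -nH workaround suggested above (credentials are placeholders), so that wget never has to create a host:port directory:

    wget -nH -r "ftp://user:pass@ftp.somesite.com:4321/Directory/*"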
- Original Message -
From: Hrvoje Niksic [EMAIL PROTECTED]
Date: Tuesday, March 28, 2006 7:23 pm
in progress.c line 880:
eta_hrs = (int)(eta / 3600, eta %= 3600);
eta_min = (int)(eta / 60, eta %= 60);
eta_sec = (int)(eta);
This is weird. Did you compile the code
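For context: the comma operator yields its right-hand operand, so eta_hrs receives the value of eta %= 3600 (the division is computed and discarded), eta_min likewise receives eta %= 60, and eta_sec the leftover -- which produces exactly the NN:NN:NN displays reported below. The straightforward form the thread converges on:

    eta_hrs = (int) (eta / 3600); eta %= 3600;
    eta_min = (int) (eta / 60);   eta %= 60;
    eta_sec = (int) eta;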
Thomas Braby [EMAIL PROTECTED] writes:
eta_hrs = (int) (eta / 3600), eta %= 3600;
Yes that also works. The cast is needed on Windows x64 because eta is
a wgint (which is 64-bit) but a regular int is 32-bit so otherwise a
warning is issued.
The same is the case on 32-bit Windows, and also
Hi,
I don't really know if this is a Wget bug, or some problem with my
website, but, either way, maybe you can help.
I have a web site ( www.BuildItSolar.com ) with perhaps a few hundred
pages (260MB of storage total). Someone did a Wget on my site, and
managed to log 111,000 hits
Gary Reysa wrote:
Hi,
I don't really know if this is a Wget bug, or some problem with my
website, but, either way, maybe you can help.
I have a web site ( www.BuildItSolar.com ) with perhaps a few hundred
pages (260MB of storage total). Someone did a Wget on my site, and
managed to log
Tested on: GNU Wget 1.9.1 (Win32)
Tested on: GNU Wget 1.10.2 (Win32)
Example: wget "http://Check.Your.CPU.Usage/con"
Or: wget "http://Check.Your.CPU.Usage/con.txt"
You can also use aux, prn, con, lpt1, lpt2, com1, com2, ...
Regards,
fRoGGz ([EMAIL PROTECTED])
SecuBox Labs -
On 28/03/2006, at 20:43, Tony Lewis wrote:
Hrvoje Niksic wrote:
The cast to int looks like someone was trying to remove a warning and
botched operator precedence in the process.
I can't see any good reason to use ',' here. Why not write the line
as:
eta_hrs = eta / 3600; eta %=
Thomas got his version of progress.c
because it seems that the change has introduced the bug.
Thomas Braby [EMAIL PROTECTED] writes:
With wget 1.10.2 compiled using Visual Studio 2005 for Windows XP x64
I was getting no ETA until late in the transfer, when I'd get things
like:
49:49:49 then 48:48:48 then 47:47:47 etc.
So I checked the eta value in seconds and it was correct, so
Hrvoje Niksic wrote:
The cast to int looks like someone was trying to remove a warning and
botched operator precedence in the process.
I can't see any good reason to use ',' here. Why not write the line as:
eta_hrs = eta / 3600; eta %= 3600;
This makes it much less likely that someone
Hello,
Sometimes passwords contain @s. When they do,
it seems to cause wget problems if the URL has the password encoded in it (for
example, ftp://username:[EMAIL PROTECTED]@/directory).
The same sort of URL encoding works fine in wput.
Thank you for the fine software,
Larry
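A standard workaround is to percent-encode the '@' in the password as %40 (credentials here are made up):

    wget 'ftp://user:p%40ssword@ftp.example.com/directory/'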
Running this command:
rm *.jpg ; wget -O usscole_90.jpg -nc --random-wait
--referer=http://www.pianoladynancy.com/recovery_usscole.htm --
http://www.pianoladynancy.com/images/usscole_90.jpg
generates the error:
File `usscole_90.jpg' already there; not retrieving.
However:
rm
It seems to me that the -O option has wget touching the file
which wget then detects.
Close enough. With -O, Wget opens the output file before it does
any transfers, so when the program gets serious about the transfer, the
file will exist, and that will confuse the -nc processing.
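A minimal reproduction of that interaction, using the reporter's URL:

    $ rm -f usscole_90.jpg
    $ wget -O usscole_90.jpg -nc http://www.pianoladynancy.com/images/usscole_90.jpg
    File `usscole_90.jpg' already there; not retrieving.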
I think I found a bug when STANDALONE is defined on hash.c. I hope I'm not missing something here... (Please cc me the replies.)
@@ -63,7 +63,7 @@ if not enough memory */
 # define xfree free
 # define countof(x) (sizeof (x) / sizeof ((x)[0]))
-# define TOLOWER(x) ('A' <= (x) && (x) <= 'Z' ? (x) - 32 : (x
Beni Serfaty [EMAIL PROTECTED] writes:
I Think I found a bug when STANDALONE is defined on hash.c
I hope I'm not missing something here...
Good catch, thanks. I've applied a slightly different fix, appended
below.
By the way, are you using hash.c in a project? I'd like to hear if
you're
Hi folks,
I think I have found a bug in wget where it fails to change the working
directory when retrying a failed ftp transaction. This is wget 1.10.2 on
FreeBSD-6.0/amd64.
I was trying to use wget to get files from a broken ftp server which
occasionally sends garbled responses, causing
for
the default output filename processcandquicksearch rather than the filename
that I specified with the -O option.
This seems to be a bug, though I can work around it with...
wget -k blah blah blah
mv default filename my filename
(Note: I am using wget version 1.9.1)
Best regards,
Greg McCann
Hi,
I've just posted my comments on the mailing list [1]. wget doesn't behave
the right way if I use the --output-document option and
--timestamping together. wget tries to compare the URL's file with the
original file instead of with the --output-document file.
Why I got to this problem was
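Schematically (file names hypothetical): with

    wget -N -O local.html http://example.com/page.html

the timestamp comparison is made against page.html on disk rather than local.html, so -N misfires.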
Hello all,
I discovered a buffer overflow bug in the base64_encode() function,
located at line 1905 in file src\utils.c. Note that this bug is in the
latest version of the program (version 1.10.2). The bug appears to be that
the function is assuming that the input data is a size that is an even
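The usual defence (a sketch of the size arithmetic only, not wget's actual fix) is to size the output buffer for any input length, padding included:

    /* bytes needed to base64-encode n input bytes, including trailing NUL */
    size_t b64_buffer_size(size_t n) { return 4 * ((n + 2) / 3) + 1; }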
[EMAIL PROTECTED] (Steven M. Schweda) writes:
and adding it fixed many problems with FTP servers that log you in
a non-/ working directory.
Which of those problems would _not_ be fixed by my two-step CWD for
a relative path? That is: [...]
That should work too. On Unix-like FTP servers,
From: Hrvoje Niksic
[...] On Unix-like FTP servers, the two methods would
be equivalent.
Right. So I resisted temptation, and kept the two-step CWD method in
my code for only a VMS FTP server. My hope was that someone would look
at the method, say "That's a good idea", and change the if
Hello,
current wget seems to have the following bug in the ftp retrieval code:
When called like:
wget user:[EMAIL PROTECTED]/foo/bar/file.tgz
and foo or bar is a read/execute protected directory while file.tgz is
user-readable, wget fails to retrieve the file because it tries to CWD
Arne Caspari [EMAIL PROTECTED] writes:
When called like:
wget user:[EMAIL PROTECTED]/foo/bar/file.tgz
and foo or bar is a read/execute protected directory while file.tgz is
user-readable, wget fails to retrieve the file because it tries to CWD
into the directory first.
I think the correct
Hrvoje Niksic wrote:
Arne Caspari [EMAIL PROTECTED] writes:
I believe that CWD is mandated by the FTP specification, but you're
also right that Wget should try both variants.
i agree. perhaps when retrieving file A/B/F.X we should try to use:
GET A/B/F.X
first, then:
CWD A/B
GET F.X
if
Thank you all for your very fast response. As a further note: when this
error occurs, wget bails out with the following error message:
No such directory `foo/bar'.
I think it should instead be "Could not access foo/bar: Permission
denied" or similar in such a situation.
/Arne
Mauro Tortonesi
Hrvoje Niksic [EMAIL PROTECTED] writes:
That might work. Also don't prepend the necessary prepending of $CWD
to those paths.
Oops, I meant don't forget to prepend
From: Hrvoje Niksic
Also don't [forget to] prepend the necessary [...] $CWD
to those paths.
Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to
those paths.
As you might recall from my changes for VMS FTP servers (if you had
ever looked at them), this scheme causes no end
On Fri, 25 Nov 2005, Steven M. Schweda wrote:
Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to those
paths.
I agree. What good would prepending do? It will most definitely add problems
such as those Steven describes.
--
-=- Daniel Stenberg -=-
From: Hrvoje Niksic
Prepending is already there,
Yes, it certainly is, which is why I had to disable it in my code for
VMS FTP servers.
and adding it fixed many problems with
FTP servers that log you in a non-/ working directory.
Which of those problems would _not_ be fixed by my
According to the wget release notes for 1.10
*** Talking to SSL/TLS servers over proxies now actually works.
Previous versions of Wget erroneously sent GET requests for https URLs.
Wget 1.10 utilizes the CONNECT method designed for this purpose.
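Schematically, the CONNECT exchange those notes describe (the host is a placeholder):

    CONNECT secure.example.com:443 HTTP/1.1
    Host: secure.example.com:443

    HTTP/1.1 200 Connection established

after which the TLS handshake and the real GET flow through the tunnel.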
However, I have tried versions 1.10, 1.10.1, and