-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1
Yes, that's what it means.
I'm not yet committed to doing this. I'd like to see first how many
mainstream servers will respect If-Modified-Since when given as part of
an HTTP/1.0 request (in comparison to how they respond when it's part of
an HTTP/1.1
vinothkumar raman wrote:
> We need to give out the time stamp of the local file in the Request
> header; for that we need to pass the local file's time stamp from
> http_loop() to get_http(). The only way to pass this without
> altering the signatur
This means we should remove the previous HEAD request code, use
If-Modified-Since by default, have it handle all the requests, and
store pages when the response is not a 304.
Is that so?
On Fri, Aug 29, 2008 at 11:06 PM, Micah Cowan <[EMAIL PROTECTED]> wrote:
>
> Follow-up Comment #4,
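For context, the conditional-GET scheme being discussed boils down to two pieces: formatting the local file's mtime as an HTTP date for If-Modified-Since, and storing the body only when the server does not answer 304. A minimal C sketch (helper names are mine, not wget's; the real logic lives in http.c):

```c
#include <stdio.h>
#include <time.h>

/* Format a local file's mtime as an If-Modified-Since header line
   (RFC 1123 date).  Illustrative helper only. */
static void format_if_modified_since(time_t mtime, char *buf, size_t len)
{
  struct tm *gmt = gmtime(&mtime);
  strftime(buf, len, "If-Modified-Since: %a, %d %b %Y %H:%M:%S GMT", gmt);
}

/* 304 Not Modified means the local copy is current; any other
   success status carries a body that should be stored. */
static int should_store(int status)
{
  return status != 304;
}
```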
Sir Vision wrote:
> Hello,
>
> entering the following command results in an error:
>
> --- command start ---
> c:\Downloads\wget_v1.11.3b>wget
> "ftp://ftp.mozilla.org/pub/mozilla.org/thunderbird/nightly/latest-mozilla1.8-l10n/"
> -P c:\Downloads\
> --
OK, thanks for your reply.
We have a workaround in place now, but it doesn't scale very well.
Anyway, I'll start looking for another solution.
Thanks!
Mark
On Sat, Mar 1, 2008 at 10:15 PM, Micah Cowan <[EMAIL PROTECTED]> wrote:
> Mark Pors
Mark Pors wrote:
> Hi,
>
> I posted this bug over two years ago:
> http://marc.info/?l=wget&m=113252747105716&w=4
> From the release notes I see that this is still not resolved. Are
> there any plans to fix this any time soon?
I'm not sure that's a b
Hrvoje Niksic wrote:
> Generally, if Wget considers a header to be in error (and hence
> ignores it), the user probably needs to know about that. After all,
> it could be the symptom of a Wget bug, or of an unimplemented
> extension the server gener
Micah Cowan <[EMAIL PROTECTED]> writes:
>> The new Wget flags empty Set-Cookie as a syntax error (but only
>> displays it in -d mode; possibly a bug).
>
> I'm not clear on exactly what's possibly a bug: do you mean the fact
> that Wget only calls attention to it in -d mode?
That's what I meant.
Hrvoje Niksic wrote:
> Micah Cowan <[EMAIL PROTECTED]> writes:
>
>> I was able to reproduce the problem above in the release version of
>> Wget; however, it appears to be working fine in the current
>> development version of Wget, which is expected
Micah Cowan <[EMAIL PROTECTED]> writes:
> I was able to reproduce the problem above in the release version of
> Wget; however, it appears to be working fine in the current
> development version of Wget, which is expected to release soon as
> version 1.11.*
I think the old Wget crashed on empty Se
Diego Campo wrote:
> Hi,
> I got a bug on wget when executing:
>
> wget -a log -x -O search/search-1.html --verbose --wait 3
> --limit-rate=20K --tries=3
> http://www.nepremicnine.net/nepremicninske_agencije.html?id_regije=1
>
> Segmentation fault
Hrvoje Niksic wrote:
> In the long run, supporting something like IRL is surely the right
> thing to go for, but I have a feeling that we'll be stuck with the
> current messy URLs for quite some time to come. So Wget simply needs
> to adapt to the
Micah Cowan <[EMAIL PROTECTED]> writes:
> It is actually illegal to specify byte values outside the range of
> ASCII characters in a URL, but it has long been historical practice
> to do so anyway. In most cases, the intended meaning was one of the
> latin character sets (usually latin1), so Wget
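The historical practice described above can be sketched in a few lines: raw latin1 bytes in a URL get percent-encoded so that only ASCII goes on the wire. This is my illustration (function name hypothetical); wget's real escaping in url.c also handles reserved ASCII characters:

```c
#include <stdio.h>
#include <string.h>

/* Percent-encode every byte outside printable ASCII.  Sketch only;
   a real URL escaper must also handle reserved characters like '?'. */
static void escape_non_ascii(const char *in, char *out, size_t outlen)
{
  size_t o = 0;
  const unsigned char *p;
  for (p = (const unsigned char *) in; *p && o + 4 < outlen; p++)
    {
      if (*p < 0x20 || *p > 0x7e)
        o += snprintf(out + o, outlen - o, "%%%02X", *p);
      else
        out[o++] = (char) *p;
    }
  out[o] = '\0';
}
```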
Brian Keck wrote:
> Hello,
>
> I'm wondering if I've found a bug in the excellent wget.
> I'm not asking for help, because it turned out not to be the reason
> one of my scripts was failing.
>
> The possible bug is in the derivation of the filename
Josh Williams wrote:
> On 10/4/07, Brian Keck <[EMAIL PROTECTED]> wrote:
>> I would have sent a fix too, but after finding my way through http.c &
>> retr.c I got lost in url.c.
>
> You and me both. A lot of the code needs to be rewritten; there's a lot
On 10/4/07, Brian Keck <[EMAIL PROTECTED]> wrote:
> I would have sent a fix too, but after finding my way through http.c &
> retr.c I got lost in url.c.
You and me both. A lot of the code needs to be rewritten; there's a
lot of spaghetti code in there. I hope Micah chooses to do a complete
re-write fo
Rich Cook wrote:
>
> On Jul 13, 2007, at 12:29 PM, Micah Cowan wrote:
>
>>
>>> sprintf(filecopy, "\"%.2047s\"", file);
>>
>> This fix breaks the FTP protocol, making wget instantly stop working
>> with many conforming servers, but apparently st
On 7/15/07, Rich Cook <[EMAIL PROTECTED]> wrote:
I think you may well be correct. I am now unable to reproduce the
problem where the server does not recognize a filename unless I give
it quotes. In fact, as you say, the server ONLY recognizes filenames
WITHOUT quotes and quoting breaks it. I h
On Jul 13, 2007, at 12:29 PM, Micah Cowan wrote:
sprintf(filecopy, "\"%.2047s\"", file);
This fix breaks the FTP protocol, making wget instantly stop working
with many conforming servers, but apparently start working with yours;
the RFCs are very clear that the file name argument starts
Rich Cook wrote:
> On OS X, if a filename on the FTP server contains spaces, and the remote
> copy of the file is newer than the local, then wget gets thrown into a
> loop of "No such file or directory" endlessly. I have changed the
> following in
Mauro Tortonesi wrote:
> Micah Cowan ha scritto:
>> Update of bug #20323 (project wget):
>>
>> Status: Ready For Test => In Progress
>> ___
>>
>> Follow-
Micah Cowan wrote:
Matthew Woehlke wrote:
Micah Cowan wrote:
...any reason to not CC bug updates here also/instead? That's how e.g.
kwrite does things (also several other lists AFAIK), and it seems to
make sense. This is 'bug-wget' after all :-).
It is; but it's also 'wget'.
Hmm, so it is; my ba
Matthew Woehlke wrote:
> Micah Cowan wrote:
>> The wget-notify mailing list
>> (http://addictivecode.org/mailman/listinfo/wget-notify) will now also be
>> receiving notifications of bug updates from GNU Savannah, in addition to
>> subversion commits
Micah Cowan wrote:
The wget-notify mailing list
(http://addictivecode.org/mailman/listinfo/wget-notify) will now also be
receiving notifications of bug updates from GNU Savannah, in addition to
subversion commits.
...any reason to not CC bug updates here also/instead? That's how e.g.
kwrite d
Steven M. Schweda wrote:
>> From :
>
>> [...]
>>char filecopy[2048];
>>if (file[0] != '"') {
>> sprintf(filecopy, "\"%.2047s\"", file);
>>} else {
>> strncpy(filecopy, file, 2047);
>>}
>> [...]
>> It should be:
>>
>> sp
From :
> [...]
>char filecopy[2048];
>if (file[0] != '"') {
> sprintf(filecopy, "\"%.2047s\"", file);
>} else {
> strncpy(filecopy, file, 2047);
>}
> [...]
> It should be:
>
> sprintf(filecopy, "\"%.2045s\"", file);
> [...]
I'll admit to being old and grumpy, b
Thanks for the follow up. :-)
On Jul 5, 2007, at 3:52 PM, Micah Cowan wrote:
Rich Cook wrote:
So forgive me for a newbie-never-even-lurked kind of question: will
this fix make it into wget for other users (and for me in the
future)?
Or do I
Bruso, John wrote:
> Please remove me from this list. thanks,
Nobody on this list has the ability to do this, unfortunately (Wget
maintainership is separate from the maintainers of this list). To
further confuse the issue, [EMAIL PROTECTED] is actua
Rich Cook wrote:
> So forgive me for a newbie-never-even-lurked kind of question: will
> this fix make it into wget for other users (and for me in the future)?
> Or do I need to do more to make that happen, or...? Thanks!
Well, I need a chance to
So forgive me for a newbie-never-even-lurked kind of question: will
this fix make it into wget for other users (and for me in the
future)? Or do I need to do more to make that happen, or...? Thanks!
On Jul 5, 2007, at 12:52 PM, Hrvoje Niksic wrote:
Rich Cook <[EMAIL PROTECTED]> writes:
Rich Cook <[EMAIL PROTECTED]> writes:
> On Jul 5, 2007, at 11:08 AM, Hrvoje Niksic wrote:
>
>> Rich Cook <[EMAIL PROTECTED]> writes:
>>
>>> Trouble is, it's undocumented as to how to free the resulting
>>> string. Do I call free on it?
>>
>> Yes. "Freshly allocated with malloc" in the function d
Please remove me from this list. thanks,
John Bruso
From: Rich Cook [mailto:[EMAIL PROTECTED]
Sent: Thu 7/5/2007 12:30 PM
To: Hrvoje Niksic
Cc: Tony Lewis; [EMAIL PROTECTED]
Subject: Re: bug and "patch": blank spaces in filenames causes looping
On Jul 5, 2007, at 11:08 AM, Hrvoje Niksic wrote:
Rich Cook <[EMAIL PROTECTED]> writes:
Trouble is, it's undocumented as to how to free the resulting
string. Do I call free on it?
Yes. "Freshly allocated with malloc" in the function documentation
was supposed to indicate how to free the s
Rich Cook <[EMAIL PROTECTED]> writes:
> Trouble is, it's undocumented as to how to free the resulting
> string. Do I call free on it?
Yes. "Freshly allocated with malloc" in the function documentation
was supposed to indicate how to free the string.
"Virden, Larry W." <[EMAIL PROTECTED]> writes:
> "Tony Lewis" <[EMAIL PROTECTED]> writes:
>
>> Wget has an `aprintf' utility function that allocates the result on
> the heap. Avoids both buffer overruns and
>> arbitrary limits on file name length.
>
> If it uses the heap, then doesn't that open
Trouble is, it's undocumented as to how to free the resulting
string. Do I call free on it? I'd use asprintf, but I'm afraid to
suggest that here as it may not be portable.
On Jul 5, 2007, at 10:45 AM, Hrvoje Niksic wrote:
"Tony Lewis" <[EMAIL PROTECTED]> writes:
There is a buffer overfl
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
"Tony Lewis" <[EMAIL PROTECTED]> writes:
> Wget has an `aprintf' utility function that allocates the result on
the heap. Avoids both buffer overruns and
> arbitrary limits on file name length.
If it uses the heap, th
"Tony Lewis" <[EMAIL PROTECTED]> writes:
> There is a buffer overflow in the following line of the proposed code:
>
> sprintf(filecopy, "\"%.2047s\"", file);
Wget has an `aprintf' utility function that allocates the result on
the heap. Avoids both buffer overruns and arbitrary limits on fil
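The idea behind a heap-allocating printf can be sketched portably with C99 vsnprintf. This is my approximation, not wget's actual aprintf implementation; the caller frees the result:

```c
#include <stdarg.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Allocate exactly enough, then format.  Two passes: the first
   vsnprintf call, with a NULL buffer, only measures. */
static char *aprintf_sketch(const char *fmt, ...)
{
  va_list ap;
  int n;
  char *buf;

  va_start(ap, fmt);
  n = vsnprintf(NULL, 0, fmt, ap);   /* measure */
  va_end(ap);
  if (n < 0)
    return NULL;
  buf = malloc((size_t) n + 1);
  if (!buf)
    return NULL;
  va_start(ap, fmt);
  vsnprintf(buf, (size_t) n + 1, fmt, ap);
  va_end(ap);
  return buf;
}
```

This sidesteps both the fixed 2048-byte buffer and the precision juggling, at the cost of a malloc the caller must free.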
Good point, although it's only a POTENTIAL buffer overflow, and it's
limited to 2 bytes, so at least it's not exploitable. :-)
On Jul 5, 2007, at 9:05 AM, Tony Lewis wrote:
There is a buffer overflow in the following line of the proposed code:
sprintf(filecopy, "\"%.2047s\"", file);
There is a buffer overflow in the following line of the proposed code:
sprintf(filecopy, "\"%.2047s\"", file);
It should be:
sprintf(filecopy, "\"%.2045s\"", file);
in order to leave room for the two quotes.
Tony
-Original Message-
From: Rich Cook [mailto:[EMAIL PROTECTED]
S
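The sizing arithmetic above can be checked mechanically: for `char filecopy[2048]`, the worst case that sprintf may write is the precision plus the two quotes plus the terminating NUL. A sketch of the count (nothing here is from wget itself):

```c
/* Worst-case bytes sprintf writes for "\"%.Ns\"": two quote
   characters, up to N bytes of the name, and the trailing NUL. */
static int worst_case_len(int precision)
{
  return 2 + precision + 1;
}
```

With precision 2047 that is 2050 bytes, two past the end of the 2048-byte buffer; with 2045 it is exactly 2048.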
Matthias Vill schrieb:
> Mario Ander schrieb:
>> Hi everybody,
>>
>> I think there is a bug storing cookies with wget.
>>
>> See this command line:
>>
>> "C:\Programme\wget\wget" --user-agent="Opera/8.5 (X11;
>> U; en)" --no-check-certificate --keep-session-cookies
>> --save-cookies="cookie.txt" --
Mario Ander schrieb:
> Hi everybody,
>
> I think there is a bug storing cookies with wget.
>
> See this command line:
>
> "C:\Programme\wget\wget" --user-agent="Opera/8.5 (X11;
> U; en)" --no-check-certificate --keep-session-cookies
> --save-cookies="cookie.txt" --output-document=-
> --debug --o
A quick search at "http://www.mail-archive.com/wget@sunsite.dk/" for
"-O" found:
http://www.mail-archive.com/wget@sunsite.dk/msg08746.html
http://www.mail-archive.com/wget@sunsite.dk/msg08748.html
The way "-O" is implemented, there are all kinds of things which are
incompatible
Juhana Sadeharju wrote:
Hello. Wget 1.10.2 has the following bug compared to version 1.9.1.
First, the bin/wgetdir is defined as
wget -p -E -k --proxy=off -e robots=off --passive-ftp
-o zlogwget`date +%Y%m%d%H%M%S` -r -l 0 -np -U Mozilla --tries=50
--waitretry=10 $@
The download command is
From: Sebastian
"Doctor, it hurts when I do this."
"Don't do that."
Steven M. Schweda [EMAIL PROTECTED]
382 South Warwick Street (+1) 651-699-9818
Saint Paul MN 55105-2547
Reece ha scritto:
Found a bug (sort of).
When trying to get all the images in the directory below:
http://www.netstate.com/states/maps/images/
It gives 403 Forbidden errors for most of the images even after
setting the agent string to firefox's, and setting -e robots=off
After a packet capture
Hi !
Maybe you can add this patch to your mainline-tree:
http://www.mail-archive.com/wget%40sunsite.dk/msg09142.html
Best regards
Marc Schoechlin
On Wed, Jul 26, 2006 at 07:26:45AM +0200, Marc Schoechlin wrote:
> Date: Wed, 26 Jul 2006 07:26:45 +0200
> From: Marc Schoechlin <[EMAIL PROTECTED]>
Daniel Richard G. ha scritto:
Hello,
The MAKEDEFS value in the top-level Makefile.in also needs to include
DESTDIR='$(DESTDIR)'.
fixed, thanks.
--
Aequam memento rebus in arduis servare mentem...
Mauro Tortonesi http://www.tortonesi.com
University of Ferrara - Dept
Tony Lewis ha scritto:
Run the command with -d and post the output here.
in this case, -S can provide more useful information than -d. be careful to
obfuscate passwords, though!!!
--
Aequam memento rebus in arduis servare mentem...
Mauro Tortonesi http://www.torto
Run the command with -d and post the output here.
Tony
From: Junior + Suporte [mailto:[EMAIL PROTECTED]]
Sent: Monday, July 03, 2006 2:00 PM
To: [EMAIL PROTECTED]
Subject: BUG
Dear,
I am using wget to send
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] Behalf Of Þröstur
> Sent: Wednesday, June 21, 2006 4:35 PM
There have been some reports in the past but I don't think it has been acted
upon; one of the problems is that the list of names can be extended at will
(besides the standard COMx, LPTx,
From: Eduardo M KALINOWSKI
> wget http://www.somehost.com/nonexistant.html -O localfile.html
>
> then file "localfile.html" will always be created, and will have length
> of zero even if the remote file does not exist.
Because with "-O", Wget opens the output file before it does any
network a
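The ordering can be simulated in a few lines (a hypothetical stand-in, not wget code, and the path is arbitrary): the output file is created before any "fetch" happens, so a failed fetch still leaves a zero-length file behind.

```c
#include <stdio.h>

/* Create the output file first, as -O does, then "fetch".  Returns
   the resulting file size, or -1 if the file could not be opened. */
static long simulate_O_fetch(const char *path, int fetch_succeeds)
{
  long size;
  FILE *out = fopen(path, "wb");   /* created (and truncated) up front */
  if (!out)
    return -1;
  if (fetch_succeeds)
    fputs("content", out);         /* the body would be written here */
  fclose(out);

  out = fopen(path, "rb");
  fseek(out, 0, SEEK_END);
  size = ftell(out);
  fclose(out);
  return size;
}
```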
"yy :)" <[EMAIL PROTECTED]> writes:
> I ran "wget -P /tmp/.test http://192.168.1.10" in a SUSE system (SLES 9)
> and found that it saved the file in /tmp/_test.
> This command works fine in RedHat; is it a bug?
I believe the bug is introduced by SuSE in an attempt to "protect" the
user. Try rep
Thomas Braby <[EMAIL PROTECTED]> writes:
>> eta_hrs = (int) (eta / 3600), eta %= 3600;
>
> Yes that also works. The cast is needed on Windows x64 because eta is
> a wgint (which is 64-bit) but a regular int is 32-bit so otherwise a
> warning is issued.
The same is the case on 32-bit Windows, an
- Original Message -
From: Hrvoje Niksic <[EMAIL PROTECTED]>
Date: Tuesday, March 28, 2006 7:23 pm
> > in progress.c line 880:
> >
> >eta_hrs = (int)(eta / 3600, eta %= 3600);
> >eta_min = (int)(eta / 60, eta %= 60);
> >eta_sec = (int)(eta);
>
> This is weird. Did you compi
Gary Reysa wrote:
Hi,
I don't really know if this is a Wget bug, or some problem with my
website, but, either way, maybe you can help.
I have a web site ( www.BuildItSolar.com ) with perhaps a few hundred
pages (260MB of storage total). Someone did a Wget on my site, and
managed to log 111
El 29/03/2006, a las 14:39, Hrvoje Niksic escribió:
I can't see any good reason to use "," here. Why not write the line
as:
eta_hrs = eta / 3600; eta %= 3600;
Because that's not equivalent.
Well, it should be, because the comma operator has lower precedence
than the assignment operato
Greg Hurrell <[EMAIL PROTECTED]> writes:
> El 28/03/2006, a las 20:43, Tony Lewis escribió:
>
>> Hrvoje Niksic wrote:
>>
>>> The cast to int looks like someone was trying to remove a warning and
>>> botched operator precedence in the process.
>>
>> I can't see any good reason to use "," here. Why
El 28/03/2006, a las 20:43, Tony Lewis escribió:
Hrvoje Niksic wrote:
The cast to int looks like someone was trying to remove a warning and
botched operator precedence in the process.
I can't see any good reason to use "," here. Why not write the line
as:
eta_hrs = eta / 3600; eta %
Hrvoje Niksic wrote:
> The cast to int looks like someone was trying to remove a warning and
> botched operator precedence in the process.
I can't see any good reason to use "," here. Why not write the line as:
eta_hrs = eta / 3600; eta %= 3600;
This makes it much less likely that someone
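A minimal reproduction of the precedence trap discussed in this thread (input value chosen arbitrarily): the parenthesized comma expression yields its right operand, so the division result is silently discarded.

```c
/* Buggy shape from progress.c: the comma expression evaluates the
   division, throws the result away, and yields (eta %= 3600). */
static long buggy_hours(long eta)
{
  long eta_hrs = (eta / 3600, eta %= 3600);
  return eta_hrs;
}

/* Intended behaviour: divide first, then reduce the remainder. */
static long fixed_hours(long eta)
{
  long eta_hrs = eta / 3600;
  eta %= 3600;
  return eta_hrs;
}
```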
Thomas Braby <[EMAIL PROTECTED]> writes:
> With wget 1.10.2 compiled using Visual Studio 2005 for Windows XP x64
> I was getting no ETA until late in the transfer, when I'd get things
> like:
>
> 49:49:49 then 48:48:48 then 47:47:47 etc.
>
> So I checked the eta value in seconds and it was corre
"Beni Serfaty" <[EMAIL PROTECTED]> writes:
> I Think I found a bug when STANDALONE is defined on hash.c
> I hope I'm not missing something here...
Good catch, thanks. I've applied a slightly different fix, appended
below.
By the way, are you using hash.c in a project? I'd like to hear if
you'r
Steven M. Schweda antinode.org> writes:
> > [...] wget version 1.9.1
>
>You might try it with the current version (1.10.2).
>
> http://www.gnu.org/software/wget/wget.html
>
Oh, man - I can't believe I missed that. All better now! Thank you.
Greg
> [...] wget version 1.9.1
You might try it with the current version (1.10.2).
http://www.gnu.org/software/wget/wget.html
Steven M. Schweda (+1) 651-699-9818
382 South Warwick Street [EM
wget -x -O images/logo.gif
http://www.google.co.uk/intl/en_uk/images/logo.gif
It worked for me.
Try it after "rm -rf images".
That was why it worked... I had an images directory already created.
Should have deleted it before I tried.
Frank
>From Frank McCown:
> wget -x -O images/logo.gif
> http://www.google.co.uk/intl/en_uk/images/logo.gif
>
> It worked for me.
Try it after "rm -rf images".
alp $ wget -x http://alp/test.html -O testxxx/test.html
testxxx/test.html: no such file or directory
alp $ wget -x -O testxxx/test.html h
I wouldn't call it a bug. While it may not be well documented (which
would not be unusual), "-x" affects URL-derived directories, not
user-specified directories.
Presumably Wget could be modified to handle this, but my initial
reaction is that it's not unreasonable to demand that the fellow
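The underlying constraint is simply that fopen() never creates intermediate directories, so a user-specified -O path fails until its directory exists. A sketch (paths here are made up):

```c
#include <stdio.h>

/* Try to create a file at PATH the way -O does.  Fails when the
   containing directory does not exist, because fopen() will not
   create it. */
static int can_create(const char *path)
{
  FILE *f = fopen(path, "wb");
  if (!f)
    return 0;
  fclose(f);
  return 1;
}
```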
Chris,
I think the problem is you don't have the URL last. Try this:
wget -x -O images/logo.gif
http://www.google.co.uk/intl/en_uk/images/logo.gif
It worked for me.
Frank
Chris Hills wrote:
Hi
Using wget-1.10.2.
Example command:-
$ wget -x http://www.google.co.uk/intl/en_uk/images/log
"Jean-Marc MOLINA" <[EMAIL PROTECTED]> writes:
> Hrvoje Niksic wrote:
>> More precisely, it doesn't use the file name advertised by the
>> Content-Disposition header. That is because Wget decides on the file
>> name it will use based on the URL used, *before* the headers are
>> downloaded. This
Tony Lewis wrote:
> The --convert-links option changes the website path to a local file
> system path. That is, it changes the directory, not the file name.
Thanks, I didn't understand it that way.
> IMO, your suggestion has merit, but it would require wget to maintain
> a list of MIME types and c
Hrvoje Niksic wrote:
> More precisely, it doesn't use the file name advertised by the
> Content-Disposition header. That is because Wget decides on the file
> name it will use based on the URL used, *before* the headers are
> downloaded. This unfortunate design decision is the cause of all
> thes
Jean-Marc MOLINA wrote:
> For example if a PNG image is generated using a "gen_png_image.php" PHP
> script, I think wget should be able to download it if the option
> "--page-requisites" is used, because it's part of the page and it's not
> an external resource, get its MIME type, "image/png", and
"Jean-Marc MOLINA" <[EMAIL PROTECTED]> writes:
> As I don't know anything about the wget sources, I can't tell how it
> works internally, but I guess it doesn't check the MIME types of
> resources linked from the "src" attribute of "img" elements. And that would
> be a bug... And I think some kind of RFC o
Gavin Sherlock wrote:
> i.e. the image is generated on the fly from a script, which then
> essentially prints the image back to the browser with the correct
> mime type. While this is a non-standard way to include an image on a
> page, the --page-requisites are not fulfilled when retrieving this
>
Tobias Koeck wrote:
done.
==> PORT ... done.==> RETR SUSE-10.0-EvalDVD-i386-GM.iso ... done.
[ <=> ] -673,009,664 113,23K/s
Assertion failed: bytes >= 0, file retr.c, line 292
This application has requested the Runtime to terminate it in an unusual
way.
Hello Hrvoje!
On Tuesday, September 20, 2005 at 12:50:41 AM +0200, Hrvoje Niksic wrote:
> "HonzaCh" <[EMAIL PROTECTED]> writes:
>> the thousand separator (space according to my local settings)
>> displays as "á" (character code 0xA0, see attch.)
> Wget obtains the thousand separator from the ope
"HonzaCh" <[EMAIL PROTECTED]> writes:
>>> My localeconv()->thousands_sep (as well as many other struct
>>> members) reveals to empty string ("") (MSVC6.0).
>>
>> How do you know? I mean, what program did you use to check this?
>
> My quick'n'dirty one. See the source below.
Your source neglects
"HonzaCh" <[EMAIL PROTECTED]> writes:
> Latest version (1.10.1) turns out an UI bug: the thousand separator
> (space according to my local settings) displays as "á" (character
> code 0xA0, see attch.)
>
> Although it does not affect the primary function of WGET, it looks
> quite ugly.
>
> Env.: Wi
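The separator in question comes straight from the C locale, which can be sketched as follows (helper name mine; in the default "C" locale it is empty, while the reporter's locale hands back a byte like 0xA0 that the Windows OEM console code page renders as an accented character):

```c
#include <locale.h>
#include <string.h>

/* The thousands separator the C runtime reports for the current
   locale, as wget would see it when grouping digits. */
static const char *thousands_sep_now(void)
{
  return localeconv()->thousands_sep;
}
```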
Jogchum Reitsma <[EMAIL PROTECTED]> writes:
> I'm not sure it's a bug, but the behaviour described below seems strange
> to me, so I thought it was wise to report it:
Upgrade to Wget 1.10 and the problem should go away. Earlier versions
don't handle files larger than 2GB properly.
Rodrigo Botafogo <[EMAIL PROTECTED]> writes:
> [EMAIL PROTECTED]:~/Download/Linux> wget -c
> ftp://chuck.ucs.indiana.edu/linux/suse/suse/i386/9.3/iso/SUSE-9.3-Eval-DVD.iso
> --09:55:03--
> ftp://chuck.ucs.indiana.edu/linux/suse/suse/i386/9.3/iso/SUSE-9.3-Eval-DVD.iso
> => `SUSE-9.3-Eval-DVD.iso
Marc Niederwieser <[EMAIL PROTECTED]> writes:
> option --mirror is described as
> shortcut option equivalent to -r -N -l inf -nr.
> but option "-nr" is not implemented.
> I think you mean "--no-remove-listing".
Thanks for the report, I've now fixed the --help text.
2005-07-01 Hrvoje Niksic <
"Mark Street" <[EMAIL PROTECTED]> writes:
> Many thanks for the explanation and the patch. Yes, this patch
> successfully resolves the problem for my particular test case.
Thanks for testing it. It has been applied to the code and will be in
Wget 1.10.1 and later.
Hrvoje,
Many thanks for the explanation and the patch.
Yes, this patch successfully resolves the problem for my particular test
case.
Best regards,
Mark Street.
"Mark Street" <[EMAIL PROTECTED]> writes:
> I'm not sure why this [catering for paths without a leading /] is
> done in the code.
rfc1808 declared that the leading / is not really part of path, but
merely a "separator", presumably to be consistent with its treatment
of ;params, ?queries, and #fra
Will Kuhn <[EMAIL PROTECTED]> writes:
> Apparently wget does not handle single or double quotes very well.
> wget with the following arguments gives an error.
>
> wget
> --user-agent='Mozilla/5.0' --cookies=off --header
> 'Cookie: testbounce="testing";
> ih="b'!!!0T#8G(5A!!#c`#8HWs
Andrew Gargan <[EMAIL PROTECTED]> writes:
> wget ftp://someuser:[EMAIL PROTECTED]@www.somedomain.com/some_file.tgz
>
> is splitting on the first @, not the second.
Encode the '@' as %40 and this will work. For example:
wget ftp://someuser:[EMAIL PROTECTED]/some_file.tgz
> Is this a proble
Hi
wget ftp://someuser:[EMAIL PROTECTED]@www.somedomain.com/some_file.tgz
is splitting on the first @, not the second.
Is this a problem with the URL standard or a wget issue?
Regards
Andrew Gargan
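Percent-encoding the '@' as %40 is the robust answer; the alternative, splitting userinfo from host at the last '@' before the path, can be sketched like this (hypothetical helper, not wget's url.c):

```c
#include <string.h>

/* Given the part of a URL after "ftp://", return where the host
   begins: after the LAST '@' that precedes the first '/'. */
static const char *host_start(const char *after_scheme)
{
  const char *slash = strchr(after_scheme, '/');
  const char *p = after_scheme;
  const char *last_at = NULL;
  while ((p = strchr(p, '@')) != NULL && (!slash || p < slash))
    last_at = p++;
  return last_at ? last_at + 1 : after_scheme;
}
```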
Seemant Kulleen <[EMAIL PROTECTED]> writes:
>> Since I don't use Gentoo, I'll need more details to fix this.
>>
>> For one, I haven't tried Wget with socks for a while now. Older
>> versions of Wget supported of --with-socks option, but the procedure
>> for linking a program with socks changed s
Seemant Kulleen <[EMAIL PROTECTED]> writes:
> I wanted to alert you all to a bug in wget, reported by one of our
> (gentoo) users at:
>
> https://bugs.gentoo.org/show_bug.cgi?id=69827
>
> I am the maintainer for the Gentoo ebuild for wget.
>
> If someone would be willing to look at and help us wit
This problem has been fixed for the upcoming 1.10 release. If you
want to try it, it's available at
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-alpha2.tar.bz2
<[EMAIL PROTECTED]>
Sent: Sunday, March 20, 2005 8:27 PM
Subject: Re: Bug
Hi Jorge!
Current wget versions do not support large files >2GB.
However, the CVS version does and the fix will be introduced
to the normal wget source.
Jens
(just another user)
When downloading a file of 2GB and m
Hi Jorge!
Current wget versions do not support large files >2GB.
However, the CVS version does and the fix will be introduced
to the normal wget source.
Jens
(just another user)
> When downloading a file of 2GB or more, the counter goes crazy;
> it probably should use a long instead of an int
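The wraparound behind the "crazy counter" is easy to state: byte counts above 2^31 - 1 no longer fit a signed 32-bit int. A sketch using fixed-width types rather than wget's wgint:

```c
#include <stdint.h>

/* A signed 32-bit counter can represent at most 2^31 - 1 bytes,
   i.e. just under 2 GiB; anything larger needs a 64-bit type. */
static int fits_in_int32(int64_t bytes)
{
  return bytes <= (int64_t) INT32_MAX && bytes >= (int64_t) INT32_MIN;
}
```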
P> I don't know why you say that. I see bug reports and discussion of fixes
P> flowing through here on a fairly regular basis.
All I know is my reports for the last few months didn't get the usual (any!)
cheery replies. However, I saw them on Gmane, yes.
Dan Jacobson <[EMAIL PROTECTED]> writes:
> Is it still useful to mail to [EMAIL PROTECTED] I don't think
> anybody's home. Shall the address be closed?
If you're referring to Mauro being busy, I don't see it as a reason to
close the bug reporting address.
I don't know why you say that. I see bug reports and discussion of fixes
flowing through here on a fairly regular basis.
Mark Post
-Original Message-
From: Dan Jacobson [mailto:[EMAIL PROTECTED]
Sent: Tuesday, March 15, 2005 3:04 PM
To: [EMAIL PROTECTED]
Subject: bug-wget still useful
Quoting Alan Robinson <[EMAIL PROTECTED]>:
> When downloading a 4.2 gig file (such as from
> ftp://movies06.archive.org/2/movies/abe_lincoln_of_the_4th_ave/abe_lincoln_o
> f_the_4th_ave.mpeg ) cause the status text (i.e.
> 100%[+===>] 38,641,328 213.92K/sETA
>
Hi Jason!
If I understood you correctly, this quote from the manual should help you:
***
Note that these two options [accept and reject based on filenames] do not
affect the downloading of HTML files; Wget must load all the HTMLs to know
where to go at all--recursive retrieval would make no se
Put the URL in double quotes. That worked for me.
Mark Post
-Original Message-
From: szulevzs [mailto:[EMAIL PROTECTED]
Sent: Sunday, December 26, 2004 5:23 AM
To: [EMAIL PROTECTED]
Subject: bug
WGET cannot download the following link:
Wget --tries=5 http://ex
On Thu, 2 Sep 2004 04:28:39 +0200, Patrik Sjöberg <[EMAIL PROTECTED]> wrote:
> hi ive found the following bug / issue with wget.
> due to limitations wget bugs on files larger than "unsigned long" and
> displays incorrect size and also acts incorrectly when trying to download
> one of these files.
On Sun, Aug 22, 2004 at 08:02:54PM +0200, Jan Minar wrote:
> +/* vasprintf() requires _GNU_SOURCE. Which is OK with Debian. */
> +#ifndef _GNU_SOURCE
> +#define _GNU_SOURCE
This must be done before stdio.h is included.
> +#endif
> +#include
> +
> #ifndef errno
> extern int errno;
> #endif
>