Thanks everybody for the kind welcome. I hope Anthony and I will do a
good job here.
Now, to be concrete: I guess the first step is to move the revision
control to Savannah. I have just one question: would you like to continue
using Mercurial, or migrate to Git? The latter is becoming a de-facto
st
Micah Cowan writes:
> I think the main thing with migrating to Savannah is that there are a
> lot of "developers" who would have commit access, where there's really
> only a couple people in there who are active developers. So you may want
> to prune that list (see who's actually in the changelog
Hello Jens,
actually wget doesn't handle gzip-compressed files; adding the
Accept-Encoding header is just a hack, pretending wget supports gzip
when it doesn't.
In order to use -p you need to download the file as plain text, without
forcing compression.
Cheers,
Giuseppe
Jens Schleusener writes:
> (for the Germans: Giuseppe spoken Tschuseppe ?)
It is more like: Jewseppee
This can help you better:
http://www.pronounceitright.com/pronuncia.php?id_pronuncia=3631
> Ok, that I was afraid. Maybe that should be mentioned shortly in the
> man page under the "--pa
Hi Micah,
Micah Cowan writes:
>> I think it will be cleaner to use gnulib in the same way as other
>> projects are doing it, not checking in the results but using a
>> "bootstrap" script. To force a specific revision of gnulib, a git
>> submodule can be used.
>
> Yeah, but then you need to comm
Hi,
Jeff, thanks for the patch; and also thanks to Hrvoje for the
review. I'll apply it as soon as we move to Savannah.
Cheers,
Giuseppe
Hrvoje Niksic writes:
> I am not the maintainer, but if you agree with my reasoning it certainly
> won't hurt to resubmit the patch.
Hello Linda,
what you need in order to contribute your changes back to GNU wget is to
sign copyright assignments to the FSF; this is the only "overhead". And,
of course, the changes must be accepted.
Cheers,
Giuseppe
Linda Walsh writes:
> If we wanted to modify wget and check back in changes
> what type of
Hello,
Jill Brandmeir writes:
> Hi,
> The following feedback came into a Sun feedback system from
> mike.irv...@atosorigin.com.
>
> I get error 403 Forbidden when I try wget
>
> The comment came in 4/26/10.
Can you please provide more information?
Cheers,
Giuseppe
Hello wget hackers,
I have migrated the GNU Wget repository from Mercurial to Bazaar.
The new repository is accessible here:
bzr branch http://bzr.savannah.gnu.org/r/wget/trunk
Are there pending patches that should be applied?
Cheers,
Giuseppe
Thanks for your bug report!
I don't have a Solaris 10 system to test my patch, but I have looked at
the generated `configure' file and it seems correct.
Would you mind trying this patch? To get a new `configure' you need to
execute `autoreconf'.
Cheers,
Giuseppe
=== modified file 'configure.
Hi Micah,
I have already committed a patch fixing it. I found the same problem on
some other projects as well :-)
Cheers,
Giuseppe
Micah Cowan writes:
> Douglas E. Engert wrote:
>> wget-1.12 configure on Solaris 10 would fail trying to look
>> at .. for a number of files. The problem appear
Sorry, my mistake; I noticed it after I sent the e-mail, but I have
committed the right version.
Thanks again for your report!
Giuseppe
"Douglas E. Engert" writes:
> I had to use src/wget.h as I don't see a src/wget.c
>
> autoreconf --force
> was also required and I ran this on a Ubuntu
Hi!
I have uploaded an alpha version of wget here:
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2355.tar.bz2
It contains the latest changes, such as HTTP/1.1 support and better
gnulib integration.
Can you please help me test portability? It should now build under
MinGW without problems.
Please report here
Thanks for your report. This bug is already fixed in the source
repository.
Cheers,
Giuseppe
Ildar Isaev writes:
> Hi, i downloaded wget-1.12 from ftp://ftp.gnu.org/gnu/wget/wget-1.12.tar.bz2
>
> It turns out it has a null pointer dereference bug. This is how it may
> be reproduced.
>
> Expl
I don't see any problem: you wrote it, so you choose the license.
Anyway, if you don't have a specific reason, wouldn't it be better to
use the GPL instead of the LGPL? :-)
Cheers,
Giuseppe
Crazy Pete writes:
> Hi everyone!
>
> I have created the following project:
>
> http://www.petenix.org/dolomed
Hello,
A new alpha version is here:
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2359.tar.bz2
and the detached gpg signature:
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2359.tar.bz2.sig
It contains further changes to the build system. It should now work
fine on MinGW/MSYS.
Please report here any proble
Hi Jochen,
Jochen Roderburg writes:
> With the Bazaar repository I did (as you wrote in another mail to the
> mailing list) once
>
> bzr branch http://bzr.savannah.gnu.org/r/wget/trunk wget-bzr
>
> and have a local directory wget-bzr now.
>
> But a subsequent "bzr update" in wget-bzr only tells
Thanks for your patch. I have recently made many changes to the build
system; would you please test this alpha version on your system?
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2359.tar.bz2
Cheers,
Giuseppe
Rainer Orth writes:
> I've just tried to build wget 1.12 on Solaris 8/x86 with gcc 4.4
What web sites are you trying to access, and what wget version are you
using?
It smells like chunked transfer-encoded data that the server sends
regardless of the HTTP version specified by wget. You can try to build
wget from the source repository, or use a recent alpha tarball where
HTTP/1.1 is
Hi Guillaume,
Guillaume Turri writes:
> Indeed, according to this page
> http://wget.addictivecode.org/RepositoryAccess I thought the current
> repository was the Mercurial one.
>
> How could I have found it out if I haven't read this mailing list?
> Have I made a mistake and checked a wrong pag
Ray Satiro writes:
> Is there a web interface for the new repository? Somewhere we'll be able to
> click a link and download a zip of different revisions, like
> the old repository. I couldn't find it so I tried bazaar 2.1.1 for windows
> but it tells me the wget tree isn't available and I ca
Hello,
thanks for your report.
Can you please rebuild wget after applying this patch?
Cheers,
Giuseppe
=== modified file 'src/css-tokens.h'
--- src/css-tokens.h    2010-05-08 19:56:15 +
+++ src/css-tokens.h    2010-05-24 10:04:21 +
@@ -61,6 +61,6 @@
NUMBER,
URI,
FUNCTION
-} css_
Alexander Lane writes:
> I've encountered a website that does not put the ">" at the end of
> some of its img tags. Wget skips downloading those images as a result,
> but I checked several web browsers & they were all able to cope with
> it.
>
> I don't know whether this was done in an attempt to
It can't be done directly from wget.
If the domain doesn't appear in the pages, then sed or perl can be enough.
Another solution/hack is to place the files you have downloaded on a
local web server that handles everything as a static file, in such a way
that "http://foo.com/bar/baz" can be access
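A minimal sketch of the sed approach mentioned above, assuming the mirror was saved under a directory named mirror/ and all absolute links start with http://foo.com/ (both names are hypothetical placeholders, not from the original thread):

```shell
# Rewrite absolute links to site-relative ones in every downloaded
# HTML file. "mirror/" and "foo.com" are hypothetical placeholders.
find mirror -name '*.html' -exec sed -i 's|http://foo\.com/||g' {} +
```

Note that `sed -i` with no argument is GNU sed syntax; BSD sed needs `sed -i ''`.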
It can't be done with a single call to wget; you need a script. This
shell function can help you get the desired pdf file.
function download_article
{
until fgrep "POST" $1.html; do
wget -O $1.html --keep-session-cookies \
--save-cookies=cookies.$1 --load-cookies=co
Hello,
thanks for your report. I am not sure that URL normalisation should
collapse multiple consecutive forward slashes; I don't see anything
about it in RFC 1808. We can't assume that "foo//bar" is the same as
"foo/bar": it could be handled differently by the server; for example, it
may be
Thanks!
I have removed wsock32, now the wget executable on Windows links to
ws2_32.
Cheers,
Giuseppe
Keisial writes:
> If you use ws2_32 you don't need to link with wsock32. In fact wsock32
> is mostly forwarded functions.
> Linking to wsock32 and not ws2_32 has the advantage that it works i
Keisial writes:
> SciFi wrote:
>
>> Another point is that all of wget's perl shell procs are hard-coded with
>> #!/usr/bin/perl which again points to Apple's and not the newer one we
>> installed from ActiveState.com. The version mismatch causes symbols to be
>> missed during make's generation o
Thanks for the report. It should be fixed now, as "-I ../md5" is no
longer present.
Can you please test this alpha tarball?
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2359.tar.bz2
Cheers,
Giuseppe
Jay K writes:
> % uname
> OSF1
> % uname -r
> V5.1
> % cc -V
> Compaq C V6.3-025 on Compaq Tru
et -V it says:
>
> Currently maintained by Micah Cowan .
>
> ... which I believe isn't true and I believe you want a patch similar to
> this:
>
> === modified file 'src/main.c'
> --- src/main.c2010-05-31 07:45:03 +
> +++ src/main.c2010-06-10 07:06:32 +
> @@ -853,7 +853,7 @@
> names such as this one. See en_US.po for reference. */
>fputs (_("\nOriginally written by Hrvoje Niksic .\n"),
> stdout);
> - fputs (_("Currently maintained by Micah Cowan .\n"),
> + fputs (_("Currently maintained by Giuseppe Scrivano
> .\n"),
> stdout);
>fputs (_("Please send bug reports and questions to .\n"),
> stdout);
I would rather drop this line altogether.
Thanks!
Giuseppe
Thanks for the patch!
> @@ -2256,6 +2258,7 @@ File %s already there; not retrieving.\n
>else
> logprintf (LOG_VERBOSE, _(", %s remaining"),
> number_to_static_string (contlen));
> + hs->resumelen = contlen;
>
Thanks for your patch but I can't apply it since this fix is already
present in the source repository.
Cheers,
Giuseppe
tho writes:
> --- http.c.orig 2010-06-10 14:01:43.0 +0200
> +++ http.c2010-06-10 14:01:55.0 +0200
> @@ -1829,6 +1829,13 @@
>/* Check for status
Muthu Subramanian K writes:
> oh...I missed it :) just checked if it works for me...sorry about that...
> Btw, would other protocols work as well (say, multiple http and ftp downloads
> with resume)?
I have checked multiple FTP URLs with resume and it seems to work as
expected. Thanks again.
Hi,
I would like to drop the windows/ subdirectory completely. The build
under Windows can easily be done with Cygwin or MinGW in the "GNU way":
./configure && make.
My feeling is that the Makefiles present under windows/ take more
effort to keep updated than the real benefits we can get fr
Hello,
thanks for your contribution.
I have some comments:
"v...@mage.me.uk" writes:
> Thanks for the encouragement! I've attached a patch which should tell
> the user there is a problem with the system wgetrc file and exit. Seems
> suspiciously simple, can anyone spot any problems with it?
Hrvoje Niksic writes:
> "v...@mage.me.uk" writes:
>
>> Thanks for the encouragement! I've attached a patch which should tell
>> the user there is a problem with the system wgetrc file and exit. Seems
>> suspiciously simple, can anyone spot any problems with it?
> [...]
>> + /*If there are any p
Daniel Stenberg writes:
>> Please change the type of the variable `ok' to `bool' and include
>> this change in your patch, also include .
>
> Are you then dropping everything pre C99? I'm just curious as I
> thought wget traditionally aimed to work fine even with older
> compilers.
gnulib ensure
"giovanni_re" writes:
> When trying to do a continue "-c" wget for a partially downloaded file, wget
> wouldn't do the continue because it got the "302 Moved Temporarily" (so i
> surmise).
>
> How can it be told "if you get the "moved temporarily" message, just do a
> proper "-c" to the filena
Solar Designer writes:
> I assume that you meant the issue with single-file downloads, not the
> specific piece you quoted.
>
> I think that Giuseppe Scrivano is the one to comment and decide on this.
I have sent Florian Weimer the information to request copyright
assignments pape
Hi Jochen,
before my change, the HEAD request was done every time
--content-disposition was specified.
Other cases where HEAD is really needed must work as before; if they
don't, then it is a bug.
Thanks for the information, I am going to check it right now.
Giuseppe
Jochen Roderburg writes:
> I ask this because after this change I observed significant random
> delays on my daily wget downloads (summing up to several hours ;-) in
> a certain complicated situation involving a proxy.
I have just checked and -N works as expected. Can you please provide
more de
Can you try specifying the proxy credentials using "proxyuser" and
"proxypassword"?
Cheers,
Giuseppe
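For reference, a hedged sketch of what those wgetrc entries might look like (the host and port match the reporter's quoted setup below; the credentials are placeholders):

```
# ~/.wgetrc -- hypothetical values, adapt to your proxy
proxyuser = myuser
proxypassword = mypassword
http_proxy = http://192.168.0.4:3128/
https_proxy = http://192.168.0.4:3128/
```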
arvind chaudhary writes:
> I have set-up proxy in /etc/wgetrc file as
> https_proxy = https://arvind:pas...@192.168.0.4:3128/
> http_proxy = http://arvind:pas...@192.168.0.4:3128/
> ftp_proxy
Jochen Roderburg writes:
> With a filename coming over Content-Disposition?;-)
I see now. Sorry for the misunderstanding.
I have pushed a fix.
Thanks,
Giuseppe
Jochen Roderburg writes:
> I have to admit of course, that the combination of a usable
> Modify-Date and a Content-Disposition filename will be very rare "in
> the wild", but nevertheless possible. I have seen the
> Content-Disposition headers mostly when the reply data is dynamically
> generated
Hello Christopher,
I have spent some time in the past weeks fixing the Windows build;
with the last alpha tarball, "./configure && make" should be enough to
build wget. Do you know of other problems under Windows (besides IPv6,
SSL and NTLMv1)?
What I can say is that they should be fixed before the n
Micah Cowan writes:
> On 06/14/2010 08:32 AM, Giuseppe Scrivano wrote:
>> By the way, I see that currently OpenSSL is preferred over GNU TLS (not
>> only under Windows), I would invert this.
>
> The current GNU TLS support is broken: that needs to be fixed first. My
>
Thanks for your report! I have updated the help string for
--random-wait.
Cheers,
Giuseppe
Tom Mizutani writes:
> I recently look into Wget documentations and found the
> specification of "--random-wait" option seems to have been
> changed, but not explicitly announced in "ChangeLog"s.
> To
Doruk Fisek writes:
> Sat, 12 Jun 2010 16:54:08 +0400, Solar Designer :
>
>> > Is there going to be a development in this issue?
>> I assume that you meant the issue with single-file downloads, not the
>> specific piece you quoted.
>> I think that Giuseppe S
Solar Designer writes:
> I think Florian should have replied to you by now. Please confirm.
> (I just want to ensure that a possible loss of an e-mail message doesn't
> result in duplicate work or whatever.)
I haven't received any reply yet.
> As an alternative to copyright assignment to the
writes:
> I have been using wget, together with 'time', with the following command line
> parameters:
>
> time wget --page-requisites --secure-protocol=SSLV3 --load-cookies
> cookies.txt --keep-session-cookies
> https://portal.foo.com/test/appmanager/portal/desktop
>
> However, when I do this,
Hello,
I have uploaded a new alpha tarball. The main issues addressed are IPv6
detection under Windows and the GNU TLS backend, which should work again
now.
GNU TLS is used by default; if you want to use OpenSSL you need to
specify --with-ssl=openssl at configure time. If possible, please
d
Hello Timothy,
this bug is fixed by commit 2363. It is present in the last alpha
tarball: ftp://alpha.gnu.org/gnu/wget/wget-1.12-2392.tar.bz2.
Or you can apply this small patch.
=== modified file 'src/host.c'
--- src/host.c 2010-05-08 19:56:15 +
+++ src/host.c 2010-05-25 16:36:02 +000
"Paul" writes:
> "When wget encounters a URL that ends in a slash, since there's no file
> name, wget has no idea what it should name it, but does have to name it
> something. So it goes with the traditional "index.html" filename."
>
> I would like to prevent the downloading of this file retrei
vivi writes:
> Hi all, in an attempt to add a config option this error occured when
> running "./wget --config=/some/place/wgetrc google.com"
How have you changed "struct cmdline_option option_data" in main.c? I
get the error you have reported if the `data' member is different from
"config".
C
Hi Nguyen,
there was a similar discussion on this mailing list not long ago:
http://lists.gnu.org/archive/html/bug-wget/2010-06/msg00108.html
Cheers,
Giuseppe
Nguyen Kim Son writes:
> Hello,
>
> I'd like to know if there exists an API for wget? I am trying to write a
> small program for do
We are still waiting for the FSF to receive the copyright papers.
In the meanwhile, Florian, can you please send me a cleaned-up copy of
your patch that works with the last bazaar revision, and include an
entry for the ChangeLog file?
Thanks,
Giuseppe
Doruk Fisek writes:
> So what now?
>
> It's been
Hi,
thanks for the report. I have changed this function name so it doesn't
clash with the gnulib module (which is used by gnutls).
I have attached a patch, can you try it?
Cheers,
Giuseppe
Ploni Almoni writes:
> D:/Documents/username/Msys/home/username/usr/lib\libgnutls.a(read-file.o):
> In func
Hello,
I am not convinced it is desirable to implement this feature in wget,
as it can easily be done with sha1sum or md5sum (and a small script).
These tools can also read sums from files and verify them.
It is quite meaningless, I think, to print the checksum to the screen,
as it can't easily
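The "small script" approach mentioned above can be sketched like this (the file name is a placeholder; in practice the file and its .sha1 companion would be fetched with wget from the download site):

```shell
# Stand-in for a downloaded file; normally fetched with wget.
printf 'example payload' > pkg.tar.gz
# Publisher side: record the checksum in a file...
sha1sum pkg.tar.gz > pkg.tar.gz.sha1
# ...downloader side: let sha1sum read and verify it.
sha1sum -c pkg.tar.gz.sha1   # exits non-zero if the file was corrupted
```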
Hi Hrvoje,
we can relax the gettext version, but the bootstrap script is used only
when the code is compiled from the source repository; gettext is needed
there to generate files that are later distributed in source releases.
It is a dependency only in the bootstrap phase. If
t
Jochen Roderburg writes:
> With the brand-new autoconf v2.66 the wget build process does not work
> any longer with error message:
>
> configure.ac:172: error: AC_CHECK_SIZEOF: requires literal arguments
> ../../lib/autoconf/types.m4:765: AC_CHECK_SIZEOF is expanded from...
> configure.ac:172: th
Caleb Cushing writes:
> there might be a reason. but if you use timestamp and continue
> together it only seems to utilize continue.
Thanks for your report; I am going to commit this patch. It should fix
the problem you have reported. Does it work for you?
Cheers,
Giuseppe
=== modified fi
Jozua writes:
> Hi.
>
> When continuing a ftp download, an incorrect value is used for the
> total file size.
> The SIZE command returns the correct size, but the value used comes
> from the response to the RETR command, which (at least in this case)
> is the number of bytes remaining after REST.
Hi Daniel,
Daniel Stenberg writes:
> I consider the size from SIZE to be much more reliable than the size
> you need to "guess" from the RETR response - based on the fact that
> SIZE has a documented way to return the exact size while RETR has
> not. No matter which happens to be the biggest...
Hi Minato,
I wasn't able to reproduce this problem; I have tried with a directory
containing 5 files.
What do you get using this command (be sure that website.com is the real
one)?
strace -e open,socket wget -q -nc -r -l inf --no-remove-listing \
http://website.com/ 2>&1 | tail
Can
Hello Minato,
Minato Namikaze writes:
> socket(PF_INET, SOCK_DGRAM|SOCK_NONBLOCK, IPPROTO_IP) = -1 EMFILE (Too many
> open files)
> socket(PF_INET, SOCK_DGRAM|SOCK_NONBLOCK, IPPROTO_IP) = -1 EMFILE (Too many
> open files)
thanks for your further investigation. Can you please copy more lines
You may be interested in this previous discussion:
http://lists.gnu.org/archive/html/bug-wget/2010-07/msg7.html
Does it solve the problem for you?
Cheers,
Giuseppe
Chaim Millet writes:
> Why doesn't wget use the system proxy settings?
> (ubuntu 10.04)
>
> Would it be possible to add tha
Hi Linda,
yes, wget doesn't parse JavaScript. Adding support for it could be a
good addition, at least for simpler cases.
Cheers,
Giuseppe
Linda Walsh writes:
> --page-requisites doesn't pull down everything needed to display the page.
>
> Example of bug:
> Try to pull down any of the
> *descriptions*,
Florian Weimer writes:
> I wonder if it makes any sense to work on this patch. Perhaps you
> should find someone with an assignment on file who can do a clean-room
> reimplementation?
sorry for the huge delay; there were some problems at the FSF with the
copyright assignment process.
As soon as y
Caleb,
Caleb Cushing writes:
> I thought it was working because sometimes I seemed to get the files
> updated but the last couple of days no updates came in, so I turned
> continue off and by timestamp check alone it downloaded new files. So
> I guess it must not work all the time :( or somethin
I couldn't reproduce this problem; can you please give me more
information?
Can you use the -d flag to wget and see what happens?
Thanks,
Giuseppe
Avinash writes:
> Hi All,
>
> I am trying to use --spider option with an URL ending with '/'.
> The error I am getting is, index.html not found.
What wget version are you using? It works well for me using wget 1.12.
Cheers,
Giuseppe
gary jefferson writes:
> I'm using 'wget -nc -p -k -r --default-page=index.html http://webpy.org/'
>
> This works fine for most of the site, but fails on pages such as
> "http://webpy.org/faq". Instead o
Thanks for your contribution! It looks good, but I want to check the
patch more carefully before using it.
Cheers,
Giuseppe
John Trengrove writes:
> This is a patch to change the behaviour for FTP directory listing.
> Currently the hours are printed only if the hour is non-zero and does
> not account f
already exists and it is a directory, we should assume that the
destination file is "foo/default.html".
Giuseppe
gary jefferson writes:
> wget --version returns 1.12 here too.
>
> On Sat, Jul 24, 2010 at 2:57 AM, Giuseppe Scrivano wrote:
>> What wget version a
It seems good. I will send you the details in another email on how to
get copyright assignment papers to the FSF.
Cheers,
Giuseppe
Merinov Nikolay writes:
> I writting patch, that add "--unlink" options for unlink file instead
> clobbering.
>
> I also posted this patch in discussion on
> http://savann
cal
+file name.
+
@item continue = on/off
If set to on, force continuation of preexistent partially retrieved
files. See @samp{-c} before setting it.
=== modified file 'src/ChangeLog'
--- src/ChangeLog 2010-07-20 17:42:13 +
+++ src/ChangeLog 2010-07-28 18:44:11 +
Hrvoje Niksic writes:
> This is a backward-incompatible change that should be announced more
> clearly in NEWS, and possibly in a release announcement.
Thanks, I am going to clarify it.
Thanks again for your contribution. I have just pushed it.
Cheers,
Giuseppe
John Trengrove writes:
> This is a patch to change the behaviour for FTP directory listing.
> Currently the hours are printed only if the hour is non-zero and does
> not account for if we received the hours & minutes
Jochen Roderburg writes:
> OTOH I also saw that the patch as such is not yet complete and does
> not yet cover all aspects of the underlying problem.
> It seems that setting contentdisposition=on (what I also have
> permanently in my wget configuration) circumvents the patch. Not only
> when a Co
Hi Alexandre,
Alexandre Vieira writes:
> Is it possible to have wget to send a different POST for each line on the
> file in an HTTP 1.1 persistent connection or HTTP 1.0
> "Connection: Keep-alive" connection? Like having the connection up and feed
> it trough stdin or something similar?
This
Hi Hrvoje,
Hrvoje Niksic writes:
> That thread doesn't really address the question of Wget "using the
> system proxy setting" that the OP is asking for. I've just tried the
> following sequence of steps:
>
> 1. configure a proxy in (ubuntu/gnome) system->preferences->network
> proxy
> 2. start
Hello!
thanks for working on this.
vivi writes:
> Hi all
>
> I've got the config option working to some extent, however, after the
> first run of getopt_long, it ignores the second getopt_long that takes
> the rest of the users options. So any options other than --config are
> ignored. I'm not
vivi writes:
> Great, it works now! However, one problem I found is that if a
> unrecognized option is given, an error is posted twice, most likely due
> to getopt_long running twice.
To inhibit getopt from printing errors, you should reset opterr.
I am still unsure about using it twice; what we can
Hello Inaki,
can you also specify -d and post here the last lines you get (or any
other message that you consider useful for us)?
As another test, can you use valgrind to trace memory usage of wget?
What does "ulimit -m" tell you?
Cheers,
Giuseppe
Inaki San Vicente writes:
>Hi the
Johnny writes:
> I am trying to fetch a complete set of pdf docs, whereof some are
> "hidden" in a collapsible list; if you visit the site you must expand
> the list to get the docs. Usind wget, I cannot get all the files (the
> top level files downloads, but not the rest).
>
> This is what I tri
vivi writes:
> === modified file 'src/ChangeLog'
> --- src/ChangeLog 2010-08-01 20:55:53 +
> +++ src/ChangeLog 2010-08-05 17:55:28 +
> @@ -1,3 +1,18 @@
> +2010-08-04 Reza Snowdon
> +
> + * main.c (main): inserted 'defaults'
> + * init.c: Include stdbool.h.
> +
Hrvoje Niksic writes:
> Giuseppe Scrivano writes:
>
>> I have followed exactly your same steps under Fedora/Gnome and I get
>> this:
>>
>> $ env | grep -i proxy
>> NO_PROXY=localhost,127.0.0.0/8
>> http_proxy=http://localhost:8080/
>
> OK. Did
Keisial writes:
> That scss file contains several entries like:
> background-image:url( );
> background-image:url();
I have applied the following patch. It should fix the issue reported.
Cheers,
Giuseppe
=== modified file 'src/css-url.c'
--- src/css-url.c 2010-07-29 23:00:26 +
++
I have just uploaded a new alpha version of wget.
Since the last alpha release, several bugs have been fixed and the GNU
TLS backend has been improved.
For more details you can look at the ChangeLog, or use this command:
"bzr log -r2392..2416" after you checkout a fresh copy of Wget from the
Bazaar repository
Petr Pisar writes:
> On Sun, Aug 08, 2010 at 02:13:56PM +0200, Giuseppe Scrivano wrote:
>> I have just uploaded a new alpha version of wget.
>>
> Great. Could you please upload new gettext template into Translation Project?
Thanks for the suggestion, I'll do it.
Giuseppe
Hello,
David writes:
> Hi all,
>
> I'm having issues with Wget 1.12 (on Ubuntu 10.04, 32 bit) and Wget
> skipping converting links with spaces in them. Wget downloads the
> linked files fine if they have spaces in the filenames, but skips
> converting the link to relative when the --convert-li
What about using "wget -nd -P dvd ftp://host/tools/test01/dvd"?
Giuseppe
art N writes:
> Hi,
>
> I'm using wget and came across a issue where I want to copy only the last
> dir from FTP to local system.
> The copy works fine but the problem is that wget creates the ftp full path.
> I don't
Hello,
It seems the .pdf files are on a different host. Have you tried
specifying the --span-hosts option?
Giuseppe
sl...@lavabit.com writes:
> Hello,
>
> I tried wget on a particular page and it was not able to find the download
> link of the pdf-file. I made a copy of that page which you can
Manuel Reinhardt writes:
> When downloading an html document with
>
> wget -E -k -p http://...
>
> I noticed that sometimes some of the Stylesheets are not found when
> opening the local copy. This happens when the html uses a CSS @import
> statement that includes a URL containing spaces encoded
Hi Cheng,
the development version is hosted on the bazaar repository.
If you have problems accessing the bazaar repository then you can try
with this alpha version, rebasing your patch on it:
ftp://alpha.gnu.org/gnu/wget/wget-1.12-2416.tar.bz2
Anyway, the bazaar repository must be accessible
Hello,
thanks for your report. This bug is fixed in the source repository and
will be included in the next wget release.
Cheers,
Giuseppe
"Marc R.J. Brevoort" writes:
> Hello all,
>
> Apparently this is a combined gnome config/wget issue so I'm posting
> it individually to both sides.
>
> I
Hello,
What output do you get with these two commands?
wget -d --post-file=file http://www.web.site.org/ 2>&1 | grep ^Content
wget -d --post-data=".." http://www.web.site.org/ 2>&1 | grep ^Content
if the output is identical, then please attach the full debug log
(remove the pipe to grep) for bo
Charles Kozierok writes:
> Giuseppe, here's the output requested. Thanks.
Can you attach the working version too?
Thanks,
Giuseppe
so we have:
Charles Kozierok writes:
broken version:
> POST /specifications.aspx HTTP/1.0
> Referer: http://tmslighting.digiflare.com/specifications.aspx
> User-Agent: Wget/1.11.4
> Accept: */*
> Host: tmslighting.digiflare.com
> Connection: Keep-Alive
> Content-Type: application/x-www-form-ur
Charles Kozierok writes:
> But I guess the bigger question is why a Windows version of wget
> wouldn't gracefully handle removal of the standard end-of-line
> characters. I'm not even sure I can figure out how to properly write
> the data out to the file without the 0A0D at the end.
As a user I
"Dennis, CHENG Renquan" writes:
> On Thu, Sep 16, 2010 at 4:55 PM, Giuseppe Scrivano wrote:
>> Hi Cheng,
>>
>> the development version is hosted on the bazaar repository.
>>
>> If you have problems accessing the bazaar repository then you can try
>