Follow-up Comment #4, bug #23281 (project wget):
> What's wrong with pushing/discussing them upstream?
Absolutely nothing; I think that's entirely appropriate. I don't really have a
vested interest in it any longer, though, so it'd probably be better for
someone else to get the ball rolling with
Follow-up Comment #2, bug #23281 (project wget):
This bug was opened back when I was a maintainer, which I'm not now, nor am I
actually involved in wget development in any way these days, so perhaps this
should be reconsidered by the current maintainer.
But yes, improving gnu_getpass would defini
On Tue, Oct 21, 2014 at 8:55 AM, Pär Karlsson wrote:
> Thanks!
>
> It didn't even occur to me to check this out. My only excuse is gratuitous
> consistency and lack of pure C experience; a malloc() without a
> corresponding sizeof() seemed a little arbitrary to me, but it makes sense
> now :-)
We
On Mon, Oct 20, 2014 at 7:02 PM, Yousong Zhou wrote:
> I am not sure here. Do we always assume sizeof(char) to be 1 for
> platforms supported by wget?
FWIW, sizeof(char) is always 1 by definition; the C standard
guarantees it. Even on systems with no addressable values smaller than
16 bits, beca
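(A minimal stand-alone C illustration of the guarantee, not code from wget: the two calls below always request the same number of bytes, because sizeof counts in chars and sizeof(char) is 1 by definition.)

    #include <stdlib.h>

    char *alloc_line(size_t len)
    {
        /* sizeof(char) == 1 always (C standard, 6.5.3.4): even on a machine
         * whose smallest addressable unit is 16 bits, CHAR_BIT is simply 16
         * there, and sizeof still counts in chars rather than octets. */
        char *a = malloc(len + 1);                  /* idiomatic */
        char *b = malloc((len + 1) * sizeof(char)); /* same size, just noisier */
        free(b);
        return a;                                   /* caller frees */
    }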
On Oct 18, 2014 6:43 AM, "Bryan Baas" wrote:
>
> Hi,
>
> I was wondering about the command output of wget. I used a Java Runtime
> exec and, although the wget process ended with a 0 completion code, the
> results appeared in the error stream and not the output stream.
Hi!
It should be noted tha
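(The preview cuts off, but the point being made is a well-known one: wget writes its log and progress output to stderr, and only writes the document itself to stdout when invoked with -O -. A hedged stand-alone C sketch of how a parent process sees that split; the URL is a placeholder.)

    #include <stdio.h>

    int main(void)
    {
        /* popen() captures only the child's stdout; wget's log lines go to
         * stderr, so the stream read here contains just the document. */
        FILE *p = popen("wget -q -O - http://www.example.com/", "r");
        if (!p)
            return 1;
        long n = 0;
        while (fgetc(p) != EOF)
            n++;
        fprintf(stderr, "read %ld document bytes from wget's stdout\n", n);
        return pclose(p) == 0 ? 0 : 1;
    }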
On Sat, Sep 13, 2014 at 1:30 PM, Micah Cowan wrote:
> There is currently _one_ textcha still in operation. If that
> falls, I'll switch to an unanswerable textcha (no one can edit) until
> I can figure out a better long-term solution.
Update: editing is now disabled, spammers got m
Hey folks,
So the Wget Wgiki is still alive and kicking, but dealing with spam is
getting out of hand.
Moin anti-spam works with a combination of global blocklisting and
"textchas" (text questions designed to be easy for humans, hard for
robots).
The problem is that eventually a human gets invo
Anyone have thoughts on a designated prefix (say, make-style "-") that
indicates a line that can be safely ignored if not understood?
Might also work to have a pragma thingie in the .wgetrc, to turn
fail-on-error on and off.
Naturally, the value of such a thing wouldn't be seen until wget's instal
On Mon, Dec 2, 2013 at 3:15 PM, Fernando Cassia wrote:
> Hi Micah,
>
> You are listed as the current maintainer of wget.
Listed where? See http://www.gnu.org/software/wget/ for official
information about wget; I haven't been maintaining it for a few years now.
That'd be Giuseppe Scrivano. But s
Hi Andrew.
I no longer have much involvement with Wget, other than in discussions. I'm
copying the Wget mailing list in my reply.
To my knowledge, there are no "official" builds of Wget for Windows (the
developers only provide the source code releases), and neither of the links
you mention are
On Sat, Aug 03, 2013 at 11:50:48PM +0200, Ángel González wrote:
> On 03/08/13 21:07, Micah Cowan wrote:
> >On Sat, Aug 03, 2013 at 04:11:59PM +0200, Tim Rühsen wrote:
> >>As a second option, we could introduce (now or later)
> >>--name-filter-program="prog
On Sat, Aug 03, 2013 at 04:11:59PM +0200, Tim Rühsen wrote:
> As a second option, we could introduce (now or later)
> --name-filter-program="program REGEX"
>
> The 'program' answers each line it gets (the original filename) with exactly
> one output line (the new filename) as long as Wget
On Thu, Aug 01, 2013 at 01:24:12PM +0200, Giuseppe Scrivano wrote:
> Tim Ruehsen writes:
>
> > That is basically a good idea.
> >
> > Do you have in mind to keep as close to the standard CGI environment
> > variables as possible? Or do you think of the CGI environment principle?
> > If th
On Fri, Aug 02, 2013 at 11:53:24AM +0200, Tim Ruehsen wrote:
> Hi Dagobert,
>
> > All this added complexity seems highly overengineered for a feature
> > that is not in the core functionality of the tool and that only a
> > fraction of the users use. Keep in mind: a good tool is one that does
> >
On Wed, Jul 31, 2013 at 06:32:18PM -0400, Andrew Cady wrote:
> By that, I assume you mean to execute the option in the shell. So the
> existing usage:
>
> --rename-output=s/x/y/
>
> would (almost) become:
>
> --rename-output='perl -lpe "BEGIN{\$|++}" -e s/x/y/'
It COULD, sure, but why on e
On Fri, Jul 26, 2013 at 02:30:00PM -0400, Andrew Cady wrote:
> Incidentally, the former maintainer of wget, Micah Cowan, actually
> started working on a wget "competitor" (so to speak) based on a plugin
> architecture designed around this concept:
Thanks for the mention. :)
> From: "Towle, Jonathan J."
> To: "bug-wget@gnu.org"
> Cc:
> Date: Fri, 31 May 2013 15:12:40 +
> Subject: [Bug-wget] Compatibility
> We are running various versions of wget on our servers.
>
> On the server we need to use, we are running the very old version 1.9.1.
> We could upgrade eit
On Fri, May 10, 2013 at 12:25 AM, Hauke Hoffmann
wrote:
> Is that correct or is that a typo in the manpage so that it should be:
>
>/"Set number of tries to number. Specify 0 or inf for infinite
>retrying. [...]"/
Great catch!
-mjc
The Wgiki has been restored to its state as of about a month ago, sans
spam. Steps have been and are being taken to prevent a similar
occurrence.
-mjc
If you wish to
make modifications to the site.
Yours,
Micah Cowan
to be redone). Leaving it until tomorrow to restore
and investigate. :p
-mjc
On Sat, May 4, 2013 at 9:24 AM, Micah Cowan wrote:
> On Sat, May 4, 2013 at 12:53 AM, Micah Cowan wrote:
>> I just did mass deletions of pages and users.
>>
>> I MAY HAVE ACCIDENTALLY DELETED IMPORTA
On Sat, May 4, 2013 at 12:53 AM, Micah Cowan wrote:
> I just did mass deletions of pages and users.
>
> I MAY HAVE ACCIDENTALLY DELETED IMPORTANT PAGES AND/OR REAL USERS
I failed to follow this up with the more important point: I have
backups, and can restore individual pages and users
I just did mass deletions of pages and users.
I MAY HAVE ACCIDENTALLY DELETED IMPORTANT PAGES AND/OR REAL USERS
Some long-needed spring cleaning. Somehow at some point my
subscription to all page updates was disabled for "trivial"
changes. Perhaps there was some exploit that allowed a spam
Restored from backup. RecentChanges still hangs forever for some reason...
will have to look into that later. Re-enabled "textchas" for "known" users
(users with account names).
On Thu, May 2, 2013 at 12:32 PM, Tim Rühsen wrote:
> On Thursday, 2 May 2013, Micah Cowan wrote:
>> Ah, yeah that's a decent point. I like it, but then, we run into
>> name-trusting problems along the lines of why --trust-server-names was
>> introduced, if w
> On Thu, May 2, 2013 at 9:00 PM, Tim Ruehsen wrote:
>>
>> Darshit, I guess you are talking about redirection.
>>
>> That is, 'wget -r gnu.org' is being redirected to www.gnu.org (via Location
>> header). Wget now follows the redirection, but only downloads index.html
>> since all included URLs
I believe you want -H -D gnu.org. That's what it's for. Wget doesn't
know which hostnames under a domain should be allowed and which should
not be (do you want images.gnu.org? git.gnu.org? lists.gnu.org?), so
turns 'em all off unless you ask for them explicitly.
HTH,
-mjc
On Thu, May 2, 2013 at 4
Steven Schubiger wrote a script, found in the repo's
util/paramchecks.pl (not found in tarballs), that checks for just this
sort of situation (and others). Running it produces a few false
positives (such as "--no", and yeah, I'd consider --htmlify and
--no-htmlify to be legitimately undocumented), a
On 04/01/2013 11:35 PM, Noël Köthe wrote:
> On Monday, 2013-04-01 at 20:53 -0700, RJ wrote:
>> Is it possible to get a copy of the Latex source code for the Wget
>> instruction manual?
>> I would like to print a few copies for instructional purposes, but
>> need to first compile in a smaller fon
On 03/14/2013 02:08 PM, Pavel Kačer wrote:
> If a word begins with an unquoted tilde character (`~'), all of the
...
> To do what you want just write
>
> $ wget --post-file ~/vimrc www.example.com
>
> or
>
> $ wget --post-file "~/vimrc" www.example.com
On 03/14/2013 12:15 PM, Darshit Shah wrote:
> In fact I wrote this to specifically expand command line options, since
> bash did not expand the tilde in the filename I gave through the
> command line.
> Here is the output I got.
>
> $ wget --post-file=~/vimrc www.example.com
>> --2013-03-15 00
On 12/09/2012 02:45 AM, Tim Rühsen wrote:
> On Saturday, 8 December 2012, 7382...@gmail.com wrote:
>> Hello
>>
>> I think wget should support HTTP compression (Accept-Encoding: gzip, deflate). It
>> would put less strain on the servers being downloaded from, and use less of
>> their bandwidth. Is it okay to
On 12/09/2012 04:11 AM, Giuseppe Scrivano wrote:
> 7382...@gmail.com writes:
>
>> I think wget should support HTTP compression (Accept-Encoding: gzip, deflate). It
>> would put less strain on the servers being downloaded from, and use less of
>> their bandwidth. Is it okay to add this idea to the
>> http://w
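(Neither preview shows the eventual answer, but the mechanics the poster asks for are standard. A hedged sketch using zlib, not wget code: after sending "Accept-Encoding: gzip" and receiving "Content-Encoding: gzip", the response body must be inflated before it is written out.)

    #include <string.h>
    #include <zlib.h>

    /* Inflate one complete gzip-encoded buffer; returns 0 on success. */
    int inflate_gzip(const unsigned char *in, size_t in_len,
                     unsigned char *out, size_t out_cap, size_t *out_len)
    {
        z_stream zs;
        memset(&zs, 0, sizeof zs);
        /* 16 + MAX_WBITS asks zlib for a gzip (not raw deflate) wrapper. */
        if (inflateInit2(&zs, 16 + MAX_WBITS) != Z_OK)
            return -1;
        zs.next_in  = (Bytef *)in;
        zs.avail_in = (uInt)in_len;
        zs.next_out  = out;
        zs.avail_out = (uInt)out_cap;
        int rc = inflate(&zs, Z_FINISH);
        *out_len = zs.total_out;
        inflateEnd(&zs);
        return rc == Z_STREAM_END ? 0 : -1;
    }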
On 11/17/2012 02:24 PM, Voytek Eymont wrote:
> what's the best way to reduce the log verbosity to a minimum
Is the -nv option perhaps what you're looking for?
-mjc
On 10/30/2012 01:30 PM, Ángel González wrote:
> On 30/10/12 19:37, Alex wrote:
>> Greetings, Dmitry Bogatov.
>> Thanks for reply.
>> Yes, thanks it is may be possible to get all files list, convert it to
>> readable codepage and rename files. Sorry, inertia of thinking - Far
>> 1.75 ever can't fin
On 10/12/2012 06:38 AM, Paul Beckett (ITCS) wrote:
> I am attempting to use wget to create a mirrored copy of a CMS (Liferay)
> website. I want to be able to failover to this static copy in case the
> application server goes offline. I therefore need the URLs to remain
> absolutely identical. T
On 09/01/2012 09:43 AM, Alex wrote:
> Greetings, Andriansah.
> Sorry for bad English.
>
> "-c" and "--content-disposition" don't work together.
I doubt this is true. They certainly used to, and it was always intended
that they would (though, it was a bit messy, since it requires us to
start downl
On 08/30/2012 10:50 PM, Andriansah wrote:
> Dear bug tracker
>
> I have finish download file with command
> wget -c --content-disposition 'link_download'
>
>
> Accidentally I run that program again and surprisingly, it download the
> file again
>
> I'm using latest version of wget 1.13.4 on ubu
Hi Patrick. I don't maintain (or do development) on wget currently. I'm
copying to the Wget mailing list. Thanks!
-mjc
On 08/30/2012 09:21 AM, Patrick Castet wrote:
>
> hi,
>
> man wget
> wget --help
>
> --spider option
>
>
>
>
>
> present man:
> --spider
>
On 08/24/2012 08:56 AM, Tim Ruehsen wrote:
> Meanwhile I am working on more test routines. So far it's only kind of unit
> testing. But after finishing that, I'll write a small test http/https server
> (using mget net routines) that could offer as many tests as we need
> (timeouts,
> authorizat
On 08/22/2012 03:03 PM, David Linn wrote:
> Hi,
>
> I was wondering how to maintain a git branch where I can test things out.
> Running bootstrap and configure generate a bunch of extra files. Do these
> files need to be tracked ? Can I just put all of them in .gitignore ? I'm
> new to git (versio
On 08/16/2012 01:36 AM, Tim Ruehsen wrote:
> It would be perfect, to have a large test suite. If someone works out a test
> suite design for wget1, I would put some time into the coding.
wget1 already has a test suite. It most likely needs to be expanded with
enough tests to provide more comple
On 08/13/2012 03:06 AM, Daniel Stenberg wrote:
> On Mon, 13 Aug 2012, Tim Ruehsen wrote:
>
>> But we should not forget about a monolithic, backward-compatible (to
>> wget 1.x) wget 2.0. We all agree, it is time to redesign wget's code
>> architecture to have a clean codebase for new features to im
On 08/13/2012 02:01 AM, Tim Ruehsen wrote:
> And now back to Micah and Niwt. How can we join forces ?
> It should make sense to share code / libraries and parts of the test code.
It should be noted that I chose a MIT/2-clause BSD-style license for
Niwt, so any sharing would necessarily be one-dir
On 08/09/2012 12:42 AM, ptrk mj wrote:
> Greetings everyone,
>
> I'd like to know what is the technical difference between
>
> "Connection closed at byte x."
> and
> "Read error at byte x/y (Connection timed out)."
AIUI,
"Connection closed at byte x" means that the remote end closed the
connect
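(In BSD-socket terms the distinction is simply the return value of recv(); a stand-alone sketch of the convention, not wget's actual reporting code:)

    #include <errno.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/types.h>
    #include <sys/socket.h>

    /* recv() == 0: the peer closed the connection in an orderly way
     *              -> "Connection closed at byte x."
     * recv() <  0: a genuine error such as ETIMEDOUT
     *              -> "Read error at byte x/y (Connection timed out)." */
    ssize_t read_chunk(int fd, char *buf, size_t len, long long byte_pos)
    {
        ssize_t n = recv(fd, buf, len, 0);
        if (n == 0)
            fprintf(stderr, "Connection closed at byte %lld.\n", byte_pos);
        else if (n < 0)
            fprintf(stderr, "Read error at byte %lld (%s).\n",
                    byte_pos, strerror(errno));
        return n;
    }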
> On Thursday 24 May 2012 17:14:30 Mike Frysinger wrote:
>> On Thursday 24 May 2012 16:31:04 Giuseppe Scrivano wrote:
>>> Mike Frysinger writes:
Newer versions of openssl ship with pkg-config files, so if we can
detect it via those, do so. If that fails, fall back to the classic
me
On 08/07/2012 11:29 PM, Ray Satiro wrote:
>> From: Alex
> [...]
>
>> First three with assertion ('ioctl() failed. The socket could not be set as
>> blocking.').
>
> That's fatal. You shouldn't see 'ioctl() failed. The socket could not be set
> as blocking.'. I don't see that on Vista x86. Whic
On 08/07/2012 06:18 PM, illusionoflife wrote:
> On Tuesday, August 07, 2012 11:08:40 you wrote:
>> I think the maintainer is aware that Wget's code quality is poor, and
>> would welcome sweeping architectural changes; I know I would have, when
>> I was maintainer.
> Of course, but we can have diffe
On 08/07/2012 02:42 PM, Fernando Cassia wrote:
> On Tue, Aug 7, 2012 at 3:08 PM, Micah Cowan wrote:
>> I think the maintainer is aware that Wget's code quality is poor, and
>> would welcome sweeping architectural changes; I know I would have, when
>> I was maintainer.
>
(I've abridged and reordered the original post, for brevity's sake)
On 08/07/2012 08:12 AM, illusionoflife wrote:
> I really want to contribute to Wget, but I am afraid that such radical
> changes
> will not be accepted with
I think the maintainer is aware that Wget's code quality is poor, and
On 08/07/2012 10:24 AM, Fernando Cassia wrote:
> On Tue, Aug 7, 2012 at 12:12 PM, illusionoflife
> wrote:
>> I really want to contribute to Wget, but I am afraid that such radical
>> changes
>> will not be accepted with
>
> I´m just a lurker on this list, and have no relation whatsoever with
> t
The wget manpage is generated automatically from the texinfo-format
manual, so patches to the manpage itself aren't usable.
As is typical for GNU software, the manpage lacks a great deal of
information that is present in the full manual (the infopage, or the one
you find online). The manpage makes a point
On 07/27/2012 07:02 AM, Hongyi Zhao wrote:
> But, when I ssh'ed to the remote HPC of my university and then run the
> same command as mentioned above, I'll meet the following issue:
>
> -
> zhaohongsheng@node32:~> wget -c http://registrationcenter-
> download.intel.com/akdlm/irc_nas/2
On 07/23/2012 12:47 PM, Micah Cowan wrote:
> The po/Makevars issue still seems to be present; you'll need to use the
> workaround mentioned on this mailing list in order to get a working
> build. I tried both the bootstrap from current official sources (seems
> to break at the boo
There is now proper support in configure.ac to find your libmetalink
installation and so forth. The code İlim has added (and the original
concurrency code Giuseppe wrote) is not conditionally-compiled, so it's
not possible to build this version of wget properly _unless_ the proper
configure args ar
Presumably, there are a number of other new HTML5 tags whose attributes
should be getting checked as well.
-mjc
On 07/11/2012 09:37 PM, Timothy Gibbon wrote:
> Hello,
>
> wget currently ignores the tag used in HTML5 video/audio:
>
> http://www.w3.org/wiki/HTML/Elements/video
>
> Attached is a
On 07/09/2012 10:24 PM, Owen Watson wrote:
> Would --local-encoding=UTF-8 fix it?
Unlikely. IIRC, that changes how wget behaves in terms of deciding how
to translate non-ascii URLs (IRIs) on the command-line, and I think how
it saves non-ascii file names, but I don't believe it will modify file
co
On 07/09/2012 08:02 PM, Owen Watson wrote:
> I'm archiving a website that (according to FF) is UTF-8, and
> text/html; charset=iso-8859-1.
> When I look at the archived page in FF it shows text in ISO-8859-1,
> and text/html; charset=iso-8859-1, and there are various problems with
> the text (eg so
Yes, --post-data has been around for quite some time, so you should be
fine, at least as far as form-based data submission is concerned.
-mjc
On 07/06/2012 02:17 AM, Gargiulo Antonio (EURIS) wrote:
> Now I’ve another question for you.
>
> On our environment machine, we can upgrade only the wget
On 07/05/2012 06:21 AM, Gargiulo Antonio (EURIS) wrote:
> I'm working with wget but I have a problem trying to authenticate to a
> login page.
I'm not 100% sure I understood your original message, but it sounds to
me like you're trying to use --http-user and --http-passwd to log into a
page that u
On 06/22/2012 08:50 AM, illusionoflife wrote:
> On Thursday, June 21, 2012 13:39:07 you wrote:
>>> IIRC, that was to allow the URL-extraction portion of wget to be built
>>> stand-alone, so that it would create a tool that just extract URLs and
>>> spit them out, and not as part of some wget run.
>
On 06/21/2012 04:08 PM, John wrote:
> Hello.
>
> After looking at the manual for wget while online here
> "https://www.gnu.org/software/wget/manual/wget.html";, I created the
> following command to download it:
>
> wget --secure-protocol=auto --convert-links --page-requisites
> --append-output=c:
On 06/21/2012 01:33 PM, Micah Cowan wrote:
> On 06/21/2012 11:12 AM, illusionoflife wrote:
>> Hello, Free Hackers!
>>
>> Currently, I got idea to feed wget sources to GNU complexity tool
>> and try to simplify some of extremely long functions. During exploring,
>
On 06/21/2012 11:12 AM, illusionoflife wrote:
> Hello, Free Hackers!
>
> Currently, I got idea to feed wget sources to GNU complexity tool
> and try to simplify some of extremely long functions. During exploring,
> I found that we have two independent implementations of *read_whole_line* in
> n
On 06/20/2012 11:19 AM, John wrote:
>
>
> "Jochen Roderburg" wrote in message
> news:<20120614153556.18525j321wre0...@webmail-test.rrz.uni-koeln.de>...
>> Zitat von Jochen Roderburg :
>>
>> > Search an archive for this mailing list. ;-)
>> > Windows test builds from recent development snapshot
On 06/18/2012 03:42 AM, Jan Engelhardt wrote:
> On Sunday 2012-06-17 22:33, Giuseppe Scrivano wrote:
>
>> Hi,
>>
>> please report these problems to the translation project[1], translation
>> files are not maintained by us but we just distribute them.
>>
>> Thanks,
>> Giuseppe
>>
>> 1) http://trans
On 06/16/2012 08:31 AM, Ángel González wrote:
> On 16/06/12 12:07, jjDaNiMoTh wrote:
>> Hi list,
>>
>> It's not a bug, but I don't find any other ml for this awesome project.
> Don't worry. It's the appropriate list.
>
>> I want to download files from a Web Server which hasn't the Range
>> support,
On 06/16/2012 02:43 AM, Giuseppe Scrivano wrote:
> Hi Micah,
>
> Micah Cowan writes:
>
>> At first, I assumed wget was using errno improperly. Imagine my
>> surprise, though, when running wget under a debugger, to find that at
>> the tail end of main(), exit() gets
On 06/15/2012 07:59 PM, Micah Cowan wrote:
> The close_stdout() thing from closeout.c is from gnulib. I'm going to
> leave it to you, Giuseppe, to find out what's going on beyond
> that... IIRC, gnulib offers facilities for excluding certain modules, so
> that might be
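(For context, a hedged sketch of the idea behind gnulib's closeout module, not its implementation: flush stdout in an exit handler and turn a silently lost write error into a nonzero exit status, which matters for pipelines like 'wget -O - url > /dev/full'.)

    #include <stdio.h>
    #include <stdlib.h>

    static void close_stdout_sketch(void)
    {
        /* A failed flush or close here means output was silently lost;
         * report it and force a failing exit status.  _Exit is used
         * because calling exit() from an atexit handler is undefined. */
        if (fflush(stdout) != 0 || ferror(stdout) || fclose(stdout) != 0) {
            fputs("write error on stdout\n", stderr);
            _Exit(EXIT_FAILURE);
        }
    }

    int main(void)
    {
        atexit(close_stdout_sketch);
        puts("document bytes go to stdout; logs go to stderr");
        return 0;
    }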
The current development sources, when invoked like:
wget -O - micah.cowan.name >/dev/null
provide output like:
--2012-06-15 17:40:29-- http://micah.cowan.name/
Resolving micah.cowan.name (micah.cowan.name)... 174.136.4.17
Connecting to micah.cowan.name (micah.cowan.name)|174.136.4.17|:80...
On 06/15/2012 08:12 AM, Priyadarsh Kankipati wrote:
> Hi
> I am trying to use wget to download files using windows command line. I am
> behind a proxy server. I see the following options
>
> -Y, --proxy=on/off turn proxy on or off.
> --proxy-user=USER set USER as proxy username
On 06/09/2012 11:39 AM, Ángel González wrote:
> On 08/06/12 18:26, hito...@mpi-inf.mpg.de wrote:
>> Hi,
>>
>> I have a problem when using --convert-links (-k) on a utf-8 encoded web page.
>>
>> How to reproduce is:
>>
>> wget -k --restrict-file-names=nocontrol
>> http://ja.wikipedia.org/wiki/%E3%81
On 06/08/2012 02:14 AM, Tim Ruehsen wrote:
> BTW, while having a look into doc/wget.texi, I found a section 'Exit status'
> tagged with 'man begin'.
> Why does this section *not* show up in the man page? Is it a bug or a
> feature?
This would appear to be a regression, since it does in fact
On 06/06/2012 03:08 AM, Fernando Cassia wrote:
> I think if you look at the source (log-in form) there's a "session
> token" there, apparently handled via Javascript.
I don't know whether Javascript may be modifying the form or not, but
there are clearly several input items to the "login-form" fo
On 06/05/2012 12:47 AM, Mr Cracker wrote:
> hi all
> I want to add option to wget to able it download a file with multi
> connections like Axel.
> and now I am looking for any help or idea.
> thanks.
Thanks for your interest in this feature.
The functionality for doing that is being actively devel
The behavior you described is because you haven't properly quoted the
*. The shell will interpret * first, and if there is even a single file
in the current directory, the shell will expand to that file (and any
others), BEFORE it calls wget. Make sure to quote the * properly (say,
-R "*", instead
On 05/03/2012 03:02 AM, Castet JR wrote:
> *** THE JOB SEEMS TO BE STUCKED ON THIS DAMNED TRAP
> www.cisco.com/web/fw/c/global_print.css ***
If I had to guess, you're slamming their system too hard, with too many
requests, so they stop sending to your IP for a while. Try your wget
runs with -
In both cases, your shell is transforming your arguments before wget
gets a chance to see it.
Don't percent-encode ampersands - they need to be literal ampersands in
order to maintain their function as separating key/value pairs.
Instead, be sure to wrap the URL in double quotes ("), to protect y
On 04/27/2012 04:02 PM, z...@telia.com wrote:
>
>
>> Welcome to Windows' infamous DLL hell ;)
>
> Several years ago I read Gordon Letwin's book "Inside OS/2". There he speaks
> very, very
> highly of the concept of "Dynamically Linked Libraries".
Well, surely DLLs are a vast improvement over
Welcome to Windows' infamous DLL hell ;)
On 04/27/2012 01:16 PM, Jeremy Nicoll - ml wget users wrote:
> If the DLLs need to be kept separately then it follows that neither set should be
> in a folder on PATH, and I suppose that neither curl.exe nor wget.exe should
> be either.
I'm not sure that follows
On 04/23/2012 02:30 PM, Ángel González wrote:
> On 23/04/12 19:15, Sasikanth babu wrote:
>> Hi all,
>>
>> I am working on the bug "https://savannah.gnu.org/bugs/?33838" - A way to
>> turn off verbosity but still have a progress bar.
>> I think most of normal users may be interested about this
Probably in combination with -nd (no directories), -k (convert links)
and -E (adjust filename extensions).
-mjc
On 04/19/2012 08:27 AM, Tony Lewis wrote:
> You're looking for:
> --page-requisites    get all images, etc. needed to display HTML page.
>
> wget URL --page-requisites
>
> should gi
On 04/17/2012 09:16 AM, Ryan Rawdon wrote:
> Micah took a quick look over the source (or was previously familiar with it),
> and it sounds like there may be checks in place which should have prevented
> this; however, I did look to confirm.
I misread; it first checks if the hostname matches, and
On 04/13/2012 01:44 AM, Tim Ruehsen wrote:
> On Thursday 12 April 2012, Micah Cowan wrote:
>> On 04/12/2012 01:23 AM, TeeRasen wrote:
>>> In main.c we have
>>>
>>> opt.progress_type = "dot";
>>
>> In C, a string literal is of typ
On 04/12/2012 03:13 PM, David H. Lipman wrote:
> From: "Ángel González"
>
>> On 12/04/12 18:23, David H. Lipman wrote:
>>> From: "David H. Lipman"
>>>
Is it possible to add --trust-server-names
to the WGETRC file?
>>>
>>> Nevermind. It wasn't in the version of my documentation
On 04/12/2012 01:23 AM, TeeRasen wrote:
> In main.c we have
> opt.progress_type = "dot";
In C, a string literal is of type char[] (which automatically transforms
to char*), not const char[] or const char* (even though one must still
not modify it). You're either compiling with C++ (a bad
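(A compact stand-alone demonstration of the type rules being described:)

    /* In C a string literal has type char[N] and decays to char *, so the
     * assignment below is valid C, even though writing through the pointer
     * is undefined behavior.  In C++ a literal is const char[N], so the
     * same line is deprecated (C++03) or ill-formed (C++11 and later). */
    void example(void)
    {
        char *p = "dot";   /* legal C; rejected or warned about by C++ */
        (void)p;
        /* p[0] = 'D';        would compile in C, but is undefined behavior */
    }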
On 04/10/2012 10:34 PM, illusionoflife wrote:
> Yes, you are right: I missed that perl module. 68/69 now.
> One stupid question: These tests are meant to be run by users
> building from source, or by developers?
Well, the more people running them, the better, but the main purpose for
them was for
On 04/10/2012 08:52 AM, Tim Ruehsen wrote:
> Meanwhile, I wrote a simple proof of concept (parallel dummy downloads using
> threads, dummy downloading of chunks, etc.).
> I am at the point where I want to implement HTTP-Header metalink (RFC 6249).
> I just can't find any servers to test with... ma
On 05/11/2012 12:10 PM, illusionoflife wrote:
> On Monday, April 09, 2012 09:50:00 PM Ángel González wrote:
>> Do you have perl installed?
> $ perl --version
> This is perl 5, version 14, subversion 2 (v5.14.2) built for x86_64-linux-
> thread-multi
In addition to Perl, I believe there are a couple
On 04/05/2012 04:30 AM, liaohuanghe wrote:
> Hi,
> When I wanna wget a single file whose name is
> "[LAC][Gintama][250][x264_aac][ch_jp][480P_mkv].mkv", it failed. It gives
> information like "No matches on pattern
> `[LAC][Gintama][250][x264_aac][ch_jp][480P_mkv].mkv' ".
> Is this a bug, I can'
On 04/04/2012 12:02 PM, Ángel González wrote:
> On 04/04/12 20:16, Gijs van Tulder wrote:
>> 1. You can match complete urls, instead of just the directory prefix
>> or the file name suffix (which you can do with --accept and
>> --include-directories).
>> 2. You can use regular expressions to do the
I had promised to write up some information about what is needed to
support concurrent downloads in Wget, with a particular focus on GSoC
project possibilities.
I did not manage to do this in a timely manner; I was in Utah for a
week, where I'd expected to have more time on my hands than I actuall
On 03/29/2012 11:23 AM, Giuseppe Scrivano wrote:
> Tim Ruehsen writes:
>
>> Hi,
>>
>> the wget man page says a timeout value of 0 means 'forever'.
>> Even if seldom used, 0 seems to be a legal value.
>
> it can't be a legal value. It means the value you are waiting for is
> immediately availabl
On 03/20/2012 07:44 AM, Micah Cowan wrote:
> On 03/20/2012 02:01 AM, Paul Wratt wrote:
>> so let me reiterate others on the list:
>> It is possible for wget to get a true response to 206, but fail to
>> "seek to partial start", rather starting from 0. If the file is
On 03/20/2012 02:01 AM, Paul Wratt wrote:
> so let me reiterate others on the list:
> It is possible for wget to get a true response to 206, but fail to
> "seek to partial start", rather starting from 0. If the file is of unknown
> length it may be appended to the end of the current file
I'm having a little tr
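(A hedged sketch of the failure being described, not wget's code: on a 206 response the output file must be positioned at the resume offset before any body bytes are written; otherwise the partial body lands at byte 0, or gets blindly appended. Assumes POSIX fseeko.)

    #include <stdio.h>
    #include <sys/types.h>

    FILE *open_for_resume(const char *path, long long resume_at)
    {
        FILE *f = fopen(path, "r+b");   /* keep the bytes already downloaded */
        if (!f)
            return NULL;
        if (fseeko(f, (off_t)resume_at, SEEK_SET) != 0) {
            fclose(f);
            return NULL;
        }
        return f;                       /* writes now continue at resume_at */
    }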
On 03/20/2012 12:00 AM, Ray Satiro wrote:
> Actually it looks like there is a problem with some later versions.
>
> ---request begin---
> GET /fedora/releases/16/Fedora/i386/iso/Fedora-16-i386-DVD.iso HTTP/1.1
> Range: bytes=-2147483648-
> User-Agent: Wget/1.13.1 (mingw32)
> Accept: */*
> Host: m
On 03/19/2012 01:13 PM, JD wrote:
> I am sorry -
> Range requests??
> How can I see that when I run wget -c
> You're asking for info I am at a loss as to how to obtain.
Sorry, I was slipping into potential technical explanations. You don't
need to know what ranged requests are.
As long as y
On 03/19/2012 01:06 PM, Henrik Holst wrote:
> Considering that the failing file in question is 3.5GiB it's probably a
> signed 32-bit problem with the size and/or range in either wget or the
> server. Would be interesting to see the range requests done by your version
> of wget around the signed 32
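(The header quoted above is exactly what a signed 32-bit wraparound produces; a stand-alone demonstration, noting that the out-of-range conversion is implementation-defined but wraps this way on common two's-complement platforms:)

    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        int64_t size = INT64_C(3758096384);  /* 3.5 GiB in bytes */
        int32_t as32 = (int32_t)size;        /* wraps to -536870912 */
        printf("true size %" PRId64 ", as int32 %" PRId32 "\n", size, as32);

        /* A resume offset of 2 GiB wraps to INT32_MIN, rendering exactly
         * as the Range header quoted earlier in this thread: */
        int32_t offset = (int32_t)INT64_C(2147483648);
        printf("Range: bytes=%" PRId32 "-\n", offset); /* bytes=-2147483648- */
        return 0;
    }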
On 03/19/2012 12:09 PM, JD wrote:
> I honestly do not recall where I downloaded it from.
> Also, I do not have build tools, build env on my win XP laptop.
>
>
> On Mon, Mar 19, 2012 at 12:32 PM, Micah Cowan wrote:
>
>> Binary packages aren't provided on the GNU web
On 03/18/2012 11:50 AM, Boris Bobrov wrote:
> In a message of Sunday 18 March 2012 03:15:01, Micah wrote:
>> On 03/17/2012 09:45 AM, Boris Bobrov wrote:
> Hello!
> I've noticed the task with adding concurrency to wget and was really
> happy to see that wget will soon get that feature - I needed