When https_proxy is set, wget 1.8.2 passes the full GET request string to
the proxy. This is incorrect behavior: to tunnel HTTPS, the client should
send the proxy a CONNECT host:port request, per RFC 2817.
I am unfamiliar with the wget source tree, but am going to look into
writing a patch.
--
Gus
I have found that the -k option does not work on downloaded FTP files.
The key problem seems to be that register_download is never called for
FTP files, because local_file is never set by calls to ftp_loop the way
it is by calls to http_loop.
So, I added local_file as a parameter to ftp_loop an
It's already fixed in CVS for 1.9.
Max.
Ivan A. Bolsunov <[EMAIL PROTECTED]> wrote:
> version: 1.8.1
> in file: html-url.c
> in function:
>
> tag_handle_meta()
> {
> ... skipped ...
> char *p, *refresh = find_attr (tag, "content", &attrind);
> int timeout = 0;
>
> for (p = refr
Hi,
I am using wget to fetch a tree of HTML pages. There is one page in the
tree which is quite large and which I don't need. How can I explicitly
exclude this single HTML file from being fetched? The --reject option
accepts only filename extensions.
Thank you
Matthias
wget - the new way of surfing!
ROMNEY Jean-Francois <[EMAIL PROTECTED]> wrote:
> I can't download files with wget 1.8.1 by using wildcards.
> Without wildcards, it works. The option "--glob=on" seems to have no
> effect. The command is :
> wget -d --glob==on -nc
> ftp://--:---@;ftpcontent.mediapps.com/francais/journal/e
version: 1.8.1
in file: html-url.c
in function:

tag_handle_meta()
{
  ... skipped ...
  char *p, *refresh = find_attr (tag, "content", &attrind);
  int timeout = 0;

  for (p = refresh; ISDIGIT (*p); p++)
  ... skipped ...
}
BUG description:
find_attr() MAY return NULL, but this is NOT checked: when the meta tag
has no "content" attribute, the for loop dereferences a NULL pointer.