Ok, either I've completely misread wget, or it has a problem
mirroring SSL sites. It appears to be deciding that
the https:// scheme is something that is "not to be followed".
For those interested, the offending code appears to be 3 lines
in recur.c which, if changed to treat the HTTPS scheme
It seems that SSL sites aren't crawled properly, because wget
decides that the scheme is not to be followed. The offending code
appears to be limited to only 3 lines in recur.c
(version 1.8.1):
Line 440: change to
if (u->scheme != SCHEME_HTTP && u->scheme != SCHEME_HTTPS)
Line 449: c
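For what it's worth, here is a small self-contained sketch of the logic the
report describes. It is not wget's actual recur.c code (only the one changed
condition is quoted above); the struct, the function name followable, and the
FTP case are invented here purely for illustration.

/* Illustration only: a tiny model of the check described above.  The real
   code lives in wget's recur.c and uses wget's own URL structures; apart
   from SCHEME_HTTP/SCHEME_HTTPS and u->scheme, which the report quotes,
   every name below is made up for this sketch. */
#include <stdio.h>

enum url_scheme { SCHEME_HTTP, SCHEME_HTTPS, SCHEME_FTP };
struct url { enum url_scheme scheme; };

/* Before the change the test was effectively "follow only if the scheme is
   HTTP"; widening it as suggested also lets https:// links be followed. */
static int
followable (const struct url *u)
{
  if (u->scheme != SCHEME_HTTP && u->scheme != SCHEME_HTTPS)
    return 0;                   /* e.g. ftp:// links are still skipped */
  return 1;
}

int
main (void)
{
  struct url https_link = { SCHEME_HTTPS };
  printf ("https followable: %d\n", followable (&https_link));
  return 0;
}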
Hello ...
... try using the
"-p, --page-requisites get all images, etc. needed to display HTML page"
option (requires wget >= 1.6)
Bye
On Sat, 29 Dec 2001, Robin B. Lake wrote:
> I'm using wget to save a "tick chart" of a stock index each night.
>
> wget -nH -q -O /QoI/working/CHARTS/$myday+OEX.h
I'm using wget to save a "tick chart" of a stock index each night.
wget -nH -q -O /QoI/working/CHARTS/$myday+OEX.html
'http://bigcharts.marketwatch.com/quickchart/quickchart.asp?symb=%24OEX&sid=0&o_symb=%24OEX&x=60&y=15&freq=9&time=1'
The Web site returns an image, whose HTML is:
http://cha
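To make the --page-requisites suggestion from the reply above concrete, the
command might look something like this (illustrative only: -O is dropped
because -p saves the page and its inline images as separate files, and -P is
used here just to pick a target directory; the URL is the one quoted above):

wget -nH -q -p -P /QoI/working/CHARTS \
  'http://bigcharts.marketwatch.com/quickchart/quickchart.asp?symb=%24OEX&sid=0&o_symb=%24OEX&x=60&y=15&freq=9&time=1'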
Thomas Reinke <[EMAIL PROTECTED]> writes:
> Neat... not sure that I really know enough to start digging to easily
> figure out what went wrong, but it can be reproduced by running the
> following:
>
> $ wget -d -r -l 5 -t 1 -T 30 -o x.lg -p -s -P dir -Q 500
> --limit-rate=256000 -R mpg,
"Jiang Wei" <[EMAIL PROTECTED]> writes:
> I tried to download a whole directory on an FTP site by using the `-r -np'
> options, and I have to go through a firewall
> via http_proxy/ftp_proxy. But it failed: wget-1.8.1 only retrieved the
> first indexed FTP file list and stopped working, while wget-1.5.
Jean-Edouard BABIN <[EMAIL PROTECTED]> writes:
> I found a little bug when downloading from a deleted directory:
[...]
Thanks for the report.
I wouldn't consider it a real bug. Downloading things into a deleted
directory is bound to produce all kinds of problems.
The diagnostic message could
Edward Manukovsky <[EMAIL PROTECTED]> writes:
> Excuse me, please, but I've got a question.
> I cannot set the retry timeout to 30 seconds by doing:
> wget -w30 -T600 -c -b -t0 -S -alist.log -iurl_list
For me, Wget waits for 30 seconds between each retrieval. What
version are you using?
[ Please mail bug reports to <[EMAIL PROTECTED]>, not to me directly. ]
Nuno Ponte <[EMAIL PROTECTED]> writes:
> I get a segmentation fault when invoking:
>
> wget -r
> http://java.sun.com/docs/books/performance/1st_edition/html/JPTOC.fm.html
>
> My Wget version is 1.7-3, the one w