Hrvoje Niksic wrote:
>
> Thomas Reinke <[EMAIL PROTECTED]> writes:
>
> > Ok, either I've completely misread wget, or it has a problem
> > mirroring SSL sites. It appears that it is deciding that the
> > https:// scheme is something that is "not to be followed".
>
> That's a bug. Your patch
"Ian Abbott" <[EMAIL PROTECTED]> writes:
> On 4 Jan 2002 at 12:22, Bastiaan Stougie wrote:
>
>> wget -P $LOCALDIR -m -np -nH -p --cut-dirs=2
>> http://host/dir1/dir2/
>>
>> This works fine, except that wget does not follow all the URLs. It
>> skips URLs like:
>>
>> text
Wow, I had no idea
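For readers puzzling over that command: --cut-dirs strips the given number of leading remote directory components after -nH has already removed the hostname. A minimal pure-shell sketch of that trimming (illustrative only, not wget's actual code; the path is a placeholder):

```shell
# Simulate the path trimming that -nH --cut-dirs=N performs on the
# remote path before saving locally (no network involved).
cut_dirs() {  # usage: cut_dirs N remote/path
  n=$1; p=$2
  i=0
  while [ "$i" -lt "$n" ]; do
    p=${p#*/}        # drop one leading directory component
    i=$((i + 1))
  done
  printf '%s\n' "$p"
}

cut_dirs 2 dir1/dir2/file.html   # -> file.html
```

So with -nH --cut-dirs=2, http://host/dir1/dir2/file.html is saved as just file.html under the -P directory.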
Brendan Ragan <[EMAIL PROTECTED]> writes:
> This is the problem I'm having with an older wget (1.5.3) when I
> enter the URL
>
> 'http://www.tranceaddict.com/cgi-bin/songout.php?id=1217-dirty_dirty&month=dec'
>
> it goes
>
> Connecting to www.tranceaddict.com:80... connected!
> HTTP request se
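One thing worth ruling out with URLs like that, independent of whatever 1.5.3 itself was doing (the report is truncated): an unquoted `&` makes the shell put the command in the background and treat `month=dec` as a separate variable assignment. Quoting keeps the URL intact, as this network-free demonstration shows:

```shell
# The URL from the report; single quotes stop the shell from
# interpreting '&' and '?' before wget ever sees the argument.
url='http://www.tranceaddict.com/cgi-bin/songout.php?id=1217-dirty_dirty&month=dec'

# Quoted expansion preserves the whole string, query included:
printf '%s\n' "$url"
```

The user's original command did quote the URL, so this particular pitfall was avoided there.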
"Robin B. Lake" <[EMAIL PROTECTED]> writes:
> In a prior posting, I asked about saving an image from a Web page
> instead of just saving the information necessary to re-retrieve that
> image. I was advised to try -p -k --html-extension
>
> Using wget-1.8.1-pre2, I still don't see the image data
"Peter Gucwa @ IIS-RTP" <[EMAIL PROTECTED]> writes:
> option -k does not work in following call:
> wget -k -r -l 1 http://www.softcomputer.com/cgi/jobs.cgi
What version of Wget are you using?
How exactly does it not work? What did you expect to happen, and what
happened instead?
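For context, -k (--convert-links) post-processes the downloaded documents so their links point at the local copies instead of back at the server. A rough illustration of the kind of rewrite it performs, done here with sed on a one-line sample (this is not wget's implementation, just the transformation it aims for; the sample markup is made up):

```shell
# A link as it appears in the fetched page:
page='<a href="http://www.softcomputer.com/cgi/jobs.cgi">Jobs</a>'

# Rewrite the absolute URL into a path relative to the mirror root,
# which is roughly what --convert-links does after the retrieval:
printf '%s\n' "$page" | sed 's|href="http://www.softcomputer.com/|href="|'
# -> <a href="cgi/jobs.cgi">Jobs</a>
```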
Herold Heiko <[EMAIL PROTECTED]> writes:
> But that's not the real issue here: why -i for input but not for the
> others? A consistent interface should allow something like
> --file-char=@ -@Rfilename -@Aotherfilename etc., i.e. accept a
> filename wherever an option is allowed.
This is a neat id
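Until a --file-char mechanism like that exists, one workaround is to build an option's comma-separated value from a file with command substitution. A sketch (the file name and patterns are made up for illustration; echo turns the last line into a dry run):

```shell
# One reject pattern per line, the shape a hypothetical -@Rfilename
# would read:
printf '%s\n' '*.gif' '*.jpg' > rejects.txt

# Join the lines into the comma-separated form -R actually expects:
rlist=$(paste -sd, rejects.txt)   # -> *.gif,*.jpg

# Dry run; drop 'echo' to really invoke wget:
echo wget -R "$rlist" -r http://example.com/
```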
Thomas Reinke <[EMAIL PROTECTED]> writes:
> Ok, either I've completely misread wget, or it has a problem
> mirroring SSL sites. It appears that it is deciding that the
> https:// scheme is something that is "not to be followed".
That's a bug. Your patch is close to how it should be fixed, with
Ryan Daniels <[EMAIL PROTECTED]> writes:
> The following command line causes a Segfault on my system:
>
> wget -spider "" http://www.yahoo.com
Note that the correct syntax is `--spider', and that this (currently
defunct) option does not accept arguments.
But the bug you've uncovered is real: y
Title: JobNews - the job-offer/job-seeking web recruiting site where
companies and talent meet through honest proposals!
Hello, this is JobNews, the employment site. This mail is being sent
to announce JobNews's site relaunch. It is sent once every two to
three months. If you do not wish to receive
I would like to FTP four files from a host. The files are in a large
directory, and are named differently enough that wildcards won't do it.
If I write the following routine:
WGET -N -i Files.txt
where Files.txt is:
ftp://ftp.f-prot.is/pub/fp-def.zip
ftp://ftp.f-prot.is/pub/fp-def.asc
ftp://ftp.f-pr
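The approach itself is sound; a self-contained sketch using the two complete URLs from the message (the third is truncated above and left out; echo makes the final line a dry run):

```shell
# One URL per line, exactly the format -i reads:
cat > Files.txt <<'EOF'
ftp://ftp.f-prot.is/pub/fp-def.zip
ftp://ftp.f-prot.is/pub/fp-def.asc
EOF

# -N compares timestamps so files unchanged on the server are skipped.
# Dry run; drop 'echo' to fetch for real:
echo wget -N -i Files.txt
```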
11 matches