I would like to FTP four files from a host. The files are in a large
directory, and their names differ enough that wildcards won't do it.
If I run the following command:
wget -N -i Files.txt
where Files.txt is:
ftp://ftp.f-prot.is/pub/fp-def.zip
ftp://ftp.f-prot.is/pub/fp-def.asc
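A minimal sketch of that approach, assuming GNU wget is installed; the two
URLs are the ones shown in the post, and the remaining files would be added
one per line:

```shell
# Build the URL list, one URL per line (the other two files from the
# post would be appended here as well).
cat > Files.txt <<'EOF'
ftp://ftp.f-prot.is/pub/fp-def.zip
ftp://ftp.f-prot.is/pub/fp-def.asc
EOF

# -N: timestamping - re-download only if the remote file is newer
# -i: read the URLs to fetch from the given file
# The guard and "|| true" keep the sketch from aborting when wget or
# the network is unavailable.
command -v wget >/dev/null && wget -N -i Files.txt || true
```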
Title: JobNews - a recruiting site where companies making honest
proposals and talented candidates meet!
A web recruiting site specializing in job offers and job hunting!
Hello, this is JobNews, an employment-specialist site. This mail was
sent to inform you of JobNews' new site reorganization. It is sent
once every two to three months. If you do not wish to receive
Ryan Daniels [EMAIL PROTECTED] writes:
The following command line causes a Segfault on my system:
wget -spider http://www.yahoo.com
Note that the correct syntax is `--spider', and that this (currently
defunct) option does not accept arguments.
But the bug you've uncovered is real: you can
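The distinction between `-spider' and `--spider' matters because a single
dash introduces bundled short options. A small demonstration of that parsing
rule using plain shell getopts (this is not wget's actual parser):

```shell
# getopts reads "-spider" as the six bundled short options
# s, p, i, d, e, r - which is why it is not the same as "--spider".
parse() {
  local OPTIND opt out=""
  while getopts "spider" opt "$@"; do
    out="$out$opt"
  done
  echo "$out"
}
parse -spider   # prints: spider
```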
Thomas Reinke [EMAIL PROTECTED] writes:
Ok, either I've completely misread wget, or it has a problem
mirroring SSL sites. It appears that it is deciding that the
https:// scheme is something that is not to be followed.
That's a bug. Your patch is close to how it should be fixed, with two
Herold Heiko [EMAIL PROTECTED] writes:
But that's not the real issue here - why -i for input but not for the
other options? A consistent interface should allow something like
--file-char=@ -@Rfilename -@Aotherfilename etc., i.e. accept a
filename everywhere an option is allowed.
This is a neat idea,
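The response-file idea above can be sketched as a thin wrapper. The @FILE
convention and the helper name here are hypothetical, invented for
illustration; nothing like this exists in wget:

```shell
# expand_at_args: print each argument, but replace any argument of the
# form @FILE with the lines of FILE (a hypothetical response-file
# convention, not a real wget feature).
expand_at_args() {
  for a in "$@"; do
    case $a in
      @*) cat "${a#@}" ;;
      *)  printf '%s\n' "$a" ;;
    esac
  done
}

# Usage: options stored one per line in a file, expanded in place.
printf '%s\n' -r -l --no-parent > opts.txt
expand_at_args -N @opts.txt http://example.com/
```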
Peter Gucwa @ IIS-RTP [EMAIL PROTECTED] writes:
option -k does not work in following call:
wget -k -r -l 1 http://www.softcomputer.com/cgi/jobs.cgi
What version of Wget are you using?
How exactly does it not work? What did you expect to happen, and what
happened instead?
Robin B. Lake [EMAIL PROTECTED] writes:
In a prior posting, I asked about saving an image from a Web page
instead of just saving the information necessary to re-retrieve that
image. I was advised to try -p -k --html-extension
Using wget-1.8.1-pre2, I still don't see the image data saved
Brendan Ragan [EMAIL PROTECTED] writes:
This is the problem i'm having with an older wget (1.5.3) when i
enter the url
'http://www.tranceaddict.com/cgi-bin/songout.php?id=1217-dirty_dirty&month=dec'
it goes
Connecting to www.tranceaddict.com:80... connected!
HTTP request sent,
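Archived snippets like the one above often lose the '&' between query
parameters to HTML processing; on the command line the same character must
be quoted, since an unquoted '&' backgrounds the command and discards the
rest of the URL. A small illustration with a placeholder URL:

```shell
# Single quotes keep '?', '&', and '=' from being interpreted by the
# shell. The URL here is a placeholder, not the one from the post.
url='http://example.com/songout.php?id=42&month=dec'
echo "$url"   # prints the full URL, '&' intact
```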
Ian Abbott [EMAIL PROTECTED] writes:
On 4 Jan 2002 at 12:22, Bastiaan Stougie wrote:
wget -P $LOCALDIR -m -np -nH -p --cut-dirs=2
http://host/dir1/dir2/
This works fine, except that wget does not follow all the urls. It
skips urls like:
<A HREF="//host/dir1/dir2/file">text</A>
Wow, I
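The HREF above is a scheme-relative (network-path) reference: //host/path
inherits the scheme of the referring page, per RFC 2396. A tiny sketch of
that resolution rule, as an illustrative helper rather than wget's code:

```shell
# resolve SCHEME URL: prefix scheme-relative URLs (//host/...) with
# the referring page's scheme; pass other URLs through unchanged.
resolve() {
  case $2 in
    //*) printf '%s:%s\n' "$1" "$2" ;;
    *)   printf '%s\n' "$2" ;;
  esac
}
resolve http //host/dir1/dir2/file   # prints: http://host/dir1/dir2/file
```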
Hrvoje Niksic wrote:
Thomas Reinke [EMAIL PROTECTED] writes:
Ok, either I've completely misread wget, or it has a problem
mirroring SSL sites. It appears that it is deciding that the
https:// scheme is something that is not to be followed.
That's a bug. Your patch is close to