Happy New Year - May this be one of your best.
If this e-mail has reached you in error and you wish to be removed
from this mailing list, carefully follow the removal instructions
below.
Just one click on the delete key and I am gone. Before you do that,
think about this: you can own one of th
While using wget to FTP a file over 2 GB, I got an error saying it
does not support files over 2 GB.
Is this support planned, or is this a bug? I'm running Red Hat 7.2, which
includes large file support, and I'm even downloading to SGI's XFS
filesystem. I can use the other standard GNU utils to create
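For reference, "large file support" means file offsets are 64-bit rather than signed 32-bit, so sizes past 2^31 bytes (2 GiB) become representable; wget builds of that era were still compiled with 32-bit offsets, hence the failure at exactly 2 GB. A minimal sketch of the boundary on an LFS-capable system, using a sparse file (Python here purely for illustration):

```python
import os
import tempfile

# Create a sparse file just past the 2 GiB boundary (2**31 bytes).
# On a system with large file support this succeeds; a tool built
# with a signed 32-bit off_t fails right at this limit.
limit = 2**31
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.seek(limit)      # seek past the old signed-32-bit offset limit
    f.write(b"\x00")   # sparse file: consumes almost no actual disk space
    path = f.name

size = os.path.getsize(path)
print(size)  # 2147483649, i.e. 2**31 + 1
os.remove(path)
```

The logical size reported is one byte past the old limit, which is exactly the offset a 32-bit build of wget could not represent.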
[Translated from mis-encoded Korean (EUC-KR):]
Hello,
First of all, I apologize for mailing you without permission.. if you don't need this, please delete it.
We sell the latest CDs at low prices:
the latest games, all kinds of programs (graphics, CAD, multimedia, etc.), the latest utilities, adult VCDs, and so on.
We keep a large stock of the CDs you need.
You can, of course, trust us 100 percent.
The attached file (cd list.
Hello bug-wget,
This is the problem I'm having with an older wget (1.5.3). When I
enter the URL
'http://www.tranceaddict.com/cgi-bin/songout.php?id=1217-dirty_dirty&month=dec'
it goes:
Connecting to www.tranceaddict.com:80... connected!
HTTP request sent, awaiting response... 302 Found
Location:
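For context, a 302 Found response hands the client a Location header that it must follow with a second request, which is the step where old wget versions could stumble. A minimal sketch of that exchange against a throwaway local server (the /old and /new paths are hypothetical, standing in for the tranceaddict.com URL):

```python
import http.server
import threading
import urllib.request

class Redirector(http.server.BaseHTTPRequestHandler):
    """Answer /old with a 302 pointing at /new, and /new with a body."""
    def do_GET(self):
        if self.path == "/old":
            self.send_response(302)
            self.send_header("Location", "/new")  # relative Location
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"redirected ok")

    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Redirector)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# urllib follows the 302 automatically, as wget is expected to do
body = urllib.request.urlopen(f"http://127.0.0.1:{port}/old").read()
print(body)  # b'redirected ok'
server.shutdown()
```

Note the relative Location value: resolving it against the original request URL is exactly the kind of step a client can get wrong, and query strings containing `&`, as in the URL above, are another common trouble spot.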