Hi everybody,

This is my first time posting here. I want to download a site to my computer. My situation is as follows:

1. My Internet connection is time-limited; I cannot get online at night.
2. The power supply is also cut off at night, so I cannot keep my computer running continuously.
3. The connection to that site is not very stable, and its latency is fairly high.

I tried to download the site like this:

    wget -r -T 10 -c -S http://some.site

My problem is: after days of retrieving, every time I restart wget it spends a long time checking whether the already-downloaded files are complete or up to date, often longer than the actual transfer time takes. Given the large number of files on that site, I'm afraid that as time goes by wget will spend all its time checking existing files and hardly transfer anything new. Is there any trick to avoid that situation?
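One thing worth trying, based on wget's documented --no-clobber behavior (a suggestion, not something from the original post): replace -c with -nc. With -r and -nc together, wget treats any file that already exists locally as finished and skips it without asking the server, and it re-parses local .html files for links instead of re-fetching them, so a restarted run moves straight to the files it has not fetched yet. Note that -nc and -c are mutually exclusive, so a file that was only partially downloaded when the power went off will be re-fetched from scratch on the next run.

```shell
# Sketch: skip files already on disk instead of re-checking each one
# against the server. -nc (--no-clobber) makes a restarted recursive
# run treat existing local files as done; wget parses the local .html
# copies for further links rather than contacting the server for them.
# -c is dropped because wget refuses to combine it with -nc.
wget -r -nc -T 10 -S http://some.site
```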
My second problem is: I want to write a script that downloads with multiple parallel workers while avoiding race conditions. Can anyone help me?

Thanks, everybody.

Luo Mengyu
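One common shell-level approach is a sketch like the following (it assumes you have first extracted the site's URLs into a file, here called urls.txt; that filename and the worker count are my assumptions, not from the post). GNU/BSD xargs with -P fans the list out to several wget processes, and handing each worker exactly one distinct URL at a time means no two processes ever write the same output file, which avoids the race condition as long as distinct URLs map to distinct local filenames:

```shell
#!/bin/sh
# Sketch: parallel downloads without races. Assumes urls.txt holds
# one URL per line (hypothetical filename, not from the original post).
#
# xargs -P 4 keeps up to 4 wget processes running at once;
# -n 1 hands each process a single, distinct URL, so no two
# workers ever write to the same local file.
xargs -P 4 -n 1 wget -nc -T 10 < urls.txt
```

If duplicate URLs can appear in the list, deduplicate it first (for example with `sort -u urls.txt`) so that two workers can never claim the same file.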