Hi, 
I have a script that shells out to run a wget command in order to sync a
web site with a local folder. The web site is the source and the local
folder is the destination.
Currently it just blindly copies the web site down to the local folder
without performing any checks. I would like to modify it to verify that the
local and server copies of each file are the same size, and skip the ones
that already match, to reduce traffic during the transfer and shorten the
script's run time.
My connection to the web server is fast, but there are several (~100) GB of
data to be synced.
Anyone have any ideas?


regards, 
Conor 


--------------------------------------

If possible, how about syncing via FTP? It would be much easier and faster
to compare timestamps and sizes.
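
Something along these lines, for instance. This is only a rough, untested
sketch; the host, login, and paths are placeholders, and Net::FTP's size()
and mdtm() only work if the server supports the SIZE and MDTM commands:

    use strict;
    use warnings;
    use Net::FTP;

    # Placeholders: host, login, and paths would come from your own config.
    my $ftp = Net::FTP->new('ftp.example.com', Timeout => 60)
        or die "Cannot connect: $@";
    $ftp->login('anonymous', 'me@example.com')
        or die "Login failed: ", $ftp->message;
    $ftp->binary;

    my $remote = 'pub/data/big.zip';
    my $local  = 'C:/mirror/data/big.zip';

    my $rsize = $ftp->size($remote);   # remote size in bytes (SIZE command)
    my $rtime = $ftp->mdtm($remote);   # remote mtime as epoch time (MDTM command)

    # Download only if the local copy is missing, a different size,
    # or older than the remote file.
    my $same = -e $local
        && defined $rsize && $rsize == -s $local
        && defined $rtime && $rtime <= (stat $local)[9];

    unless ($same) {
        $ftp->get($remote, $local) or warn "get failed: ", $ftp->message;
    }

    $ftp->quit;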

Also, if you continue to post in HTML, the dinosaurs using antique email
clients will complain.

- Chris

_______________________________________________
Perl-Win32-Users mailing list
Perl-Win32-Users@listserv.ActiveState.com
To unsubscribe: http://listserv.ActiveState.com/mailman/mysubs
