Hi list,

Try this; you may need to do a little work, but the outcome will be outstanding.
This question has been asked a hundred times before, yet I'm asking it once more because none of the other answers gave me a great solution.
Can anyone suggest a better way, or a script, to sync local website files with those on a remote server over FTP?
TIA
--arky
wget -x -c -m -nH --http-user=someuser --http-password=somepass <url here>
The URL scheme does not matter; it can be ftp, http, or https (obviously, for https you need SSL support).
However, over http/https your scripts will not get downloaded properly: the server executes them and returns their output instead of the source.
If you want the script sources as well, set up an alias for the directory to be mirrored in your remote web server's Apache configuration, using the options below to force Apache not to run the scripts. Also make sure some kind of authentication is enabled for that location; otherwise anyone will be able to get at the source of those files.
For example, suppose your site on the remote machine has this configuration:
Local directory: /var/www/myhome
and it is set up with an alias like Alias /myhome /var/www/myhome,
so that you access it as http://www.example.com/myhome/.
Then set up this:
<Directory /var/www/myhome>
    AuthUserFile "/usr/local/apache/private/password"
    AuthName "***Special Web Access***"
    AuthType Basic
</Directory>
Alias /websrc-myhome /var/www/myhome
<Location /websrc-myhome>
<Limit GET PUT POST DELETE PROPFIND PROPPATCH MKCOL COPY MOVE LOCK UNLOCK>
Require valid-user
</Limit>
# Important: no script will be run for this Location; you will get the files as-is
ForceType text/plain
</Location>
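Putting the pieces together, here is a hedged sketch of the mirror command pointed at that alias (the host, user, and password are placeholders, not values from an actual setup; older wget versions spell the password option --http-passwd instead):

```shell
# Hypothetical invocation: mirror the script *sources* through the
# authenticated /websrc-myhome alias configured above.
wget -x -c -m -nH \
    --http-user=someuser --http-password=somepass \
    http://www.example.com/websrc-myhome/
```

Because of the ForceType text/plain directive, every file under that alias comes back verbatim rather than being executed.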
PS:
Use --cut-dirs to strip leading directory components.
For example, if your files live under http://www.example.com/myhome/home/web and you pass
--cut-dirs=1
then the resulting local directories will be home/web (together with the -nH option); without -nH they will look like www.example.com/home/web.
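The path arithmetic above can be checked without touching the network; this little sketch just mimics wget's layout rules for the example URL (it does not invoke wget itself):

```shell
# Simulate the local directory layout wget produces for
# http://www.example.com/myhome/home/web under different switches.
host="www.example.com"
path="myhome/home/web"

# -nH --cut-dirs=1: hostname dropped, first path component ("myhome") cut
with_nh=$(echo "$path" | cut -d/ -f2-)

# --cut-dirs=1 without -nH: hostname kept, first path component still cut
without_nh="$host/$(echo "$path" | cut -d/ -f2-)"

echo "$with_nh"      # home/web
echo "$without_nh"   # www.example.com/home/web
```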
That is it. You can use this technique with any other tool, including curl, lftp, ftp, or any other damn good client. I personally use wget for its ease of use and lack of cumbersome command-line switches, but curl is very good at maintaining cookies and accessing URLs via POST (which makes it better suited to testing during development than to mirroring).
Regards -Chandra
_______________________________________________
linux-india-help mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/linux-india-help
