Ok, how does this sound to you guys? I've set up a server called 'catalogs1' on my private network. It's cut off from the public net, just a basic Windows Server. I've added a second drive and created a folder called 'catalogs' on it.
I've set that 'catalogs' folder up as a share and created a single user that can access it, with the same username/password that OpenBD runs its service as. The share is accessible via '\\catalogs1\catalogs'.

What I'm going to do is create two versions of each catalog, 'Master' and 'Local', named 'example_master' and 'example_local'. Updates will be done to the Master catalog. I'll then create a scheduled task that runs on each webserver and will:

1. Check that the 'master' catalog is available. (If we're doing a backup or have taken it offline, the task just skips the entire process and the copy gets updated the next time around.)
2. Copy the 'master' catalog over as 'example_local_new'.
3. Confirm the copy is successful.
4. Delete the existing 'example_local' and rename 'example_local_new' to 'example_local'.
5. Complete!

This way I can keep a master copy of each catalog isolated on a secured server and can make backups without affecting the local copy on the webserver. The webserver also gets the performance advantage of using a local copy.

Thoughts?

--
official tag/function reference: http://openbd.org/manual/
mailing list - http://groups.google.com/group/openbd?hl=en
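In case it helps make the discussion concrete, the scheduled-task steps above could be sketched roughly like this. This is just a Python sketch under my own assumptions, not what the actual task would be: the function name, the paths, and the idea that a catalog is a directory of files are all hypothetical stand-ins; the real task might be a batch/robocopy script instead.

```python
# Hedged sketch of the copy-verify-swap steps. All names here
# (refresh_local_catalog, the example paths) are hypothetical.
import os
import shutil


def refresh_local_catalog(master, local):
    """Refresh a local catalog copy from the master.

    master: path to the master catalog (e.g. under \\catalogs1\catalogs)
    local:  path to the local catalog on the webserver
    Returns True if the local copy was refreshed, False if skipped.
    """
    # 1. If the master isn't available (backup in progress, taken
    #    offline), skip the whole run; we'll catch the next update.
    if not os.path.isdir(master):
        return False

    staging = local + "_new"

    # 2. Copy the master over as '<local>_new'.
    shutil.rmtree(staging, ignore_errors=True)  # clear any failed prior run
    shutil.copytree(master, staging)

    # 3. Confirm the copy succeeded (here: same relative file names
    #    and sizes on both sides; a real task might compare hashes).
    def manifest(root):
        return {
            (os.path.relpath(os.path.join(d, f), root),
             os.path.getsize(os.path.join(d, f)))
            for d, _, files in os.walk(root) for f in files
        }

    if manifest(master) != manifest(staging):
        shutil.rmtree(staging, ignore_errors=True)
        return False

    # 4. Delete the existing local copy and rename the new one into place.
    shutil.rmtree(local, ignore_errors=True)
    os.rename(staging, local)
    return True  # 5. Complete!
```

The nice property of steps 2-4 is that the running webserver only ever sees either the old complete copy or the new complete copy, never a half-finished transfer, since the slow network copy happens under the '_new' name.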
