I'm working on building a Fabric deployment for a project with a fairly complex environment. There will be two or more app servers running Django code, a few database servers in some kind of master/slave configuration, a RabbitMQ server, a mail server, some load balancers, etc. -- at least a dozen machines in total. My question is: how should I structure my Fabric scripts for all these servers? I realize this isn't a Fabric group, but I imagine a lot of people here have used Fabric extensively, and Fabric doesn't have its own group...
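For an environment like this, a common starting point is a mapping from server role to the hosts that play that role; Fabric 1.x exposes exactly this shape as `env.roledefs`. A minimal sketch in plain Python -- all hostnames below are placeholders I made up, not part of the actual setup:

```python
# Role -> hosts mapping, the shape Fabric 1.x expects in env.roledefs.
# Every hostname here is a made-up placeholder.
ROLEDEFS = {
    "app":  ["app1.example.com", "app2.example.com"],           # Django app servers
    "db":   ["db-master.example.com", "db-slave1.example.com"], # master/slave databases
    "mq":   ["rabbit1.example.com"],                            # RabbitMQ
    "mail": ["mail1.example.com"],
    "lb":   ["lb1.example.com", "lb2.example.com"],             # load balancers
}

def hosts_for(role):
    """Return the hosts a deployment task should target for a given role."""
    return ROLEDEFS[role]
```

With Fabric itself you would assign such a dict to `env.roledefs` and decorate each task with `@roles("app")`, `@roles("db")`, and so on, so that `fab deploy` fans out to the right machines per task.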
Right now I plan on having one repository for the server config files and deployment scripts, and a separate repo for the Django application code. The deploy repo will have a 'deploy' folder; inside that there will be an 'app' folder containing the fabfile that apt-get installs system packages and pip installs all the requirements for the app servers. There will also be a 'db' folder containing a fabfile with a task for apt-get installing PostgreSQL, a task for copying over the Postgres config files, etc. Eventually I'll have a separate folder for each type of server, each with its own fabfile.

Is this the right way to do it? Does it make much sense to have it split up like that, or would it be easier to have one fabfile for all the different servers?

--
You received this message because you are subscribed to the Google Groups "Django users" group.
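If I've read the plan right, the deploy repo layout being described would look roughly like this (the folder names beyond 'deploy', 'app', and 'db' are my guesses at the remaining server types):

```
deploy-repo/
└── deploy/
    ├── app/
    │   └── fabfile.py   # apt-get system packages, pip install app requirements
    ├── db/
    │   └── fabfile.py   # apt-get install postgres, copy postgres config files, ...
    ├── mq/              # and so on, one folder per server type,
    └── lb/              # each with its own fabfile
```

One trade-off worth noting: with separate fabfiles per folder you cd into each directory and run `fab` there, whereas a single fabfile with role-decorated tasks lets one `fab` invocation orchestrate a whole release across server types.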