On 15.02.2013 18:22, Akom The Benevolent wrote:
In my company, binaries for deployment are released into an FTPS repo
(they are not packages). The simplest option for making them available
via *puppet:///* urls is to constantly sync the FTPS contents to the
modules directory on the master, but the volume of binaries is
ever-growing and only a small percentage is actually used by the
catalogs for any specific deployment, so that's not ideal.

I'm starting to feel like I'm heading down an incorrect path altogether
- would the experts be able to offer some direction?

You are currently in a situation where you have neither control over nor knowledge of updates on the FTPS site. This is bad. Any solution on your side can only be a hack.

That said, I would try to sync the complete FTPS repo to the master asynchronously and serve the binaries from the sync destination. That means a slight delay between the time the binaries are published and the time they are available, but catalog compiles and agent runs are not affected.
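For illustration, a minimal sketch of that setup (hostname, credentials and paths here are made up, and lftp is just one of several tools that can mirror an FTPS site):

  # cron entry on the master: mirror the FTPS repo every 15 minutes
  */15 * * * *  lftp -u deploy,secret -e 'mirror --only-newer /releases /srv/binaries; quit' ftps://ftp.example.com

  # /etc/puppet/fileserver.conf: expose the sync destination as its own mount point
  [binaries]
    path /srv/binaries
    allow *

Agents can then pull files with source => 'puppet:///binaries/<path>' without the payload ever living inside a module directory.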

The next step up would be to sync, then package. That way you get local control over which binaries are deployed and all the goodness of your native packaging format.
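A sketch of that second stage (package name, version and paths are assumptions, and fpm is only one of several ways to do the wrapping):

  # wrap one synced binary tree into an RPM on the master
  fpm -s dir -t rpm \
      -n myapp -v 1.2.3 \
      --prefix /opt/myapp \
      -C /srv/binaries/myapp/1.2.3 .

  # then manage it like any other package
  package { 'myapp':
    ensure => '1.2.3',
  }

The same fpm invocation with -t deb works for Debian-family nodes.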

Finally, the best solution would of course be a proper upstream package repository instead of a heap-of-binaries.
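If you get that far, a rough sketch of the consuming side (repo URL and repo name are invented for the example):

  # on the repo host, after copying the packages into place
  createrepo /srv/repo/internal

  # on the nodes, via Puppet's built-in yumrepo type
  yumrepo { 'internal':
    baseurl  => 'http://repo.example.com/internal',
    enabled  => '1',
    gpgcheck => '0',
  }

(For apt-based nodes you would publish with reprepro or similar and manage the source entry from a module instead.)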


Best Regards, D
