On Tue, Nov 23, 2010 at 6:41 PM, Daniel Pittman <dan...@rimspace.net> wrote:

> Ashley Penney <apen...@gmail.com> writes:
>
> > As an example of the kind of thing we're talking about, we use a product
> > called Sonatype Nexus that relies on a bunch of on-disk data in
> > /srv/sonatype-nexus/.  When installing the system for the first time (for
> > example, when the file{} containing the .war triggers) we would like it to
> > automatically put down a copy of /srv/sonatype-nexus/.  We obviously don't
> > want this drifting out of sync with the production data, which is where
> > the issue is.  How do other people handle this?
>
> Package those data files yourself, if necessary including logic in the
> package to ensure that you don't overwrite valuable local changes.  Then
> use puppet to ensure that package is either 'installed' or 'latest'.
>

I suppose this is possible, but awkward.  Another example is a horrible Java
CMS we use that writes numerous XML files with random names all over the
place during operation.  There are cache directories, it constantly rewrites
various bits of configuration XML, and it spews logs everywhere.  Packaging
something like that in a way that remains functional is almost impossible.
When we want to reinstall or clone that server, we just copy the entire
directory and then run Puppet to change a few key XML files.  Something like
that is difficult to package, and the files you would package change
frequently due to patches and internal development on top of the CMS.


>
> > Our options seem to be:
> >
> > * Nightly/hourly backups of production data to some location where Puppet
> >   can rsync/wget/shovel it out when needed.
> > * Some kind of process that real-time syncs directories to NFS storage.
> > * Erroring if the data is missing in some fashion when Puppet runs and
> >   relying on sysadmins to put it in place.
>
> ...or making it available as a puppet file server, and using puppet to put
> it in place.
>

In our experience that is almost unusably slow for data sets this size.
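That's why, when we've sketched out the backup-location option, it's looked
more like shipping a tarball over HTTP and letting Puppet trigger the fetch.
Everything here is made up (hostname, URL, paths), but roughly:

```puppet
# The puppet fileserver checksums every file on every run, which is
# what makes it crawl on big trees; fetching one tarball over HTTP
# sidesteps that.  'creates' keeps this to the first install only.
exec { 'fetch-nexus-seed':
  command => '/usr/bin/wget -qO- http://backuphost/nexus-seed.tar.gz | /bin/tar -xz -C /srv',
  creates => '/srv/sonatype-nexus',
  timeout => 0,
}
```

The open question is still how the tarball on backuphost gets refreshed,
which is the syncing problem all over again.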


>
> > We've talked through the options but they all have fairly significant
> > drawbacks.  My personal favorite solution would be some kind of daemon
> > that syncs data constantly and is capable of intelligently syncing the
> > data back to the node if it goes missing.  It could be potentially error
> > prone but it represents the least bad choice.
>
> You could potentially just use:
>
>    file { "/example":
>      source  => 'puppet:///modules/example',
>      replace => false,
>    }
>
> That will only put the file in place if it doesn't already exist.
>

Hmm, I always forget about replace => false.  I wonder if it carries the
same awful speed penalties.  My issue with this is still the hassle of
constantly syncing the changing files back into Puppet; that's why I was
looking for some kind of semi- or fully-automated syncing mechanism for
something like this.  It's mostly Java apps that are especially bad for
this.  Most open source software sticks its data into a database, or at
least a single easily dealt-with directory.  Java explodes all over the
place like some kind of evil virus.
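For the record, applying replace => false to a whole tree would look
something like the following (the module path is an assumption on my part,
and I suspect the recursive checksumming means the speed problem remains):

```puppet
# Seed an entire directory from the fileserver, touching only files
# that don't exist yet on the node.  Caveat: Puppet still walks and
# checksums the whole tree on every run, so this is likely slow on
# large data sets.
file { '/srv/sonatype-nexus':
  ensure  => directory,
  source  => 'puppet:///modules/nexus/sonatype-nexus',
  recurse => true,
  replace => false,
}
```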

-- 
You received this message because you are subscribed to the Google Groups 
"Puppet Users" group.
To post to this group, send email to puppet-us...@googlegroups.com.
To unsubscribe from this group, send email to 
puppet-users+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/puppet-users?hl=en.
