I am managing a fairly small set of machines (network security
monitors), and some of these packages are being installed on just two
or three boxes, so spending a lot of time building packages is simply
not worth it. The apps are also updated fairly frequently, and I need
to stay on the bleeding edge :(
I'm firmly of the opinion that the time you invest now in automating
package creation as much as possible will repay itself very quickly
and be of immense value over time.
I think this is even more important when you have a lot of churn:
simply being able to flip between versions of the packages quickly
will save an awful lot of time.
What he said - really. Three immediate benefits:
- having the build process entirely scripted in the spec file ensures a
consistent build process over time.
- the first time you deploy a new version that explodes, rolling back to
a set of known-good binaries is trivial and takes moments.
- since these are sensitive boxes (security monitors), it's very useful
to be able to easily verify the software binaries, using either the
local rpm database or a copy of the original rpm. For a little extra
effort, you can PGP-sign your rpm files too.
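For what it's worth, the verification, rollback, and signing steps
above boil down to a handful of rpm invocations. A rough sketch (the
package name and versions here are hypothetical placeholders, and
signing assumes %_gpg_name is set in ~/.rpmmacros):

```shell
# Verify an installed package's files against the local rpm database
# (reports changed size/mode/digest/ownership; no output means clean).
rpm -V sensor-tool

# Check a standalone rpm file's digests and, if present, its signature.
rpm -K sensor-tool-1.2-1.x86_64.rpm

# Roll back to a known-good older build kept in your repo.
rpm -Uvh --oldpackage sensor-tool-1.1-1.x86_64.rpm

# Sign an rpm with your GPG key (requires %_gpg_name in ~/.rpmmacros).
rpm --addsign sensor-tool-1.2-1.x86_64.rpm
```

None of this needs extra infrastructure beyond the rpms themselves,
which is a big part of why rollback and verification are so cheap once
you've packaged.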
I'll throw out what is likely an unpopular opinion in this forum, which
is that package-building is not *always* a win. Russell and I both use
puppet to manage network security sensors which have some mildly unusual
properties:
1) The software being packaged is sometimes unusually complex, has a
poorly designed build process, and a small/closed user community. I
have one package that builds kernel modules and writes outside the
build directory during make, and that nobody outside the security
community uses. Every shop has a few packages like this that would
require a solid week of time to package properly, plus ongoing
maintenance to track upstream changes.
2) Core libraries (specifically libpcap) often have to be rebuilt.
3) The number of boxes under management can be very small. I monitor a
50k node network with 3 boxes.
4) The software builds on these boxes are extremely stable. Nobody is
flipping back and forth between versions on a regular basis.
So (1) and (2) make the cost of packaging higher for security teams
than for most shops, and (3) and (4) make the benefits lower. I have
been packaging my sensor software for about 18 months, and had high
hopes that it would eventually save me time, but it has always been
worse than a source deployment by a large margin. I accept the
overhead mostly because it improves my time-to-recover from a serious
problem, and there are other benefits. It's time invested in maturity,
though, not a time savings.
I use mock for my build process, and createrepo to manage the rpms. The
infrastructure isn't actually bad at all. The time goes into writing
and rewriting spec files. You can sometimes poach work from CentOS or
Fedora, but some sensor software has simply never been packaged and you
have to start from scratch.
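As a rough sketch of that mock/createrepo loop (the package name, mock
config, and repo path below are placeholders, not my actual setup):

```shell
# Build a source rpm from the spec file and tarball.
rpmbuild -bs ~/rpmbuild/SPECS/sensor-tool.spec

# Rebuild it in a clean chroot matching the target distro, so the
# build can't silently depend on stray stuff installed on the host.
mock -r epel-6-x86_64 ~/rpmbuild/SRPMS/sensor-tool-1.2-1.el6.src.rpm

# Publish the results and regenerate the yum repo metadata.
cp /var/lib/mock/epel-6-x86_64/result/*.rpm /srv/repo/el6/x86_64/
createrepo /srv/repo/el6/x86_64/
```

Once the repo metadata exists, the sensors just point a yum .repo file
at it and install/upgrade normally; the painful part, as I said, is the
spec file, not this plumbing.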
So Russell, the short of it is that Puppet doesn't provide much to help
you manage source-installed software. You can apply puppet's features
to other software-management tools to roll something yourself, you can
package the software, or you can just keep building from source. The
last option is likely to be the least time-consuming IMO. If you want
details on my packaging setup, feel free to reach out to me offline.
Cheers,
Mike Lococo
--
You received this message because you are subscribed to the Google Groups "Puppet Users" group.
To post to this group, send email to puppet-users@googlegroups.com.