I don't really understand where you're coming from; it's easy enough for the script to recompile its external dependencies on install. That way I don't really care which core version of Linux, Solaris, or BSD is installed on each platform, since we don't control them all.

Precisely _because_ they are not well managed, it's important for our environment to build itself cleanly. No, not the core OS, but netpbm, imagemagick, ripmime and other tools get rebuilt on a 'major' config change.
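For what it's worth, the rebuild step is nothing fancy. A minimal sketch of the idea (the tool list, source directory, and function name here are illustrative assumptions, not our actual install script):

```shell
#!/bin/sh
# Sketch: rebuild third-party tools from source on each host at install
# time, so no compiled binaries are ever copied between platforms.

SRCDIR=${SRCDIR:-/usr/local/src}   # assumed location of unpacked sources
TOOLS="netpbm imagemagick ripmime"

build_tool() {
    # Configure and compile on the target host itself, so the resulting
    # binary matches that host's OS, compiler, and libraries.
    tool=$1
    echo "building $tool from $SRCDIR/$tool"
    # Real build step would be something like:
    # (cd "$SRCDIR/$tool" && ./configure && make && make install)
}

for t in $TOOLS; do
    build_tool "$t"
done
```

The per-tool build commands vary, of course; the point is just that every host runs the same script against the same sources, and the binaries never leave the machine they were built on.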

I guess in your book we suck either way, eh? Life goes on :)

John

On Thu, 31 Oct 2002 13:02:20 -0800 (PST)
Ask Bjoern Hansen <[EMAIL PROTECTED]> wrote:
On Wed, 30 Oct 2002 [EMAIL PROTECTED] wrote:

I don't believe in transferring _any_ binaries around;
every binary recompiles on its new platform at install
time. All modules, apache, external software etc. This
eliminates those pesky little problems that pop up when
you start pushing binaries.
Uhmn, if your systems are well managed you don't get any of those
"pesky little problems". Do you recompile the base system on each
server too?

In my experience, as soon as you have more than a few handfuls
of servers, you get more trouble trying to coordinate recompiling
binaries than doing it once and distributing them (in tarballs,
rpms, or with your revisioning system).


- ask

--
ask bjoern hansen, http://www.askbjoernhansen.com/ !try; do();


