Hi Antonino,
on Friday, 2005-06-03 at 20:55:43, you wrote:
> So you're actually trying to reuse even the compilation work performed on
> the 'first' (let's call it 'master') machine and avoid compiling on all
> the others when you do an "emerge --update world" for instance?

That was my idea, or rather that's how I understood it to have been done
by someone whose name I forget. Makes sense IMHO.

> If there were such a script that could copy the binaries and the new
> files to all the other machines I would probably not trust it! :)

Why? The total size of the shell/Python/whatever scripts that a simple
"emerge foo" triggers is probably over a megabyte, and they usually run
just fine. Thinking about it, some simple parsing of emerge's output
should already do something useful:

emerge "$package" |
sed -n "/^>>> Merging $package/,/^ \* / {s/^[^ ]* //; p}" |
while read -r f; do scp "$f" "$somewhere"; done

I wouldn't mind adding another 500 bytes of Perl there :)
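For the record, here is that pipe fleshed out a little as a self-contained
sketch. The `extract_files` helper and the canned demo input are my own
invention, and I'm assuming that emerge of this vintage prints one
">>> /path" line per merged file -- check against your real output before
trusting it:

```shell
#!/bin/sh
# Sketch: pull the list of merged files out of emerge's output and copy
# them to another box. Assumption: emerge prints ">>> /path" for each
# installed file and "--- /path" for ones it skipped.

extract_files() {
    # keep only the ">>> /path" lines, stripping the ">>> " prefix
    sed -n 's|^>>> \(/.*\)|\1|p'
}

# intended usage (not run here):
#   emerge foo | tee /var/log/last-merge.log
#   extract_files < /var/log/last-merge.log |
#   while read -r f; do scp -p "$f" "otherhost:$f"; done

# demo on canned emerge-style output:
printf '%s\n' '>>> Merging foo-1.0 to /' '--- /usr/' '>>> /usr/bin/foo' |
extract_files
# prints /usr/bin/foo
```

Note that scp -p only keeps modes and timestamps; rsync -a would also
preserve ownership, which matters for system files.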

> I'd try to automate as much as possible the update process, possibly
> by keeping synchronized the configuration files of all the machines (but
> this is to be done on a per-file basis!!) and/or triggering an "emerge foo"
> on the other machines as soon as you do an "emerge foo" on the master.
> I must admit that I see this process difficult to "understand" and to
> debug in case of errors or misbehaviours....

Yup. It's unlikely that anything will fail as long as all machines keep
an identical configuration, but glitches can still happen. So I'd have
to look through all the compilation logs... hm :-S We'll see.

cheers!
        Matthias

-- 
I prefer encrypted and signed messages.       KeyID: 90CF8389
Fingerprint: 8E 1F 10 81 A4 66 29 46  B9 8A B9 E2 09 9F 3B 91
