[...]
> > but as a first step we could
> > provide script which would download and verify all necessary sources.
>
> That (or a 'make populate' target) is probably needed anyway for someone
> who might want to set up their workspace when they have a fast network
> connection but build later when disconnected - wouldn't
> want it to fail then. And for critical build milestones the option
> to be completely disconnected from the network and have the build
> succeed is needed.
If we standardize how products download their sources, it should be
doable.
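
To give an idea of what I mean by standardizing, something like the
sketch below could sit behind a 'make populate' target: a per-product
manifest of URLs and checksums, fetched into a local cache and
verified, so a later build can run completely disconnected. The
manifest contents, URLs, checksums and paths here are just made up for
illustration:

#!/usr/bin/env python
# Hypothetical 'populate' step: fetch each product's source tarball
# into a local cache and verify its checksum, so a later build can run
# fully disconnected.  Manifest contents and paths are invented.
import hashlib
import os
import urllib.request

# component -> (download URL, expected SHA-256); placeholder values
MANIFEST = {
    "gtar": ("http://example.org/dist/tar-1.20.tar.gz", "<sha256 here>"),
}

CACHE = os.path.expanduser("~/sfw-sources")

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def populate(components):
    os.makedirs(CACHE, exist_ok=True)
    for name in components:
        url, want = MANIFEST[name]
        dest = os.path.join(CACHE, os.path.basename(url))
        if not os.path.exists(dest):
            print("fetching", url)
            urllib.request.urlretrieve(url, dest)
        got = sha256_of(dest)
        if got != want:
            raise RuntimeError("%s: checksum mismatch (%s)" % (name, got))
        print(name, "ok")

if __name__ == "__main__":
    populate(MANIFEST.keys())
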
> > In the future it would allow us to simply build only relevant
> > products (for building gtar you would not have to download and
> > compile gtk libraries for example).
>
> well true, with teamware we do have that via partial bringovers. We
> don't have that with hg but I wasn't yet worried about that (and it's
> only the disk space, you don't have to build the gtk libraries _now_
> if you just want to build gtar. Now if you're talking about being
> able to run nightly and only have it build gtar, well that's not
> what I'm thinking of since nightly's audits will be kinda pointless
> without a fully built workspace).
I was not clear there. My intention was to build only gtar (plus the
libraries it needs, if any) and to download only the gtar (plus
library) sources. On the other hand, you are supposed to run a full
nightly, so at the end of the day you have to download all the sources
anyway.
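
To make the "only gtar plus its libraries" part concrete, the selection
could be driven by per-component dependency metadata, roughly like the
sketch below; the DEPENDS table and component names are purely
illustrative, the real data would have to come from the gate's
Makefiles:

# Hypothetical sketch: compute the closure of components needed to
# build one product (e.g. gtar), so only those sources have to be
# downloaded and built.  The dependency table is invented.
DEPENDS = {
    "gtar": ["libiconv", "gettext"],
    "gettext": ["libiconv"],
    "libiconv": [],
}

def closure(component, seen=None):
    """Return the set of components needed to build 'component'."""
    if seen is None:
        seen = set()
    if component not in seen:
        seen.add(component)
        for dep in DEPENDS.get(component, []):
            closure(dep, seen)
    return seen

print(sorted(closure("gtar")))   # ['gettext', 'gtar', 'libiconv']
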
> > a) new repository will not be related to old one, so if someone has
> > some local changes to his clone, hg would not be able to merge his
> > changes automatically.
>
> It doesn't seem to me that there's a lot of intersection between
> people changing the same files in sfw compared to say, ON, except
> in a very few places like {pkgdefs,lib,cmd}/Makefile, and Targetdirs.
> Well unless I'm around but I could stop changing things :)
Hey, that's not exactly what I proposed :)
> Which aren't that hard to merge, and we could if necessary change it so
> you are less likely to have to touch those files.
>
> So I wasn't really worried about that though perhaps it's a larger
> issue than I thought.
Merging by hand would not be rocket science, just another little
obstacle. Not a showstopper for sure.
> > b) we would either lose history during the transition, or we would have
> > to use the 'convert' extension and handpick every 'old' source. (granted
> > it could be scripted to some degree)
>
> but the history would already be in the old repository. It seems just
> like having to look through the various release gates to me, so again
> perhaps I'm too close and this is a larger issue than I believe.
You would have to have login access to the machine with all the
repositories, or you would have to clone them locally to do that. That
might be a lot of work just to find out who disabled that configure
option, when, and why. Just another small obstacle.
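
For the record, once the old repositories are cloned locally, digging
that out is at least scriptable, along these lines; the repository
paths and the file name below are made up, the hg invocation itself is
standard Mercurial:

# Hypothetical sketch: walk the history of one file across locally
# cloned "old" repositories to find who changed a configure option,
# when, and why.  Paths below are invented.
import subprocess

OLD_REPOS = ["/scratch/sfw-old-2007", "/scratch/sfw-old-2008"]
FILE = "usr/src/cmd/gtar/Makefile"   # made-up path

for repo in OLD_REPOS:
    out = subprocess.check_output(
        ["hg", "log", "--template",
         "{rev} {user} {date|isodate}: {desc|firstline}\n",
         FILE],
        cwd=repo)
    print(repo)
    print(out.decode())
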
> >> If there are issues doing it this way then I am quite open to not doing
> >> this transition now and allowing others to do it, so feel free :) I'm
> >> just feeling pressure on this so thought I'd give it a shot, but I'm
> >> not in a hurry there are bugs to fix as well.
> >
> > Where is the pressure coming from ?
>
> every few days I get 'Mike, when are you transitioning sfw to hg? What
> build can I put down?' From upper management (internally of course).
That's good, that means we have their support for getting resources,
doesn't it? :) Just out of curiosity, any idea why they want to do that?
> > If it's because we want to give the
> > community the possibility to contribute, we should create proper
> > tools for it first. I can hardly imagine 1G of sources just to bump up
> > gtar revision ...
>
> If that's what you're envisioning you may be arguing for something
> else,
Yes, I have some other improvements in mind as well, but this thread
was mainly about splitting the sources from the gate, which IMO would
have benefits in the future.
> like breaking up the workspace into smaller chunks. Which perhaps
> might be good, or perhaps not (I tend to like being able to build it
> all and run the audits but if the tools were changed that could be
> done with a meta-nightly too).
That also crossed my mind, but I know too little about the gate, so I
might easily be pipe-dreaming.
> And I understand the build time and space concerns, but please
> remember I'm old and come from the days when gatekeepers were forever
> having to yell at people for breaking the full build because they
> didn't think they had to do one :)
That's the experience I don't have, and that's why I feel so optimistic
:)
> Even for things like gtar, where you might not think it's worth doing
> even one full build, I'd say that yeah it might not catch anything you
> wouldn't catch building just gtar by hand (or it may actually find
> some things you forgot to package) but if nothing else it will run
> gtar a lot and help your testing :)
That makes sense to me.
> > I would be happy to help. But first it would be good IMO to define what
> > is our goal.
>
> My goal is, as usual, to have people stop yelling at me. But I never
> seem to succeed :)
Haha :) At the end of the day it's just about having a roof over your
head and something to eat, isn't it? :)
--
Vlad