On 10/09/2012 15:20, Magnus Ihse Bursie wrote:
:

In build-infra, there is currently a "somewhat partial build" feature that is implemented like this:

1) You check out a "master forest", containing all repos. You only need to do this checkout once, and you are not required to pull/update it (unless you want to keep up with changes -- like on a flag day).

2) For subsequent work, you can clone just a single repo (let's call it "my_hotspot") and do some configure magic (currently a bit complicated, but that can be rewritten to be simpler) which points to the "master forest" but replaces the hotspot repo in the master forest with "my_hotspot" (sketched below, after step 3).

3) The first time you build "my_hotspot", it will rebuild the whole forest (but with hotspot replaced with my_hotspot). However, if you never pull/update the master forest, this will be a one-time event. Subsequent builds will see that nothing has changed in the master forest, and only changes in my_hotspot will be recompiled.
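To make this concrete, the workflow would look roughly like the following. The URLs are just examples, and the configure option name is only an illustration of the "configure magic" in step 2 -- the actual invocation is currently a bit more complicated than this:

  # One-time setup: clone and populate the master forest.
  hg clone http://hg.openjdk.java.net/build-infra/jdk8 master-forest
  (cd master-forest && sh get_source.sh)

  # For each piece of work: clone just the repo you care about...
  hg clone http://hg.openjdk.java.net/build-infra/jdk8/hotspot my_hotspot

  # ...and configure so the build uses the master forest for everything,
  # except that hotspot is taken from my_hotspot instead.
  cd master-forest
  sh configure --with-override-hotspot=../my_hotspot
  make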

The pros of this solution are:
1) It helps with flag days -- just pull/update the master forest and rebuild.
2) You can always build the complete product.
3) You save some disk space by only cloning the complete forest once.

However:
1) You will still need disk space for build output for the complete forest, for every single repo.
2) You will still need to recompile the whole forest for each new single repo you clone.

Do you think these drawbacks make the solution not good enough?

I think it is possible and not too hard to change this solution slightly, so that you would only need to compile the master forest once, and new single-repo clones could then reuse that build output as well as the source code from the master forest. Proper dependency checking might be harder, though, and there will be semantic issues about the meaning of things like "make clean" that can be surprising. Is this something you think is worth pursuing?
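Just to illustrate that variant -- nothing like this exists yet, and the extra option and invocation below are made up purely to show the idea:

  # Hypothetical: a fresh configuration for my_hotspot that imports the
  # master forest's existing build output instead of recompiling it.
  mkdir my_hotspot-build && cd my_hotspot-build
  sh ../master-forest/configure \
      --with-override-hotspot=../my_hotspot \
      --with-import-build=../master-forest/build/linux-x86_64-normal-server-release
  make
  # Only my_hotspot is compiled; everything else is taken from the imported
  # output. This is where "make clean" gets surprising: should it remove the
  # shared, imported output, or only what was built from my_hotspot?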

Completely eliminating the need to compile the whole product at least once seems much harder. I can't really see a good way of using "import JDKs" the way they have been used in the old build system. One possibility I can imagine is to prepare a tar ball containing a complete cloned forest (for a certain point in time) and a build directory with build artifacts corresponding to that source code and your platform(s) of choice, and have people download that as a starting point. It seems like a lot of work, though. Do you think that is needed to get the new build system accepted?
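Preparing such a starting-point bundle would be roughly this (the URL, paths and file names are just examples):

  # On a reference machine: build the forest once for the chosen platform...
  hg clone http://hg.openjdk.java.net/build-infra/jdk8 master-forest
  (cd master-forest && sh get_source.sh && sh configure && make images)

  # ...then package sources plus the matching build output for download.
  tar czf master-forest-linux-x86_64.tar.gz master-forest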

/Magnus
I've had to track down several issues over the years with partial builds that were caused by someone using the wrong import JDK, so in one sense I look forward to us having fewer issues. My personal view is that those that only build the jdk repository will need to get over this. If the initial build is fast and subsequent incremental builds are super fast then it might be okay. I'm much less sure about those that only build hotspot or only build langtools today; the benefits are less clear there. Also, for those building hotspot, it may be that they need to test with libjvm dropped into a different JDK image; I think that is what people typically do today.

-Alan.
