Cannot agree more.
For repos further down the food chain (such as deploy) this is even
worse.
Often we do not really need to build the jdk repo.
We do (quick) partial builds in Hudson to run tests continuously.
We also often need to build on "test" systems where the JDK build
environment is not set up.
Getting the whole JDK build environment onto them would slow things down.
Usually we only upload what is needed to build the specific deploy repo.
-igor
On 9/10/12 11:19 AM, Phil Race wrote:
A huge step backwards.
I don't want to have to clone or keep hotspot up to date
and I prefer using the 'RE' builds of hotspot to any I would create.
I have never found this fragile. It's worked well for over 10 years ...
-phil.
On 9/10/2012 8:36 AM, Jonathan Gibbons wrote:
Having to compile hotspot every time one creates a new repo seems
like a very significant step backwards. I can clone and build
langtools in 45 seconds.
$ time ( hg clone http://hg.openjdk.java.net/jdk8/tl/langtools ; cd langtools ; lt.ant build-all-classes )
destination directory: langtools
...
BUILD SUCCESSFUL
Total time: 23 seconds
real 0m45.313s
user 0m23.909s
sys 0m12.461s
-- Jon
On 09/10/2012 07:20 AM, Magnus Ihse Bursie wrote:
On 2012-09-10 14:13, Alan Bateman wrote:
I think this is a great topic to discuss.
At least within Oracle, I think the majority of people do partial
builds in their local environment. By "partial build" I mean that
they build a subset of the repositories, not all of them. So folks
working in the jdk repository tend to just build that repository,
with everything else coming from whatever import JDK they are using.
Partial builds are of course very fragile, and frequently people have
problems because they aren't using a matching import JDK. Also,
periodically we have flag days, say changes that impact langtools+jdk
or hotspot+jdk, and these usually cause a bit of disruption until the
changes get into a promoted build and into the import JDK that people
use. As to why people do partial builds, it's probably a mix of
issues: disk space, the time to do the first build, and of course
it's just the way we've always done it.
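For concreteness, an old-style partial build of just the jdk
repository goes roughly like this (a sketch only; the paths are made
up, and ALT_BOOTDIR / ALT_JDK_IMPORT_PATH are the old Makefile-based
build's variables as I remember them):
$ hg clone http://hg.openjdk.java.net/jdk8/tl/jdk
$ cd jdk/make
$ # ALT_BOOTDIR: boot JDK used to run the build tools
$ # ALT_JDK_IMPORT_PATH: promoted build supplying everything not built here
$ make ALT_BOOTDIR=/opt/java/jdk1.7.0 \
       ALT_JDK_IMPORT_PATH=/opt/java/jdk1.8.0-ea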
In build-infra, there is currently a "somewhat partial build"
feature that is implemented like this:
1) You check out a "master forest", containing all repos. You only
need to do this checkout once, and you are not required to
pull/update it (unless you want to keep up with changes -- like on a
flag day).
2) For subsequent work, you can clone just a single repo (let's call
it "my_hotspot") and do some configure magic (currently a bit
complicated, but that can be rewritten to be simpler) which points
to the "master forest" but replaces the hotspot repo in the master
forest with "my_hotspot" (see the sketch after this list).
3) The first time you build "my_hotspot", it will rebuild the whole
forest (but with hotspot replaced with my_hotspot). However, if you
never pull/update the master forest, this will be a one-time event.
Subsequent builds will see that nothing has changed in the master
forest, and only changes in my_hotspot will be recompiled.
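Roughly, the workflow would look like this (a sketch only; the repo
URLs, script locations and the override option name are approximate
-- this is exactly the part that is currently a bit complicated):
$ # one-time: clone and populate the master forest
$ hg clone http://hg.openjdk.java.net/build-infra/jdk8 master
$ cd master
$ sh ./get_source.sh
$ cd ..
$ # per-change: clone only the repo you are working on
$ hg clone http://hg.openjdk.java.net/build-infra/jdk8/hotspot my_hotspot
$ # configure against the master forest, overriding just hotspot
$ sh master/common/autoconf/configure --with-override-hotspot=my_hotspot
$ make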
The pros of this solution are:
1) help with flag days -- just pull/update the master forest and
rebuild.
2) you can always build the complete product.
3) you save some disk space by only cloning the complete forest once.
However:
1) you will still need disk space for build output for the complete
forest, for every single repo.
2) you will still need to recompile the whole forest for each new
single repo you clone.
Do you think these drawbacks make the solution not good enough?
I think it is possible, and not too hard, to change this solution
slightly so that you would only need to compile the master forest
once, and then new single repos could use that build output as well
as the source code from the master forest. Proper dependency checking
might be harder, though, and there would be semantic issues about the
meaning of things like "make clean" that can be surprising. Is this
something you think is worth pursuing?
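To make that concrete, configuring a new single repo might then look
something like this (entirely hypothetical; the
--with-master-build-output option does not exist today and is only
meant to illustrate the idea):
$ hg clone http://hg.openjdk.java.net/build-infra/jdk8/langtools my_langtools
$ # hypothetical: take sources *and* prebuilt output from the master
$ # forest, recompiling only what my_langtools changes
$ sh master/common/autoconf/configure \
      --with-override-langtools=my_langtools \
      --with-master-build-output=master/build/linux-x86_64-normal-server-release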
To completely eliminate compiling the whole product at least once
seems much harder. I can't really see a good way of using "import
JDKs" the way they were used in the old build system. One possibility
I can imagine is to prepare a tarball containing a complete cloned
forest (for a certain point in time) and a build directory with build
artifacts corresponding to that source code and your platform(s) of
choice, and have people download that as a starting point. It seems
like a lot of work, though. Do you think that is needed to get the
new build system accepted?
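For what it's worth, preparing such a bundle would be mechanically
simple (names and layout invented for illustration):
$ # on a build server, after a full clone and a successful build:
$ tar czf jdk8-forest-linux-x86_64.tar.gz master/
$ # a developer downloads and unpacks it instead of doing the first build:
$ tar xzf jdk8-forest-linux-x86_64.tar.gz
Keeping such bundles current for every platform is where the real
work would be.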
/Magnus