Hi Maczka,

Maczka Michal wrote:

> Why would you ever want to recompile Java sources when you have JARs?

I can think of a few reasons to do it:


a) You've made a (necessary) modification to the sources (security patch, bug fix, performance improvement, new feature, etc.). If you've got the sources, you might as well use them ;)

b) The JAR doesn't meet your needs, but can be built to meet them with some specific build properties, like enabling a feature when some other artifact is known to be present. Ant does this a lot, for example, AFAIK.

c) The 'default' JAR is too big, you just want the core functionality. Think of embedded systems, for example.

d) The toolchain used in the build has a bug that shows up in the generated artifact. This happens occasionally; I was recently bitten by the compiler used to create Eclipse's JARs generating bytecode that violates the Virtual Machine Specification, 2nd Edition.

e) You want to insert additional steps in the build process to verify the integrity of the result. External verifiers, static analysis tools, etc.

f) Licensing issues. The JAR may contain code that you don't want (or can't) distribute, but which can be stripped out. Think of (L)GPL code being avoided in JARs on Apache.org.

g) Adding debugging information. Obvious, I guess.

h) Targeting a different JDK version. Like building applets for use with JDK 1.1 runtimes with a 1.4.2 SDK.

i) Targeting a 64-bit or 32-bit system (javac in JDK 1.5 has switches for this, but I don't know for sure what they do).

And so on.
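As a side note on points (d) and (h): you can check which JDK a class was compiled for by reading the version fields in the class-file header. Here's a minimal sketch; the `ClassVersion` class and the sample header bytes are my own illustration, not from any particular tool.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

// Minimal sketch: parse the class-file header to find the target
// version. Handy for checking whether a rebuilt JAR really targets
// an old runtime, or for inspecting suspect bytecode.
public class ClassVersion {
    public static String versionOf(byte[] classFile) throws IOException {
        DataInputStream in =
            new DataInputStream(new ByteArrayInputStream(classFile));
        int magic = in.readInt();
        if (magic != 0xCAFEBABE) {
            throw new IOException("not a class file");
        }
        int minor = in.readUnsignedShort();
        int major = in.readUnsignedShort();
        // major 45 = JDK 1.1, 46 = 1.2, 47 = 1.3, 48 = 1.4
        return major + "." + minor;
    }

    public static void main(String[] args) throws IOException {
        // hypothetical header of a class compiled with -target 1.1
        byte[] header = {
            (byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE,
            0x00, 0x03,  // minor version 3
            0x00, 0x2D   // major version 45 -> JDK 1.1
        };
        System.out.println(versionOf(header)); // prints "45.3"
    }
}
```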

> I always thought that "the latest proven trend" is to distribute applications as byte code and to let the VM do the optimisation. At least this is what Microsoft is doing with .NET, and soon (in a few years) most applications for MS Windows will run on the CLR VM. IMHO this is the way to go, and what Gentoo Linux does is simply for people who like to have a hobby and have a lot of time for it.

I see it as a 'culture clash' between pragmatism and idealism.


The Windows culture does not see much of a point in having source code available, the value is in the binaries. If things don't work, you wait for the vendor to fix them, use something else, or cry. I'd assume that open source is seen as a cute way to collaborate to produce better binaries faster. Binaries work most of the time, so tinkering with source is a waste of time.

The free software culture does not see much point in having binaries, the value is in the sources. Given the free availability of tools to rebuild the artifacts and of the associated build scripts, it is often easy to do the build yourself, on systems like Gentoo or Fink that are deeply rooted in the 'source code is more useful than binaries' philosophy; your experience may vary on other systems. Binaries work most of the time, but that's just a nice byproduct of people spending time tinkering with the sources. You're encouraged to do the tinkering, to learn and to adapt the programs to your needs.

The pragmatic approach is great when you need something fast. The idealistic approach is great when you need something that can be customized according to your needs and can grow along with them. Both approaches have their value, despite the occasional zealous flame-war on newsgroups. ;)

On the other hand, Gentoo is not unique with the 'everything from source' concept. Other successful projects, like Fink (which brings a lot of value to the Unix part of MacOSX), do the same. Nor do those types of systems remain purely devoted to providing just the sources. Fink offers many pre-built binaries for easy download for those users that don't want to build the binaries themselves. I think Gentoo does that too, for the base packages, at least.

So systems like Debian, Gentoo, or even Fink don't assume that one approach will fit all, but provide a combination of the pragmatic and the idealistic approach: binaries for those that want them, sources for those that want them.

> Some people still believe that compilation to native code can increase
> the performance.
> This is simply not true.
>
> Some of the reasons are nicely explained here:
> http://www.idiom.com/~zilla/Computer/javaCbenchmark.html

Actually, it shows that compilation to native code *can* increase the performance. It *doesn't necessarily always have to*, but it can, depending on the code, compilers, etc. It shows that for some applications, Sun's JDK can run equivalent code faster than a native counterpart. For others, it still can't. Maybe it will one day, though, I don't know.


OTOH, some people still prefer jikes to javac because of speed. ;)

> If you want to "recompile" for another reason - to improve the start-up
> time of the JVM - please note that:
>
> a) JVM start-up time is getting better and better, and with a shared
> instance of the JVM it will improve further

Sure. Native compilers also keep getting better and better. We all win, one way or another. ;)


> b) more and more modern applications (e.g. Eclipse, IDEA, Maven) are
> distributed as small kernels which load plugins. And AFAIK compilation to native code can't help much in such cases.

Not being a gcj developer myself, I unfortunately don't know how well it copes with that sort of issue. You may want to ask on the gcj developer mailing lists.


cheers,
dalibor topic




