On 01/22/2014 11:37 PM, Robert Stupp wrote:
Is there any documentation available about which optimizations HotSpot can perform and what
garbage-collecting an object costs?
I know that these are two completely different areas ;)

I was investigating whether the following code
     for (Object o : someArrayList) { ... }
would be faster than
     for (int i = 0, l = someArrayList.size(); i < l; i++) { Object o = someArrayList.get(i); }
for List implementations that implement RandomAccess. The challenge here is not just to track the CPU
time spent creating & using the iterator vs. the size() & get() calls, but also to
track the GC effort (which is at least complicated, if not impossible, due to the variety of
GC configuration options).

For a long time, using a for loop with an index (if you are *sure* that it's an ArrayList) was faster.
With the latest JDK 8, that is not true anymore
(and most of the time the iterator object is not created anymore, at least with JDK 7+).
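
If you want to measure it yourself, a JMH micro-benchmark is the usual way; here is a rough
sketch (the class name and list size are just made up for illustration):

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.TimeUnit;

    import org.openjdk.jmh.annotations.*;

    // Hypothetical benchmark comparing the two loop shapes over an ArrayList.
    @BenchmarkMode(Mode.AverageTime)
    @OutputTimeUnit(TimeUnit.NANOSECONDS)
    @State(Scope.Thread)
    public class LoopShapeBenchmark {

        private List<Integer> someArrayList;

        @Setup
        public void setup() {
            someArrayList = new ArrayList<>();
            for (int i = 0; i < 10_000; i++) {
                someArrayList.add(i);
            }
        }

        @Benchmark
        public int forEach() {
            int sum = 0;
            for (Integer o : someArrayList) {       // iterator-based loop
                sum += o;
            }
            return sum;
        }

        @Benchmark
        public int indexed() {
            int sum = 0;
            for (int i = 0, l = someArrayList.size(); i < l; i++) {   // index-based loop
                sum += someArrayList.get(i);
            }
            return sum;
        }
    }

Running it with JMH's GC profiler (-prof gc) also reports the allocation rate of each variant,
which gives at least a rough handle on the GC side of the question.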


For example:
- Does it help HotSpot to declare parameters/variables as "final", or can
HotSpot identify that by itself?

No, it's an urban myth.
You can test it yourself: whether or not you declare a local variable final, javac produces the exact same bytecode. The final modifier on a local variable (or a parameter) is not stored in the bytecode.
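
A minimal sketch of how to check it (the class name is made up):

    // FinalLocalDemo.java -- compile with javac, then run `javap -c FinalLocalDemo`:
    // the bytecode of withFinal and withoutFinal is identical.
    public class FinalLocalDemo {

        static int withFinal(int x) {
            final int doubled = x * 2;   // 'final' on a local variable
            return doubled + 1;
        }

        static int withoutFinal(int x) {
            int doubled = x * 2;         // same code without 'final'
            return doubled + 1;
        }
    }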

BTW, final was introduced in 1.1 mostly to allow capturing the value of a local variable for use in an anonymous class; Java 8 doesn't require this kind of variable to be declared final anymore (it only has to be effectively final).
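
For example (a small illustration; before Java 8 this only compiles if the captured variable is declared final):

    import java.util.concurrent.Callable;

    public class CaptureDemo {

        static Callable<String> makeGreeter(String prefix) {
            // Before Java 8, javac requires 'message' to be declared final in order
            // to be captured by the anonymous class below; since Java 8 it only has
            // to be effectively final (assigned once, never modified).
            String message = prefix + "!";
            return new Callable<String>() {
                @Override
                public String call() {
                    return message;   // capture of the enclosing local variable
                }
            };
        }

        public static void main(String[] args) throws Exception {
            System.out.println(makeGreeter("hello").call());   // prints "hello!"
        }
    }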

- When does HotSpot inline method calls in general, and getters/setters
in particular?

In general, up to a depth of 10 by default, and 1 for a recursive method.
Roughly, a method call is not inlined if the call is virtual and can dispatch to too many implementations, or if the generated assembly code would be too big.
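
If you want to see what HotSpot actually decides, the diagnostic flags below print its inlining
decisions; here is a small sketch (the getter/setter class is made up):

    // Run with:
    //   java -XX:+UnlockDiagnosticVMOptions -XX:+PrintInlining InlineDemo
    // and look for lines like "InlineDemo$Point::getX (5 bytes)   inline (hot)".
    // The depth limits mentioned above correspond to the -XX:MaxInlineLevel and
    // -XX:MaxRecursiveInlineLevel flags.
    public class InlineDemo {

        static final class Point {
            private int x;
            int getX() { return x; }            // tiny getter: a prime inlining candidate
            void setX(int x) { this.x = x; }    // tiny setter
        }

        public static void main(String[] args) {
            Point p = new Point();
            long sum = 0;
            for (int i = 0; i < 1_000_000; i++) {   // hot loop so the JIT kicks in
                p.setX(i);
                sum += p.getX();
            }
            System.out.println(sum);
        }
    }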


I think such a piece of documentation (just what HotSpot can do in which release) would
be really helpful when someone tries to optimize code. What I want to say is: the question is
"Is something worth spending time on, or is this special situation already handled by
HotSpot?"

It's never worth it.
Choosing the right algorithms and shaping your data so it can be easily consumed by those algorithms is enough.


-Robert

Rémi
