Re: Proposal for improving performance of TreeMap and others

2008-01-07 Thread cowwoc

Something very weird is going on. I tried profiling a minimal test case and
there is a considerable amount of "missing time". I am using a dev build of
NetBeans 6.1 and it says:

MyComparator.compare(Object, Object) 19670ms
\-> MyComparator.compare(Integer, Integer) 10229ms
  \-> Self Time 3001ms
  \-> Integer.compareTo(Integer) 1575ms
\-> Self Time 3788ms

I spot at least three problems:

1) The individual item times do not add up to the total (but they do for
other stack traces).
2) Comparator.compare() self-time consumes more CPU than Integer.compareTo(),
even though the former only invokes a method while the latter does the actual
computation.
3) Why is extra time consumed moving from MyComparator.compare(Object,
Object) to compare(Integer, Integer)? It looks like generics are doing
something at runtime which consumes a large amount of CPU.
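
For what it's worth, the extra compare(Object, Object) frame may simply be the compiler-generated bridge method rather than generics doing work at runtime. A minimal sketch (the class name mirrors the one profiled; the body is assumed):

```java
import java.lang.reflect.Method;
import java.util.Comparator;

// Hypothetical reconstruction of the comparator being profiled. Compiling
// a Comparator<Integer> makes javac emit a synthetic bridge method,
// roughly:
//   public int compare(Object a, Object b) {
//       return compare((Integer) a, (Integer) b);
//   }
// which is why compare(Object, Object) and compare(Integer, Integer)
// appear as two separate frames in the profiler.
class MyComparator implements Comparator<Integer> {
    @Override
    public int compare(Integer a, Integer b) {
        return a.compareTo(b);
    }
}

public class BridgeDemo {
    public static void main(String[] args) {
        // The bridge method is visible via reflection:
        for (Method m : MyComparator.class.getDeclaredMethods()) {
            System.out.println(m + "  bridge=" + m.isBridge());
        }
    }
}
```

The bridge itself is a single extra delegating call; the profiler may be attributing instrumentation overhead to it.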

Gili


Clemens Eisserer wrote:
> 
> Hi cowwoc,
> 
>> I guess you're right. It is probably as likely that the JIT will optimize
>> away the null check as it is that it will optimize away the
>> NullPointerException check. One exception, though, is when production
>> systems run using -Xverify:none. In such a case, wouldn't my approach run
>> faster?
> I don't think it will optimize the null-check away, however it is so
> cheap that it most likely will not weigh at all compared to all the
> other operations happening there. It's maybe 5 instructions compared to
> thousands or even more.
> -Xverify:none only disables bytecode verification at class-loading
> time and has no influence (as far as I know) on the performance of the
> generated code.
> 
>> I still think that my proposed code is somehow more consistent/cleaner on
>> a
>> design-level but I guess that's just me :)
> I also like it more, it's cleaner in my opinion :)
> 
>> As an aside, are there standard benchmarks for testing the impact of this
>> change? I'd love to know whether it actually produces any performance
>> difference in practice.
> From my experience I would rather guess that you won't notice the
> change; the noise will be higher.
> 
> lg Clemens
> 
> 

-- 
View this message in context: 
http://www.nabble.com/Proposal-for-improving-performance-of-TreeMap-and-others-tp14673283p14679084.html
Sent from the OpenJDK Core Libraries mailing list archive at Nabble.com.



Re: Proposal for improving performance of TreeMap and others

2008-01-07 Thread Rémi Forax

Clemens Eisserer wrote:

> Hi cowwoc,
>
>> I guess you're right. It is probably as likely that the JIT will optimize
>> away the null check as it is that it will optimize away the
>> NullPointerException check. One exception, though, is when production
>> systems run using -Xverify:none. In such a case, wouldn't my approach run
>> faster?
>
> I don't think it will optimize the null-check away,

HotSpot has removed null checks and installed a signal handler instead
since v2 (around 2000/01, if my memory serves me well).

> however it is so
> cheap that it most likely will not weigh at all compared to all the
> other operations happening there. It's maybe 5 instructions compared to
> thousands or even more.
> -Xverify:none only disables bytecode verification at class-loading
> time and has no influence (as far as I know) on the performance of the
> generated code.

Yes, and there is an option to remove null checks that is only available
on the debug VM.

...

> lg Clemens

Rémi


Re: core classes still need to be declared final?

2008-01-07 Thread Martin Buchholz
Subclassability is a problem with "value-oriented" computing.

If security or extreme reliability is a concern, then
existing apis that took Integers or Strings as arguments would
have to make defensive copies on import or export, as they have
to do with arrays today.  Since existing classes depend on
the immutability of Integers and Strings, these must forever remain
non-subclassable, at least by untrusted application code.

Inheritance is the one cornerstone of object-oriented computing that
has disappointed us, now that we have gained experience with it,
since it seriously constrains the evolution of superclasses.
Prefer composition to inheritance.
Especially so with immutable "value" types.

For the particular case of range-restricted integers,
I have some sympathy.  It would be nice if the platform offered
such things.
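
A composition-based sketch of such a range-restricted integer (the bound 9999 is purely illustrative, since the original post's value did not survive the archive):

```java
// Composition instead of inheritance: an immutable wrapper that enforces
// its range in the constructor. The bound is an illustrative assumption.
public final class BoundedInt {
    private static final int MAX = 9999; // illustrative upper bound

    private final int value;

    public BoundedInt(int value) {
        if (value < 0 || value > MAX)
            throw new IllegalArgumentException(
                "value out of [0, " + MAX + "]: " + value);
        this.value = value;
    }

    public int intValue() { return value; }

    @Override public boolean equals(Object o) {
        return o instanceof BoundedInt && ((BoundedInt) o).value == value;
    }

    @Override public int hashCode() { return value; }

    @Override public String toString() { return Integer.toString(value); }
}
```

The syntax at use sites is indeed more awkward than a subclass would be, which is Nick's point; the trade-off is that Integer's immutability guarantees stay intact.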

Martin

Nick Radov wrote:
> 
> Is it still necessary for the core Java classes such as
> java.lang.Integer to be declared final? I understand that may have been
> necessary in the early days for performance reasons, but modern JVMs no
> longer provide much of a performance benefit for final classes. For
> certain applications it would really be helpful to be able to subclass
> some of those core classes.
> 
> For example, one application I'm working on deals with integer values
> that must be between 0 and  inclusive. I would like to be able to
> create a custom Integer subclass which enforces that limit in the
> constructor, but currently that isn't possible. While I could create a
> new class that acts as a wrapper around Integer, the syntax would be
> much more awkward and that would also make it much more difficult to
> interface with other third-party classes.
> 
> *Nick Radov | Research and Development Manager | Axolotl Corp*
> www.axolotl.com , d: 408.920.0800 x116, f:
> 408.920.0880
> 160 West Santa Clara St., Suite 1000, San Jose, CA, 95113


Re: Proposal for improving performance of TreeMap and others

2008-01-07 Thread Clemens Eisserer
Hi cowwoc,

> I guess you're right. It is probably as likely that the JIT will optimize
> away the null check as it is that it will optimize away the
> NullPointerException check. One exception, though, is when production
> systems run using -Xverify:none. In such a case, wouldn't my approach run
> faster?
I don't think it will optimize the null-check away, however it is so
cheap that it most likely will not weigh at all compared to all the
other operations happening there. It's maybe 5 instructions compared to
thousands or even more.
-Xverify:none only disables bytecode verification at class-loading
time and has no influence (as far as I know) on the performance of the
generated code.

> I still think that my proposed code is somehow more consistent/cleaner on a
> design-level but I guess that's just me :)
I also like it more, it's cleaner in my opinion :)

> As an aside, are there standard benchmarks for testing the impact of this
> change? I'd love to know whether it actually produces any performance
> difference in practice.
From my experience I would rather guess that you won't notice the
change; the noise will be higher.
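
For a rough sense of how noisy such a measurement is, a crude sketch (no proper benchmark harness; Comparator.naturalOrder() from modern JDKs stands in for an explicit natural-ordering comparator):

```java
import java.util.Comparator;
import java.util.TreeMap;

// Crude timing sketch (expect heavy noise; a real measurement would use
// a benchmark harness with proper warmup): TreeMap lookups with natural
// ordering versus an explicit comparator.
public class TreeMapBench {
    static long probe(TreeMap<Integer, Integer> map, int size) {
        long start = System.nanoTime();
        long sink = 0;
        for (int rep = 0; rep < 50; rep++)
            for (int i = 0; i < size; i++)
                sink += map.get(i);
        long elapsed = System.nanoTime() - start;
        if (sink == -1) System.out.println("never"); // keep sink live
        return elapsed;
    }

    public static void main(String[] args) {
        int size = 10_000;
        TreeMap<Integer, Integer> natural = new TreeMap<>();
        TreeMap<Integer, Integer> explicit =
            new TreeMap<>(Comparator.<Integer>naturalOrder());
        for (int i = 0; i < size; i++) {
            natural.put(i, i);
            explicit.put(i, i);
        }
        probe(natural, size);  // warmup
        probe(explicit, size); // warmup
        System.out.println("natural ordering:    " + probe(natural, size) + " ns");
        System.out.println("explicit comparator: " + probe(explicit, size) + " ns");
    }
}
```

On most runs the two numbers differ by less than the run-to-run variation, which is consistent with the "noise will be higher" prediction.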

lg Clemens


Re: Proposal for improving performance of TreeMap and others

2008-01-07 Thread cowwoc

I guess you're right. It is probably as likely that the JIT will optimize
away the null check as it is that it will optimize away the
NullPointerException check. One exception, though, is when production
systems run using -Xverify:none. In such a case, wouldn't my approach run
faster?

I still think that my proposed code is somehow more consistent/cleaner on a
design-level but I guess that's just me :)

As an aside, are there standard benchmarks for testing the impact of this
change? I'd love to know whether it actually produces any performance
difference in practice.

Gili



Martin Buchholz wrote:
> 
> The authors of TreeMap have thought about
> eliding comparator null checks:
> 
> 
> /**
>  * Version of getEntry using comparator. Split off from getEntry
>  * for performance. (This is not worth doing for most methods,
>  * that are less dependent on comparator performance, but is
>  * worthwhile here.)
>  */
> final Entry<K,V> getEntryUsingComparator(Object key) {
>     K k = (K) key;
>     Comparator<? super K> cpr = comparator;
>     if (cpr != null) {
>         Entry<K,V> p = root;
>         while (p != null) {
>             int cmp = cpr.compare(k, p.key);
>             if (cmp < 0)
>                 p = p.left;
>             else if (cmp > 0)
>                 p = p.right;
>             else
>                 return p;
>         }
>     }
>     return null;
> }
> 
> As to whether using an explicit Comparator for the "natural ordering"
> is a performance improvement, this depends very much on
> the implementation of the JIT and the degree of polymorphism of
> the call site, and on the prevalence of TreeMaps using "natural
> ordering".  At the very least, a null check is very cheap, so it is
> unlikely that the proposed change will be a significant performance
> improvement, while, on the other hand, there is a good chance that
> it will decrease performance for TreeMaps using "natural ordering".
> 
> Aside: It's probably a good idea for the comparator for
> "natural ordering" to be available via some static method.
> 
> Martin
> 
> 
> cowwoc wrote:
>> I noticed that TreeMap (and maybe other classes) require a user to either
>> pass in a Comparator or ensure that all keys must implement Comparable.
>> The
>> TreeMap code then uses a utility method whenever it needs to compare two
>> keys:
>> 
>> 
>> /**
>>  * Compares two keys using the correct comparison method for this
>> TreeMap.
>>  */
>> final int compare(Object k1, Object k2) {
>>     return comparator == null ? ((Comparable<? super K>) k1)
>>         .compareTo((K) k2) : comparator.compare((K) k1, (K) k2);
>> }
>> 
>> The problem with the above method is that it checks whether comparator is
>> null once per comparison instead of once when the TreeMap is constructed.
>> Instead I propose that this check only take place once in the
>> constructors
>> and the rest of the code assume that a comparator exists. If a comparator
>> is
>> not provided then you can simply define one as follows:
>> 
>> comparator = new Comparator<K>()
>> {
>>   @SuppressWarnings("unchecked")
>>   public int compare(K first, K second)
>>   {
>>     return ((Comparable<? super K>) first).compareTo(second);
>>   }
>> };
>> 
>> This solution should be backwards compatible while improving performance.
>> At
>> least, that's my guess. There is always the chance that the JIT is smart
>> enough to optimize away this comparison but I'd rather not rely on JIT
>> implementation details. I also believe the resulting code is more
>> readable.
>> 
>> What do you think?
> 
> 




Re: core classes still need to be declared final?

2008-01-07 Thread cowwoc


	My understanding is that this has nothing to do with performance.
Certain classes, such as String, are declared final for security reasons.

	In the case of Integer I would suggest using composition. It's not as
nice but it'll work.


Gili

Nick Radov wrote:


Is it still necessary for the core Java classes such as 
java.lang.Integer to be declared final? I understand that may have been 
necessary in the early days for performance reasons, but modern JVMs no 
longer provide much of a performance benefit for final classes. For 
certain applications it would really be helpful to be able to subclass 
some of those core classes.


For example, one application I'm working on deals with integer values 
that must be between 0 and  inclusive. I would like to be able to 
create a custom Integer subclass which enforces that limit in the 
constructor, but currently that isn't possible. While I could create a 
new class that acts as a wrapper around Integer, the syntax would be 
much more awkward and that would also make it much more difficult to 
interface with other third-party classes.


*Nick Radov | Research and Development Manager | Axolotl Corp*
www.axolotl.com , d: 408.920.0800 x116, f: 
408.920.0880

160 West Santa Clara St., Suite 1000, San Jose, CA, 95113
 
THE MARKET LEADER IN HEALTH INFORMATION EXCHANGE – PROVIDING PATIENT 
INFORMATION WHEN AND WHERE IT IS NEEDED.
 
/The information contained in this e-mail transmission may contain 
confidential information. It is intended for the use of the addressee. 
If you are not the intended recipient, any disclosure, copying, or 
distribution of this information is strictly prohibited. If you receive 
this message in error, please inform the sender immediately and remove 
any record of this message./


core classes still need to be declared final?

2008-01-07 Thread Nick Radov
Is it still necessary for the core Java classes such as java.lang.Integer 
to be declared final? I understand that may have been necessary in the 
early days for performance reasons, but modern JVMs no longer provide much 
of a performance benefit for final classes. For certain applications it 
would really be helpful to be able to subclass some of those core classes.

For example, one application I'm working on deals with integer values that 
must be between 0 and  inclusive. I would like to be able to create a 
custom Integer subclass which enforces that limit in the constructor, but 
currently that isn't possible. While I could create a new class that acts 
as a wrapper around Integer, the syntax would be much more awkward and 
that would also make it much more difficult to interface with other 
third-party classes.

Nick Radov | Research and Development Manager | Axolotl Corp
www.axolotl.com, d: 408.920.0800 x116, f: 408.920.0880
160 West Santa Clara St., Suite 1000, San Jose, CA, 95113
 

Re: Proposal for improving performance of TreeMap and others

2008-01-07 Thread Martin Buchholz
The authors of TreeMap have thought about
eliding comparator null checks:


/**
 * Version of getEntry using comparator. Split off from getEntry
 * for performance. (This is not worth doing for most methods,
 * that are less dependent on comparator performance, but is
 * worthwhile here.)
 */
final Entry<K,V> getEntryUsingComparator(Object key) {
    K k = (K) key;
    Comparator<? super K> cpr = comparator;
    if (cpr != null) {
        Entry<K,V> p = root;
        while (p != null) {
            int cmp = cpr.compare(k, p.key);
            if (cmp < 0)
                p = p.left;
            else if (cmp > 0)
                p = p.right;
            else
                return p;
        }
    }
    return null;
}

As to whether using an explicit Comparator for the "natural ordering"
is a performance improvement, this depends very much on
the implementation of the JIT and the degree of polymorphism of
the call site, and on the prevalence of TreeMaps using "natural
ordering".  At the very least, a null check is very cheap, so it is
unlikely that the proposed change will be a significant performance
improvement, while, on the other hand, there is a good chance that
it will decrease performance for TreeMaps using "natural ordering".

Aside: It's probably a good idea for the comparator for
"natural ordering" to be available via some static method.
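
Such a static accessor might be sketched as follows (names are illustrative; recent JDKs expose exactly this as Comparator.naturalOrder()):

```java
import java.util.Comparator;

// Sketch of a static accessor for the natural-ordering comparator.
// A singleton enum keeps it stateless and trivially serializable.
final class NaturalOrdering {
    @SuppressWarnings("unchecked")
    static <T extends Comparable<? super T>> Comparator<T> naturalOrder() {
        // Unchecked but safe: the singleton only ever calls compareTo.
        return (Comparator<T>) Impl.INSTANCE;
    }

    private enum Impl implements Comparator<Comparable<Object>> {
        INSTANCE;
        public int compare(Comparable<Object> a, Comparable<Object> b) {
            return a.compareTo(b);
        }
    }
}

public class NaturalOrderingDemo {
    public static void main(String[] args) {
        Comparator<Integer> c = NaturalOrdering.naturalOrder();
        System.out.println(c.compare(1, 2)); // negative: 1 sorts before 2
    }
}
```

Because it is a singleton, every TreeMap using natural ordering would share one monomorphic comparator instance, which is the friendliest case for the JIT concerns raised above.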

Martin


cowwoc wrote:
> I noticed that TreeMap (and maybe other classes) require a user to either
> pass in a Comparator or ensure that all keys must implement Comparable. The
> TreeMap code then uses a utility method whenever it needs to compare two
> keys:
> 
> 
> /**
>  * Compares two keys using the correct comparison method for this TreeMap.
>  */
> final int compare(Object k1, Object k2) {
>     return comparator == null ? ((Comparable<? super K>) k1)
>         .compareTo((K) k2) : comparator.compare((K) k1, (K) k2);
> }
> 
> The problem with the above method is that it checks whether comparator is
> null once per comparison instead of once when the TreeMap is constructed.
> Instead I propose that this check only take place once in the constructors
> and the rest of the code assume that a comparator exists. If a comparator is
> not provided then you can simply define one as follows:
> 
> comparator = new Comparator<K>()
> {
>   @SuppressWarnings("unchecked")
>   public int compare(K first, K second)
>   {
>     return ((Comparable<? super K>) first).compareTo(second);
>   }
> };
> 
> This solution should be backwards compatible while improving performance. At
> least, that's my guess. There is always the chance that the JIT is smart
> enough to optimize away this comparison but I'd rather not rely on JIT
> implementation details. I also believe the resulting code is more readable.
> 
> What do you think?


Re: Proposal for improving performance of TreeMap and others

2008-01-07 Thread Thomas Hawtin

cowwoc wrote:

> I noticed that TreeMap (and maybe other classes) require a user to either
> pass in a Comparator or ensure that all keys must implement Comparable. The
> TreeMap code then uses a utility method whenever it needs to compare two
> keys:


I'm not going to comment about performance, but there is a problem with 
serialisation.


TreeMap.comparator is final (and non-transient).

TreeMaps serialised with earlier versions will be deserialised with a null
comparator. So, comparator would either need to be made non-final or
sun.misc.Unsafe used.


For the serialisation case, it would be necessary to change writeObject 
to use putFields rather than defaultWriteObject (not very nice, but not 
half as nasty as I originally thought).
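
The putFields approach Tom mentions could be sketched like this (class and field names are illustrative, not the actual TreeMap patch):

```java
import java.io.*;
import java.util.Comparator;

// Sketch: writeObject uses putFields so that maps using natural ordering
// keep writing a null comparator, preserving the historical serialized
// form even though the in-memory field is now always non-null.
class TreeMapLike<K, V> implements Serializable {
    private static final long serialVersionUID = 1L;

    // Declare the serialized form explicitly so putFields can target it.
    private static final ObjectStreamField[] serialPersistentFields = {
        new ObjectStreamField("comparator", Comparator.class)
    };

    transient Comparator<? super K> comparator;
    transient boolean naturalOrdering;

    private void writeObject(ObjectOutputStream s) throws IOException {
        ObjectOutputStream.PutField fields = s.putFields();
        // Old readers expect null here when natural ordering is in use.
        fields.put("comparator", naturalOrdering ? null : comparator);
        s.writeFields();
    }

    private void readObject(ObjectInputStream s)
            throws IOException, ClassNotFoundException {
        ObjectInputStream.GetField fields = s.readFields();
        @SuppressWarnings("unchecked")
        Comparator<? super K> c =
            (Comparator<? super K>) fields.get("comparator", null);
        naturalOrdering = (c == null);
        // A real implementation would substitute its natural-ordering
        // comparator here instead of leaving the field null.
        comparator = c;
    }
}
```

This keeps streams produced by old and new versions mutually readable, which is the compatibility constraint Tom is pointing at.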


Tom Hawtin


Re: Proposal for improving performance of TreeMap and others

2008-01-07 Thread Clemens Eisserer
> This solution should be backwards compatible while improving performance. At
> least, that's my guess. There is always the chance that the JIT is smart
> enough to optimize away this comparison but I'd rather not rely on JIT
> implementation details. I also believe the resulting code is more readable.
>
> What do you think?

From a performance point of view there's no (real) difference I guess, but
I have to agree that the code is more readable and cleaner.
On the other hand it's one more class that has to be shipped and loaded
at startup.
I like your approach, I just don't know the real pros and cons...

lg Clemens


Proposal for improving performance of TreeMap and others

2008-01-07 Thread cowwoc

I noticed that TreeMap (and maybe other classes) require a user to either
pass in a Comparator or ensure that all keys must implement Comparable. The
TreeMap code then uses a utility method whenever it needs to compare two
keys:


/**
 * Compares two keys using the correct comparison method for this TreeMap.
 */
final int compare(Object k1, Object k2) {
    return comparator == null ? ((Comparable<? super K>) k1)
        .compareTo((K) k2) : comparator.compare((K) k1, (K) k2);
}

The problem with the above method is that it checks whether comparator is
null once per comparison instead of once when the TreeMap is constructed.
Instead I propose that this check only take place once in the constructors
and the rest of the code assume that a comparator exists. If a comparator is
not provided then you can simply define one as follows:

comparator = new Comparator<K>()
{
  @SuppressWarnings("unchecked")
  public int compare(K first, K second)
  {
    return ((Comparable<? super K>) first).compareTo(second);
  }
};

This solution should be backwards compatible while improving performance. At
least, that's my guess. There is always the chance that the JIT is smart
enough to optimize away this comparison but I'd rather not rely on JIT
implementation details. I also believe the resulting code is more readable.

What do you think?
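
The proposal, sketched as a compilable fragment (class and field names are illustrative, not TreeMap's real internals):

```java
import java.util.Comparator;

// Sketch of the proposal: choose the comparator once in the constructor,
// so compare() never needs a per-call null check.
class SortedThing<K> {
    private final Comparator<? super K> comparator;

    SortedThing(Comparator<? super K> c) {
        if (c != null) {
            this.comparator = c;
        } else {
            // Fall back to natural ordering via a one-time adapter.
            this.comparator = new Comparator<K>() {
                @SuppressWarnings("unchecked")
                public int compare(K first, K second) {
                    return ((Comparable<? super K>) first).compareTo(second);
                }
            };
        }
    }

    final int compare(K k1, K k2) {
        return comparator.compare(k1, k2); // no null check here
    }
}

public class ProposalSketch {
    public static void main(String[] args) {
        SortedThing<Integer> s = new SortedThing<>(null);
        System.out.println(s.compare(1, 2) < 0); // natural ordering works
    }
}
```

Note the trade-off discussed in the thread: natural-ordering lookups now go through an interface call that may be megamorphic at the call site, instead of a predictable null check.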



Re: RMI benchmark

2008-01-07 Thread Peter Jones
On Sat, Jan 05, 2008 at 08:01:35PM +0800, zhang Jackie wrote:
> Hi, everyone!
>   Recently, I want to do a performance comparison between RMI and my own
> version with some small changes. Can you give me some microbenchmarks and
> other suites used to estimate the performance of RMI? I googled the
> keyword "RMI benchmark", but couldn't find one.

You can find a microbenchmark suite for RMI, as well as object
serialization, in the "test" tree of the jdk7/jdk repository--
relative to the jdk7 forest, look here:

jdk/test/java/rmi/reliability/benchmark

There isn't much documentation there, but the script here:

jdk/test/java/rmi/reliability/scripts/create_benchmark_jars.ksh

shows how to create two JAR files, rmibench.jar and serialbench.jar
from the sources.  Running either JAR with "-h" prints a usage
message.  Each has a default config file that can be altered to
customize which of the microbenchmarks to execute, how many
repetitions of each to run, how much warmup to do, etc.  The RMI suite
can be run all in one VM or in two separate VMs, a "client" and a
"server", possibly on different hosts.

-- Peter


Re: Performance regression in java.util.zip.Deflater

2008-01-07 Thread Alan Bateman

Clemens Eisserer wrote:

> :
>
> PS: The striding+GetPrimitive... is even used by NIO for copying
> Java arrays into direct ByteBuffers:
>
> while (length > 0) {
>     size = (length > MBYTE ? MBYTE : length);
>     GETCRITICAL(bytes, env, dst);
>     memcpy(bytes + dstPos, (void *)srcAddr, size);
>     RELEASECRITICAL(bytes, env, dst, 0);
>     ...
> }
  
Yes, NIO uses JNI critical sections when copying to/from arrays but, as an
FYI, we hope to eliminate this native code soon. The replacement uses the
Unsafe interface to do the copying and will be much faster than the current
native implementation. To allow for safepoint polling (in the VM) it also
copies very large arrays/buffers in strides.
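
The striding idea can be sketched in plain Java (System.arraycopy stands in for the Unsafe-based copy Alan describes; the stride size is an assumption):

```java
import java.util.Arrays;
import java.util.Random;

// Hedged sketch of strided copying: copy a large buffer in bounded
// chunks so the copying thread regularly returns to code where the VM
// can reach a safepoint, instead of one long uninterruptible copy.
public class StridedCopy {
    static final int STRIDE = 1 << 20; // 1 MiB per chunk (illustrative)

    static void copy(byte[] src, byte[] dst) {
        int pos = 0;
        int remaining = src.length;
        while (remaining > 0) {
            int n = Math.min(remaining, STRIDE);
            System.arraycopy(src, pos, dst, pos, n);
            pos += n;
            remaining -= n;
        }
    }

    public static void main(String[] args) {
        byte[] src = new byte[3 * (1 << 20) + 17]; // not a stride multiple
        new Random(42).nextBytes(src);
        byte[] dst = new byte[src.length];
        copy(src, dst);
        System.out.println("equal: " + Arrays.equals(src, dst));
    }
}
```

The native version quoted above does the same thing with GetPrimitiveArrayCritical sections bounded to MBYTE-sized chunks, for the same safepoint reason.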


-Alan.