I guess you're right. The JIT is probably just as likely to optimize away the
explicit comparator null check as it is to optimize away the implicit null
check that would otherwise surface as a NullPointerException. One exception,
though, is when production systems run with -Xverify:none. In such a case,
wouldn't my approach run faster?
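
Just to make sure we're talking about the same thing, here is roughly the
contrast I have in mind (only a sketch to show where each null check lives,
not the actual TreeMap sources; the class and method names are made up for
illustration):

import java.util.Comparator;

class CompareShapes<K> {
    Comparator<? super K> comparator; // null when "natural ordering" is used

    // Current style: an explicit null check runs on every comparison.
    @SuppressWarnings("unchecked")
    int compareCurrent(Object k1, Object k2) {
        return comparator == null
            ? ((Comparable<? super K>) k1).compareTo((K) k2)
            : comparator.compare((K) k1, (K) k2);
    }

    // Proposed style: a default comparator is installed in the constructor,
    // so the only remaining check is the implicit null check the JVM
    // performs before dispatching the virtual call.
    @SuppressWarnings("unchecked")
    int compareProposed(Object k1, Object k2) {
        return comparator.compare((K) k1, (K) k2);
    }
}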

I still think that my proposed code is somehow more consistent/cleaner at the
design level, but I guess that's just me :)

As an aside, are there standard benchmarks for testing the impact of this
change? I'd love to know whether it actually produces any performance
difference in practice.
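
For what it's worth, below is the kind of crude comparison I had in mind (a
quick sketch with a plain timing loop, not a rigorous benchmark; it only
exercises the two code paths that already exist in the stock TreeMap, so
measuring the actual proposal would require a patched build, and the sizes
and iteration counts are arbitrary):

import java.util.Comparator;
import java.util.Random;
import java.util.TreeMap;

public class TreeMapCompareBench {
    private static final int SIZE = 200000;      // map size (arbitrary)
    private static final int LOOKUPS = 2000000;  // lookups per round (arbitrary)

    public static void main(String[] args) {
        Comparator<Integer> explicit = new Comparator<Integer>() {
            public int compare(Integer a, Integer b) {
                return a.compareTo(b);
            }
        };

        // A few rounds so the JIT gets a chance to warm up.
        for (int round = 0; round < 5; round++) {
            System.out.println("natural ordering:    "
                + run(new TreeMap<Integer, Integer>()) + " ns");
            System.out.println("explicit comparator: "
                + run(new TreeMap<Integer, Integer>(explicit)) + " ns");
        }
    }

    private static long run(TreeMap<Integer, Integer> map) {
        Random rnd = new Random(42);
        for (int i = 0; i < SIZE; i++)
            map.put(rnd.nextInt(), i);

        long start = System.nanoTime();
        int sink = 0;
        for (int i = 0; i < LOOKUPS; i++) {
            Integer v = map.get(rnd.nextInt());
            if (v != null)
                sink += v;
        }
        long elapsed = System.nanoTime() - start;
        if (sink == 42)
            System.out.print("");  // keep the lookup loop from being optimized away
        return elapsed;
    }
}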

Gili



Martin Buchholz wrote:
> 
> The authors of TreeMap have thought about
> eliding comparator null checks:
> 
> 
>     /**
>      * Version of getEntry using comparator. Split off from getEntry
>      * for performance. (This is not worth doing for most methods,
>      * that are less dependent on comparator performance, but is
>      * worthwhile here.)
>      */
>     final Entry<K,V> getEntryUsingComparator(Object key) {
>         K k = (K) key;
>         Comparator<? super K> cpr = comparator;
>         if (cpr != null) {
>             Entry<K,V> p = root;
>             while (p != null) {
>                 int cmp = cpr.compare(k, p.key);
>                 if (cmp < 0)
>                     p = p.left;
>                 else if (cmp > 0)
>                     p = p.right;
>                 else
>                     return p;
>             }
>         }
>         return null;
>     }
> 
> As to whether using an explicit Comparator for the "natural ordering"
> is a performance improvement, this depends very much on the
> implementation of the JIT, the degree of polymorphism of the call
> site, and the prevalence of TreeMaps using "natural ordering".  At
> the very least, a null check is very cheap, so the proposed change is
> unlikely to be a significant performance improvement, while, on the
> other hand, there is a good chance that it will decrease performance
> for TreeMaps using "natural ordering".
> 
> Aside: It's probably a good idea for the comparator for
> "natural ordering" to be available via some static method.
> 
> Martin
> 
> 
> cowwoc wrote:
>> I noticed that TreeMap (and maybe other classes) requires a user to either
>> pass in a Comparator or ensure that all keys implement Comparable. The
>> TreeMap code then uses a utility method whenever it needs to compare two
>> keys:
>> 
>> 
>> /**
>>  * Compares two keys using the correct comparison method for this TreeMap.
>>  */
>> final int compare(Object k1, Object k2) {
>>     return comparator == null
>>         ? ((Comparable<? super K>) k1).compareTo((K) k2)
>>         : comparator.compare((K) k1, (K) k2);
>> }
>> 
>> The problem with the above method is that it checks whether comparator is
>> null once per comparison instead of once when the TreeMap is constructed.
>> Instead, I propose that this check take place only once, in the
>> constructors, and that the rest of the code assume a comparator exists. If
>> a comparator is not provided, then you can simply define one as follows:
>> 
>> comparator = new Comparator<K>()
>>     {
>>       @SuppressWarnings("unchecked")
>>       public int compare(K first, K second)
>>       {
>>         return ((Comparable<K>) first).compareTo(second);
>>       }
>>     };
>> 
>> This solution should be backwards compatible while improving performance.
>> At
>> least, that's my guess. There is always the chance that the JIT is smart
>> enough to optimize away this comparison but I'd rather not rely on JIT
>> implementation details. I also believe the resulting code is more
>> readable.
>> 
>> What do you think?
> 
> 
