On 08/26/2013 10:10 PM, Paul Sandoz wrote:
On Aug 25, 2013, at 8:04 PM, Remi Forax <fo...@univ-mlv.fr> wrote:

On 08/21/2013 02:25 PM, Paul Sandoz wrote:
Hi,

Here are Doug's Linked/HashMap changes, discussed in a previous thread, as a 
webrev:

   
http://cr.openjdk.java.net/~psandoz/tl/JDK-8023463-Linked-HashMap-bin-and-tree/webrev/

I also added some tests related to characteristics associated with fixing 
another bug.

Looking at the diffs will be tricky given there are so many changes.

The code can be simplified a little bit by using the diamond syntax,
HashMap (lines: 919, 963, 1026, 1497, 1569) and
LinkedHashMap (lines: 255, 264, 270, 278)
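
For instance (an illustrative sketch, assuming the Node(hash, key, value, next)
constructor from the patch, not the exact webrev lines):

     // with explicit type arguments:
     Node<K,V> p = new Node<K,V>(hash, key, value, null);

     // with the diamond operator (type arguments inferred since Java 7):
     Node<K,V> p = new Node<>(hash, key, value, null);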

There are a bunch of @Override annotations that are missing, making the code
harder to read than it should be.

Yes, I suspect this is because it sticks to the 166 style. Easy to change.


HashMapSpliterator.est should be renamed to estimateSize.

All occurrences of "(n - 1) & hash" should be replaced by "hash & (n - 1)"
because it's easier to explain.

I am OK with how they are, and it's consistent with other code.

Ok, not a big deal.
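
(For reference, both spellings compute the same bucket index; since the table
length n is always a power of two in HashMap, the mask keeps the low bits of
the hash. A quick illustrative sketch:)

     int n = table.length;        // always a power of two in HashMap
     int index = hash & (n - 1);  // same as Math.floorMod(hash, n) when n is a power of two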



TreeNode.getTreeNode() should be a static method and take a Node<K,V> as first 
argument,
so most of the casts to TreeNode<K,V> that bother you will disappear :)
  static <K,V> TreeNode<K,V> getTreeNode(Node<K,V> node, int h, Object k) {
    TreeNode<K,V> first = (TreeNode<K,V>) node;
    return ((first.parent != null) ? first.root() : first).find(h, k, null);
  }

Yes, that will reduce the casts.
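
For example (a sketch of a typical call site, not necessarily the exact webrev
code), a lookup like:

     TreeNode<K,V> e = ((TreeNode<K,V>) first).getTreeNode(hash, key);

would become:

     TreeNode<K,V> e = TreeNode.getTreeNode(first, hash, key);

with the single cast done once inside the static method.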


In putVal, line 654, the null check (e != null) makes the method hard to read;
at the least, I think a comment saying that it means an existing node needs to
be altered is needed.

e.g.:

             if (e != null) { // existing mapping for key

Yes, perfect.



And I still don't like the way the method clone() is implemented (the older
implementation has the same issue): result should not be initialized to null,
and the catch clause should throw an InternalError or an AssertionError.

You mean like this:

     public Object clone() {
         HashMap<K,V> result = null;
         try {
             result = (HashMap<K,V>)super.clone();
         } catch (CloneNotSupportedException e) {
             // this shouldn't happen, since we are Cloneable
             throw new InternalError(e);
         }
         result.reinitialize();
         result.putMapEntries(this, false);
         return result;
     }

Yes, and in that case you don't need to initialize result to null (the first line), because by the time result is used (result.reinitialize()) it has already been assigned the return value of super.clone().
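
i.e. something along these lines (same body, just letting the assignment in the
try block initialize result; the unchecked cast still wants its usual
@SuppressWarnings):

     @SuppressWarnings("unchecked")
     public Object clone() {
         HashMap<K,V> result;
         try {
             result = (HashMap<K,V>) super.clone();
         } catch (CloneNotSupportedException e) {
             // this shouldn't happen, since we are Cloneable
             throw new InternalError(e);
         }
         result.reinitialize();
         result.putMapEntries(this, false);
         return result;
     }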

[... ]



I fixed unchecked warnings in LinkedHashMap, but have not done so for HashMap, 
where there are many more casts from Node to TreeNode. One way to solve that is 
with a static method:

     @SuppressWarnings("unchecked")
     static <K, V> TreeNode<K, V> asTreeNode(Node<K, V> n) {
         return (TreeNode<K, V>) n;
     }

but I don't know what the performance implications are. Perhaps it's best to simply 
stuff @SuppressWarnings on methods of TreeNode rather than trying to modify 
code just for the sake of reducing the scope.
You should not need the @SuppressWarnings here?

Doh, clearly I am not thinking straight here and got confused over unchecked!
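
(For the record, assuming TreeNode<K,V> extends Node<K,V> as in the patch, the
downcast is fully checked at runtime, so a helper along these lines compiles
without any unchecked warning:)

     static <K,V> TreeNode<K,V> asTreeNode(Node<K,V> n) {
         // TreeNode<K,V> is a subtype of Node<K,V> with identical type
         // arguments, so this is an ordinary checked downcast; no
         // unchecked warning, no @SuppressWarnings needed.
         return (TreeNode<K,V>) n;
     }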

Paul.

regards,
Rémi
