[ https://issues.apache.org/jira/browse/HADOOP-10143?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13878062#comment-13878062 ]

Hudson commented on HADOOP-10143:
---------------------------------

SUCCESS: Integrated in Hadoop-trunk-Commit #5031 (See [https://builds.apache.org/job/Hadoop-trunk-Commit/5031/])
Move HADOOP-10143 to 2.4 section of CHANGES.txt (wang: http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1560222)
* /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt


> replace WritableFactories's hashmap with ConcurrentHashMap
> ----------------------------------------------------------
>
>                 Key: HADOOP-10143
>                 URL: https://issues.apache.org/jira/browse/HADOOP-10143
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: io
>    Affects Versions: 2.0.0-alpha, 2.2.0
>            Reporter: Liang Xie
>            Assignee: Liang Xie
>             Fix For: 3.0.0, 2.4.0
>
>         Attachments: HADOOP-10143.txt
>
>
> We observed a lock contention hotspot on an HBase cluster:
> "IPC Reader 9 on port 12600" daemon prio=10 tid=0x00007f85b8aceed0 nid=0x4be8 
> waiting for monitor entry [0x00007f8501c57000]
>    java.lang.Thread.State: BLOCKED (on object monitor)
>         at 
> org.apache.hadoop.io.WritableFactories.getFactory(WritableFactories.java:44)
>         - locked <0x00000007fd1328a8> (a java.lang.Class for 
> org.apache.hadoop.io.WritableFactories)
>         at 
> org.apache.hadoop.io.WritableFactories.newInstance(WritableFactories.java:49)
>         at 
> org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:680)
>         at 
> org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:586)
>         at 
> org.apache.hadoop.hbase.client.MultiAction.readFields(MultiAction.java:116)
>         at 
> org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:682)
>         at 
> org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:126)
>         at 
> org.apache.hadoop.hbase.ipc.SecureServer$SecureConnection.processData(SecureServer.java:618)
>         at 
> org.apache.hadoop.hbase.ipc.SecureServer$SecureConnection.processOneRpc(SecureServer.java:596)
>         at 
> org.apache.hadoop.hbase.ipc.SecureServer$SecureConnection.saslReadAndProcess(SecureServer.java:362)
>         at 
> org.apache.hadoop.hbase.ipc.SecureServer$SecureConnection.readAndProcess(SecureServer.java:492)
>         at 
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:770)
>         at 
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:561)
>         - locked <0x000000043da3fea0> (a 
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>         at 
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:536)
>         at 
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:662)
> "IPC Reader 7 on port 12600" daemon prio=10 tid=0x00007f85b8a99df0 nid=0x4be6 
> waiting for monitor entry [0x00007f8501cd9000]
>    java.lang.Thread.State: BLOCKED (on object monitor)
>         at 
> org.apache.hadoop.io.WritableFactories.getFactory(WritableFactories.java:44)
>         - locked <0x00000007fd1328a8> (a java.lang.Class for 
> org.apache.hadoop.io.WritableFactories)
>         at 
> org.apache.hadoop.io.WritableFactories.newInstance(WritableFactories.java:49)
>         at 
> org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:680)
>         at 
> org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:586)
>         at 
> org.apache.hadoop.hbase.client.MultiAction.readFields(MultiAction.java:116)
>         at 
> org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:682)
>         at 
> org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:126)
>         at 
> org.apache.hadoop.hbase.ipc.SecureServer$SecureConnection.processData(SecureServer.java:618)
>         at 
> org.apache.hadoop.hbase.ipc.SecureServer$SecureConnection.processOneRpc(SecureServer.java:596)
>         at 
> org.apache.hadoop.hbase.ipc.SecureServer$SecureConnection.saslReadAndProcess(SecureServer.java:362)
>         at 
> org.apache.hadoop.hbase.ipc.SecureServer$SecureConnection.readAndProcess(SecureServer.java:492)
>         at 
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:770)
>         at 
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:561)
>         - locked <0x000000043da232e8> (a 
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>         at 
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:536)
>         at 
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:662)
> "IPC Reader 5 on port 12600" daemon prio=10 tid=0x00007f85b8a64d40 nid=0x4be2 
> runnable [0x00007f8501d5b000]
>    java.lang.Thread.State: RUNNABLE
>         at 
> org.apache.hadoop.io.WritableFactories.getFactory(WritableFactories.java:44)
>         - locked <0x00000007fd1328a8> (a java.lang.Class for 
> org.apache.hadoop.io.WritableFactories)
>         at 
> org.apache.hadoop.io.WritableFactories.newInstance(WritableFactories.java:49)
>         at 
> org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:680)
>         at 
> org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:586)
>         at org.apache.hadoop.hbase.client.Action.readFields(Action.java:103)
>         at 
> org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:682)
>         at 
> org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:586)
>         at 
> org.apache.hadoop.hbase.client.MultiAction.readFields(MultiAction.java:116)
>         at 
> org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:682)
>         at 
> org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:126)
>         at 
> org.apache.hadoop.hbase.ipc.SecureServer$SecureConnection.processData(SecureServer.java:618)
>         at 
> org.apache.hadoop.hbase.ipc.SecureServer$SecureConnection.processOneRpc(SecureServer.java:596)
>         at 
> org.apache.hadoop.hbase.ipc.SecureServer$SecureConnection.saslReadAndProcess(SecureServer.java:362)
>         at 
> org.apache.hadoop.hbase.ipc.SecureServer$SecureConnection.readAndProcess(SecureServer.java:492)
>         at 
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:770)
>         at 
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:561)
>         - locked <0x000000043da27300> (a 
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader)
>         at 
> org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:536)
>         at 
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:662)
> All three of those threads just wanted to get/read the factory, so to me it looks like a perfect use case for a ConcurrentHashMap here.
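> The change can be a drop-in swap of the synchronized HashMap lookup for a ConcurrentHashMap. A minimal sketch of what I mean is below; the attached HADOOP-10143.txt is the real patch, and the field/method details here are assumed from the current code, so they may differ slightly:
>
> package org.apache.hadoop.io;
>
> import java.util.Map;
> import java.util.concurrent.ConcurrentHashMap;
>
> public class WritableFactories {
>
>   // Sketch only: ConcurrentHashMap.get() does not take a lock, so concurrent
>   // getFactory() callers no longer queue up on the WritableFactories class monitor.
>   private static final Map<Class, WritableFactory> CLASS_TO_FACTORY =
>       new ConcurrentHashMap<Class, WritableFactory>();
>
>   private WritableFactories() {}   // utility class, not instantiated
>
>   // Note the methods are no longer declared synchronized.
>   public static void setFactory(Class c, WritableFactory factory) {
>     CLASS_TO_FACTORY.put(c, factory);
>   }
>
>   public static WritableFactory getFactory(Class c) {
>     return CLASS_TO_FACTORY.get(c);
>   }
>
>   // newInstance(...) omitted in this sketch; it just calls getFactory() and is unchanged.
> }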



--
This message was sent by Atlassian JIRA
(v6.1.5#6160)
