➜  util git:(trunk) ✗ jstack 75623
2014-04-06 20:42:34
Full thread dump Java HotSpot(TM) 64-Bit Server VM (23.21-b01 mixed mode):

"Attach Listener" daemon prio=10 tid=0x00007f1760001000 nid=0x135cd
waiting on condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"DestroyJavaVM" prio=10 tid=0x00007f2cb8009800 nid=0x12768 waiting on
condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"pool-1-thread-1" prio=10 tid=0x00007f2cb81c1800 nid=0x12788 waiting
on condition [0x00007f175f6cc000]
   java.lang.Thread.State: WAITING (parking)
at sun.misc.Unsafe.park(Native Method)
- parking to wait for  <0x00007f18ee497038> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2043)
at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1079)
at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:807)
at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1068)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:722)

"Service Thread" daemon prio=10 tid=0x00007f2cb8119000 nid=0x12786
runnable [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread1" daemon prio=10 tid=0x00007f2cb8117000 nid=0x12785
waiting on condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread0" daemon prio=10 tid=0x00007f2cb8114000 nid=0x12784
waiting on condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Signal Dispatcher" daemon prio=10 tid=0x00007f2cb8112000 nid=0x12783
runnable [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Finalizer" daemon prio=10 tid=0x00007f2cb80c4800 nid=0x12782 in
Object.wait() [0x00007f17746a0000]
   java.lang.Thread.State: WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
- waiting on <0x00007f18b4bd0b50> (a java.lang.ref.ReferenceQueue$Lock)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:135)
- locked <0x00007f18b4bd0b50> (a java.lang.ref.ReferenceQueue$Lock)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:151)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:189)

"Reference Handler" daemon prio=10 tid=0x00007f2cb80c2000 nid=0x12781
in Object.wait() [0x00007f17747a1000]
   java.lang.Thread.State: WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
- waiting on <0x00007f18b4bd7738> (a java.lang.ref.Reference$Lock)
at java.lang.Object.wait(Object.java:503)
at java.lang.ref.Reference$ReferenceHandler.run(Reference.java:133)
- locked <0x00007f18b4bd7738> (a java.lang.ref.Reference$Lock)

"VM Thread" prio=10 tid=0x00007f2cb80ba800 nid=0x12780 runnable

"GC task thread#0 (ParallelGC)" prio=10 tid=0x00007f2cb8017800
nid=0x12769 runnable

"GC task thread#1 (ParallelGC)" prio=10 tid=0x00007f2cb8019000
nid=0x1276a runnable

"GC task thread#2 (ParallelGC)" prio=10 tid=0x00007f2cb801b000
nid=0x1276b runnable

"GC task thread#3 (ParallelGC)" prio=10 tid=0x00007f2cb801d000
nid=0x1276c runnable

"GC task thread#4 (ParallelGC)" prio=10 tid=0x00007f2cb801e800
nid=0x1276d runnable

"GC task thread#5 (ParallelGC)" prio=10 tid=0x00007f2cb8020800
nid=0x1276e runnable

"GC task thread#6 (ParallelGC)" prio=10 tid=0x00007f2cb8022800
nid=0x1276f runnable

"GC task thread#7 (ParallelGC)" prio=10 tid=0x00007f2cb8024000
nid=0x12770 runnable

"GC task thread#8 (ParallelGC)" prio=10 tid=0x00007f2cb8026000
nid=0x12771 runnable

"GC task thread#9 (ParallelGC)" prio=10 tid=0x00007f2cb8028000
nid=0x12772 runnable

"GC task thread#10 (ParallelGC)" prio=10 tid=0x00007f2cb8029800
nid=0x12773 runnable

"GC task thread#11 (ParallelGC)" prio=10 tid=0x00007f2cb802b800
nid=0x12774 runnable

"GC task thread#12 (ParallelGC)" prio=10 tid=0x00007f2cb802d800
nid=0x12775 runnable

"GC task thread#13 (ParallelGC)" prio=10 tid=0x00007f2cb802f000
nid=0x12776 runnable

"GC task thread#14 (ParallelGC)" prio=10 tid=0x00007f2cb8031000
nid=0x12777 runnable

"GC task thread#15 (ParallelGC)" prio=10 tid=0x00007f2cb8033000
nid=0x12778 runnable

"GC task thread#16 (ParallelGC)" prio=10 tid=0x00007f2cb8035000
nid=0x12779 runnable

"GC task thread#17 (ParallelGC)" prio=10 tid=0x00007f2cb8036800
nid=0x1277a runnable

"GC task thread#18 (ParallelGC)" prio=10 tid=0x00007f2cb8038800
nid=0x1277b runnable

"GC task thread#19 (ParallelGC)" prio=10 tid=0x00007f2cb803a800
nid=0x1277c runnable

"GC task thread#20 (ParallelGC)" prio=10 tid=0x00007f2cb803c000
nid=0x1277d runnable

"GC task thread#21 (ParallelGC)" prio=10 tid=0x00007f2cb803e000
nid=0x1277e runnable

"GC task thread#22 (ParallelGC)" prio=10 tid=0x00007f2cb8040000
nid=0x1277f runnable

"VM Periodic Task Thread" prio=10 tid=0x00007f2cb8124000 nid=0x12787
waiting on condition

JNI global references: 130
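
For what it's worth, the parked "pool-1-thread-1" above is just what an idle ScheduledThreadPoolExecutor worker looks like: with nothing scheduled, the worker blocks in DelayedWorkQueue.take() on that condition object. A minimal stand-alone sketch (not luceneutil code; the class name is made up) that reproduces the same WAITING (parking) trace:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;

// Hypothetical demo class, not part of luceneutil: an idle
// ScheduledThreadPoolExecutor worker blocks in DelayedWorkQueue.take(),
// which jstack reports as WAITING (parking), the same state as
// "pool-1-thread-1" above.
public class IdleSchedulerDemo {
    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService pool = Executors.newScheduledThreadPool(1);
        // Run one no-op task so the single worker thread gets created,
        // then leave the pool idle.
        pool.submit(new Runnable() {
            public void run() { }
        });
        // Keep the JVM alive; `jstack <pid>` now shows pool-1-thread-1
        // parked in ScheduledThreadPoolExecutor$DelayedWorkQueue.take().
        Thread.sleep(Long.MAX_VALUE);
    }
}

So this thread itself is harmless; what matters is the rest of the dump above.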

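Since the full picture of all threads is what matters here (see Uwe's reply quoted below), it can also be captured from inside the JVM via the standard Thread.getAllStackTraces() API when attaching jstack is inconvenient; a rough sketch, assuming nothing beyond the JDK:

import java.util.Map;

// Rough helper using only standard JDK APIs: prints the stack of every
// live thread from inside the JVM, roughly what `jstack <pid>` shows.
public class ThreadDumper {
    public static void dumpAllThreads() {
        for (Map.Entry<Thread, StackTraceElement[]> entry
                : Thread.getAllStackTraces().entrySet()) {
            Thread t = entry.getKey();
            System.out.println("\"" + t.getName() + "\""
                    + (t.isDaemon() ? " daemon" : "")
                    + " state=" + t.getState());
            for (StackTraceElement frame : entry.getValue()) {
                System.out.println("\tat " + frame);
            }
            System.out.println();
        }
    }

    public static void main(String[] args) {
        dumpAllThreads();
    }
}

From outside the JVM, `jstack <pid>` (as above) or sending SIGQUIT (kill -3 <pid>) produces the same information.
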
On Mon, Apr 7, 2014 at 2:50 AM, Uwe Schindler <[email protected]> wrote:
> Hi Benson,
>
> There must be another thread that holds this lock:
> - parking to wait for  <0x00007f18ee497038>
> But the stack trace you have shown has nothing to do with Lucene! It looks
> like one of the normal threads that just waits for some external trigger
> (such threads are used by the garbage collector, for example). Could it be
> this one: https://issues.apache.org/jira/browse/LUCENE-5573
>
> So it would be better to get the *full* stack trace of all threads.
>
> Uwe
>
> -----
> Uwe Schindler
> H.-H.-Meier-Allee 63, D-28213 Bremen
> http://www.thetaphi.de
> eMail: [email protected]
>
>
>> -----Original Message-----
>> From: Benson Margulies [mailto:[email protected]]
>> Sent: Monday, April 07, 2014 2:41 AM
>> To: [email protected]
>> Subject: I may have run into something interesting with luceneutil
>>
>> Or I may not.
>>
>> https://code.google.com/a/apache-extras.org/p/luceneutil/wiki/AddToBuildTree?ts=1396830970&updated=AddToBuildTree
>>
>> I'm trying to learn something about direct posting format using luceneutil.
>>
>> The above-linked page is what I'm trying to follow, on a 160G multicore machine.
>>
>> Using trunk, the SearchPerfTest process seems to be stuck.
>>
>> top shows a memory size of 60g -- not even the full 80 I gave it.
>>
>> No CPU is being consumed.
>>
>> No significant I/O from iostat.
>>
>> strace shows no activity.
>>
>> jstack is completely boring except for the one thread shown below.
>>
>> Anyone got any ideas?
>>
>>
>> "pool-1-thread-1" prio=10 tid=0x00007f2cb81c1800 nid=0x12788 waiting on condition [0x00007f175f6cc000]
>>    java.lang.Thread.State: WAITING (parking)
>> at sun.misc.Unsafe.park(Native Method)
>> - parking to wait for  <0x00007f18ee497038> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>> at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
>> at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2043)
>> at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1079)
>> at java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:807)
>> at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1068)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:722)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
