On 14/10/12 00:17, Ben Mabey wrote:
I switched from pmap to
(r/fold n (r/monoid into vector) conj coll)
and the same thing happened again!
after approximately 50 minutes CPU utilisation dropped from 4/4 to
1/4... I don't understand!
Jim
Are you holding on to the head of the collection? That is often the
source of memory leaks.
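For reference, head retention is easy to reproduce; a minimal sketch (the function names are mine, not from this thread):

```clojure
;; `sum-bad` keeps the local `xs` bound while reduce realizes the whole
;; sequence, so none of the realized elements can be garbage-collected.
(defn sum-bad [n]
  (let [xs (range n)]
    ;; `first` forces the head; the local `xs` then pins every
    ;; realized element in memory until the vector is built.
    [(first xs) (reduce + xs)]))

;; Same work without retaining the head: each element becomes garbage
;; as soon as reduce has consumed it.
(defn sum-ok [n]
  (reduce + (range n)))
```

With a large enough `n`, the first version exhausts the heap while the second runs in constant space.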
Jim foo.bar jimpil1...@gmail.com writes:
So you can see, this is perfect for pmap, and indeed it seems to be
doing extremely well, but only for the first 240 papers roughly! All
the CPUs are working hard, but after approximately 30-40 min CPU
utilisation and overall performance seem to degrade
On 13/10/12 12:50, Tassilo Horn wrote:
pmap might not be as perfect for your use-case as you think. Because it
is lazy and chunked, it won't keep your cores busy. Some time ago, there
was a detailed thread about this on this list.
But it does keep my cores fully busy (99-100%) for the first 40 minutes!
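pmap's semi-laziness is easy to see at the REPL; a sketch (`slow-inc` is a made-up stand-in for the real per-paper work):

```clojure
;; pmap is only semi-lazy: it stays roughly (+ 2 processors) elements
;; ahead of the consumer, so overall throughput is still paced by
;; whoever consumes the result seq (and by upstream chunking).
(defn slow-inc [x]
  (Thread/sleep 50)
  (inc x))

(comment
  ;; sequential: roughly (* 10 50) ms
  (time (doall (map slow-inc (range 10))))
  ;; parallel: noticeably faster on a multicore box, but not a full
  ;; core-count speedup if the consumer lags behind
  (time (doall (pmap slow-inc (range 10)))))
```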
This feels like a memory leak problem. Try adding
-XX:+HeapDumpOnOutOfMemoryError. You might try lowering the max heap to
force an OOM earlier. Heap dumps can be analyzed using a variety
of tools. My favorite is eclipse.org/mat
On Saturday, October 13, 2012 8:16:54 AM UTC-7, Jim
If it is a memory/GC problem, the easiest way to spot it is to write a GC
log.
Add the following parameters to your JVM parameters:
-Xloggc:D:/log/myGClog.log
-XX:+PrintGCDetails
Then, use a GC log viewer to keep an eye on the logs (refreshing or
tailing them as the run progresses).
Hi all,
I finally found an ideal use-case for pmap; however, something very
strange seems to be happening after roughly 30 minutes of execution!
Ok, so here is the scenario:
I've got 383 raw scientific papers (.txt) in a directory that I'm grouping
using 'file-seq', and so I want to pmap a fn on
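A hypothetical reconstruction of the setup described in this message (the directory path and the `annotate` fn are placeholders, not the original code):

```clojure
(require '[clojure.java.io :as io])

;; Gather the .txt papers from a directory via file-seq...
(defn txt-files [dir]
  (->> (file-seq (io/file dir))
       (filter #(.isFile ^java.io.File %))
       (filter #(.endsWith (.getName ^java.io.File %) ".txt"))))

;; ...and pmap an annotation fn over their contents.
(defn annotate-all
  "Apply `annotate` to the slurped contents of every .txt file in `dir`."
  [dir annotate]
  (doall (pmap (comp annotate slurp) (txt-files dir))))
```

The `doall` matters: without it, the lazy `pmap` result realizes at the consumer's pace rather than eagerly.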
Have you tried running jconsole to monitor the memory usage? It sounds
like maybe you're running out of heap space and you're mainly seeing the
garbage collector doing its thing vs your actual program.
~Adam~
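As a rough in-process alternative to jconsole, one can poll heap usage from the REPL while the job runs; a sketch using only `java.lang.Runtime` (the fn name is mine):

```clojure
;; Report current heap usage from inside the process (values in MB).
(defn heap-mb []
  (let [rt (Runtime/getRuntime)
        mb (* 1024 1024)]
    {:used  (quot (- (.totalMemory rt) (.freeMemory rt)) mb)
     :total (quot (.totalMemory rt) mb)
     :max   (quot (.maxMemory rt) mb)}))

;; Call (heap-mb) every few minutes during the pmap run; a :used that
;; climbs steadily and never drops after GC pauses suggests a leak.
```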
On Fri, Oct 12, 2012 at 9:23 AM, Jim foo.bar jimpil1...@gmail.com wrote:
Hi all,
No, I haven't profiled memory, only CPU, but what you're saying makes
perfect sense. In every single iteration (each file) I'm 'slurp'-ing the
2 files (the dictionary and the file to annotate) provided. Would you
suggest a different GC if that is the case, or simply stop slurping?
Jim
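A sketch of the "stop slurping" option (fn and argument names are mine, not from the thread): read the shared dictionary once, outside the per-file loop, instead of re-reading it for every paper:

```clojure
;; Slurp the dictionary once; only the per-paper file is read inside pmap.
(defn annotate-corpus [dict-file files annotate]
  (let [dict (slurp dict-file)]      ; one read instead of one per file
    (doall (pmap #(annotate dict (slurp %)) files))))
```

This cuts both I/O and the garbage produced per iteration, since the dictionary string is allocated once and shared across all workers.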
If you have the memory you could just increase the heap size to a higher
value (like -Xmx2048m or something). But even if you do that I would still
run jconsole to see what's happening.
~Adam~
On Fri, Oct 12, 2012 at 9:41 AM, Jim foo.bar jimpil1...@gmail.com wrote:
No i haven't profiled