Hi everyone,

I was running a job with the out-of-core option and Giraph got stuck.

The machine has 96 GB of RAM and 16 physical cores, and the dataset is 20 GB. I
was running PageRank with the following command:

/usr/local/hadoop/bin/hadoop jar \
  /usr/local/giraph/giraph-examples/target/giraph-examples-1.1.0-SNAPSHOT-for-hadoop-1.2.1-jar-with-dependencies.jar \
  org.apache.giraph.GiraphRunner \
  -D giraph.useOutOfCoreGraph=true \
  -D giraph.useOutOfCoreMessages=true \
  -D giraph.maxPartitionsInMemory=50 \
  -D giraph.maxMessagesInMemory=500000000 \
  -D giraph.zkMinSessionTimeout=9000000 \
  -D giraph.zkMaxSessionTimeout=9000000 \
  -D giraph.zkSessionMsecTimeout=9000000 \
  org.apache.examples.PageRank \
  -eif org.apache.giraph.io.formats.IntNullTextEdgeInputFormat \
  -eip /user/hduser/xx.txt \
  -w 14
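In case it matters, I was also planning to retry with much lower in-memory limits, so that out-of-core spilling kicks in well before the heap fills up. The property names are the same as above; the specific values below are just a guess on my part, not something I have verified:

```shell
# Same run, but keeping fewer partitions and messages in memory so
# more data spills to disk. The values 10 and 50000000 are guesses,
# not tested settings.
/usr/local/hadoop/bin/hadoop jar \
  /usr/local/giraph/giraph-examples/target/giraph-examples-1.1.0-SNAPSHOT-for-hadoop-1.2.1-jar-with-dependencies.jar \
  org.apache.giraph.GiraphRunner \
  -D giraph.useOutOfCoreGraph=true \
  -D giraph.useOutOfCoreMessages=true \
  -D giraph.maxPartitionsInMemory=10 \
  -D giraph.maxMessagesInMemory=50000000 \
  org.apache.examples.PageRank \
  -eif org.apache.giraph.io.formats.IntNullTextEdgeInputFormat \
  -eip /user/hduser/xx.txt \
  -w 14
```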

It got stuck with:

14/07/28 22:05:58 INFO job.JobProgressTracker: Data from 14 workers -
Compute superstep 1: 38731088 out of 41652230 vertices computed; 182 out of
196 partitions computed; min free memory on worker 4 - 768.51MB, average
1296.96MB
and then the job was killed.

I checked the memory usage, and the job was using nearly 100% of the RAM.

Does anyone know what might be causing this?

Thanks!
