Hi,

I am testing our application (a computation similar to "personalised PageRank",
implemented with Pregel; note that each vertex property needs noticeably more
space to store after every new iteration). It works correctly on a small graph.
(We have a single machine: 8 cores, 16 GB memory.)
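To make the memory growth concrete, here is a minimal, hypothetical sketch of the per-vertex state in a job like this (plain Java, not our actual GraphX code; the names `vertexProgram`, `DAMPING`, and the map layout are illustrative assumptions). Each vertex keeps a map from source vertex to rank mass; after every superstep it has heard from sources further away, so the map, and hence the vertex property, keeps growing:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical, simplified model of one vertex's state in a
// personalised-PageRank-style Pregel job (not the real application code).
public class GrowingState {
    static final double DAMPING = 0.85; // illustrative damping factor

    // Merge incoming messages into a vertex's map. Keys accumulate across
    // supersteps, so the vertex property grows each iteration.
    static Map<Integer, Double> vertexProgram(Map<Integer, Double> state,
                                              List<Map<Integer, Double>> msgs) {
        Map<Integer, Double> acc = new HashMap<>(state);
        for (Map<Integer, Double> m : msgs) {
            for (Map.Entry<Integer, Double> e : m.entrySet()) {
                // Add this source's damped contribution to the running total.
                acc.merge(e.getKey(), DAMPING * e.getValue(), Double::sum);
            }
        }
        return acc;
    }

    public static void main(String[] args) {
        // Vertex starts knowing only about source 1...
        Map<Integer, Double> state = new HashMap<>(Map.of(1, 1.0));
        // ...then one superstep delivers messages from sources 2 and 3.
        state = vertexProgram(state, List.of(Map.of(2, 1.0), Map.of(3, 1.0)));
        System.out.println(state.size()); // map now has entries for 3 sources
    }
}
```

With many source vertices, this per-vertex map can end up holding an entry for nearly every source, which is consistent with the "GC overhead limit exceeded" behaviour on a 16 GB machine.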

But when we run it on a larger graph (e.g. LiveJournal), it always fails with
the error "GC overhead limit exceeded", even after increasing the number of
partitions from 8 to 48.

The existing memory settings in spark-env.sh:
SPARK_DAEMON_JAVA_OPTS='-Xms1g -Xmx10g -XX:MaxPermSize=1g'
SPARK_MEM=12g
export SPARK_DAEMON_JAVA_OPTS
export SPARK_MEM


Are there any improvements I could make?


Thanks in advance!


Best,
Yifan
