memory problem

2012-09-13 Thread Franco Maria Nardini
Hi all, I am running the PageRank code on a single-machine Hadoop installation. In particular, I am running the code with four workers on a graph of 100,000 nodes. I am getting this exception: 2012-09-13 22:54:38,379 INFO org.apache.hadoop.mapred.Task: Communication exception: java.lang.OutOfMemory
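A common first step for an OutOfMemory error in a Hadoop 1.x map task is to raise the per-task child JVM heap. This is only a hedged sketch: `mapred.child.java.opts` is the standard Hadoop 1.x property, but the jar and class names below are placeholders, not the poster's actual job.

```shell
# Sketch: pass a larger heap to each task JVM (Hadoop 1.x property).
# Replace the jar/class/arguments with your actual PageRank job.
hadoop jar your-giraph-job.jar your.pagerank.MainClass \
    -Dmapred.child.java.opts=-Xmx2g \
    <your job arguments>
```

Alternatively, the same property can be set once in mapred-site.xml so it applies to all jobs on the single-machine installation.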

Re: Can Giraph handle graphs with very large number of edges per vertex?

2012-09-13 Thread Maja Kabiljo
Hi Jeyendran, As Paolo mentioned, there were two patches to deal with out-of-core processing: GIRAPH-249 for the out-of-core graph and GIRAPH-45 for out-of-core messages. For the graph part, the current assumption is that you have enough memory to keep at least one whole partition at a time. Options you need to set
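For readers looking for the concrete switches behind GIRAPH-249 and GIRAPH-45, the fragment below is a hedged sketch of the out-of-core options as they existed around that era of Giraph; the exact names and defaults may differ in your version, so check the configuration of the release you are running.

```shell
# Sketch of Giraph out-of-core settings from that era (verify against your release):
#   out-of-core graph (GIRAPH-249): spill partitions to disk, keeping a few in memory
#   out-of-core messages (GIRAPH-45): spill incoming messages to disk past a threshold
-Dgiraph.useOutOfCoreGraph=true \
-Dgiraph.maxPartitionsInMemory=8 \
-Dgiraph.useOutOfCoreMessages=true \
-Dgiraph.maxMessagesInMemory=1000000
```

Note the constraint from the message above: even with the out-of-core graph enabled, each worker still needs enough memory to hold at least one whole partition, so very dense partitions may require repartitioning the input as well.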

Re: Can Giraph handle graphs with very large number of edges per vertex?

2012-09-13 Thread Paolo Castagna
Hi Jeyendran, interesting questions; IMHO it is not always easy to work out how many Giraph workers are needed to process a specific (large) graph. A few more comments inline, but I am interested in the answers to your questions as well. On 13 September 2012 07:03, Jeyendran Balak