Hi Arun,

I would suggest you check two things:

1. Does your code run successfully for some small data?
2. Have you modified hadoop-env.sh and mapred-site.xml on all 6 machines in
the cluster?

And it is showing "Caused by: java.util.concurrent.ExecutionException:
java.lang.OutOfMemoryError: Java heap space"

This means one of the worker JVMs ran out of heap space while processing
your graph; it is a memory problem, not a concurrency bug. (Modifying a
collection while iterating it, e.g. adding to or removing from a HashSet
inside a loop over it, would throw a ConcurrentModificationException
instead, which is not what your trace shows.)

For a triangle count the messages exchanged per superstep can grow much
faster than the input, so a 1 GB JSON data set can easily exceed a 3 GB
heap per worker. You can try giving each mapper a larger -Xmx, running
fewer map slots per machine so each one gets more heap, or enabling
Giraph's out-of-core options so data spills to disk instead of filling
the heap.
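A sketch of the heap change, assuming Hadoop 1.x property names (worth
double-checking against your version):

```xml
<!-- mapred-site.xml: give each map task a larger heap.
     With at most 3 map slots per 16 GB machine, 4 GB each still
     leaves room for the OS, DataNode and TaskTracker daemons. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx4000m -XX:+UseConcMarkSweepGC</value>
</property>
```

If that is still not enough, Giraph's out-of-core switches (flag names
assumed from the Giraph 1.0-era options, so verify them for your build)
can be passed on the command line, e.g.
`-ca giraph.useOutOfCoreGraph=true -ca giraph.useOutOfCoreMessages=true`,
so that partitions and message stores that do not fit in memory are
spilled to local disk instead of failing with an OutOfMemoryError.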


Regards,

Agrta Rawat


On Mon, Apr 14, 2014 at 7:02 PM, Arun Kumar <toga...@gmail.com> wrote:

> Hello,
>
> I am trying to run a triangle count program in Giraph on a Hadoop cluster
> of 6 machines, each having 16 GB of RAM. My data set is 1 GB of JSON.
>
> The issue is that I get an OutOfMemoryError while trying to run this. For
> input of up to 25 thousand lines it runs fine. I have updated
>
> hadoop-env.sh with export HADOOP_HEAPSIZE=2000 and mapred-site.xml with
> <name>mapred.child.java.opts</name>
>    <value>-Xmx3000m -XX:+UseConcMarkSweepGC</value>
> and the number of map instances is at most 3 per machine, so each machine
> will occupy 13 GB
>
> But even after doing all these things I am getting the exception below:
>
> Error from attempt_201404140400_0001_m_000011_0:
> java.lang.IllegalStateException: run: Caught an unrecoverable exception
> waitFor: ExecutionException occurred while waiting for
> org.apache.giraph.utils.ProgressableUtils$FutureWaitable@3e3a2536
>         at org.apache.giraph.graph.GraphMapper.run(GraphMapper.java:102)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.lang.IllegalStateException: waitFor: ExecutionException
> occurred while waiting for
> org.apache.giraph.utils.ProgressableUtils$FutureWaitable@3e3a2536
>         at
> org.apache.giraph.utils.ProgressableUtils.waitFor(ProgressableUtils.java:151)
>         at
> org.apache.giraph.utils.ProgressableUtils.waitForever(ProgressableUtils.java:111)
>         at
> org.apache.giraph.utils.ProgressableUtils.getFutureResult(ProgressableUtils.java:73)
>         at
> org.apache.giraph.utils.ProgressableUtils.getResultsWithNCallables(ProgressableUtils.java:192)
>         at
> org.apache.giraph.graph.GraphTaskManager.processGraphPartitions(GraphTaskManager.java:753)
>         at
> org.apache.giraph.graph.GraphTaskManager.execute(GraphTaskManager.java:273)
>         at org.apache.giraph.graph.GraphMapper.run(GraphMapper.java:92)
>         ... 7 more
> Caused by: java.util.concurrent.ExecutionException:
> java.lang.OutOfMemoryError: Java heap space
>         at
> java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:232)
>         at java.util.concurrent.FutureTask.get(FutureTask.java:91)
>         at
> org.apache.giraph.utils.ProgressableUtils$FutureWaitable.waitFor(ProgressableUtils.java:271)
>         at
> org.apache.giraph.utils.ProgressableUtils.waitFor(ProgressableUtils.java:143)
>         ... 13 more
> Caused by: java.lang.OutOfMemoryError: Java heap space
>
> Thanks,
>
> Arun
>
