by following the detailed steps given in [1] and
updating my memory configurations.
[1]http://hortonworks.com/blog/how-to-plan-and-configure-yarn-in-hdp-2-0/
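For reference, the tuning described in [1] mainly comes down to sizing YARN containers and mapper JVM heaps. A minimal sketch of the kind of settings involved (the property names are standard Hadoop 2 settings; the values are illustrative assumptions only and must be derived from your nodes' RAM as the article describes):

```xml
<!-- yarn-site.xml: per-node YARN resources (illustrative values) -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>40960</value>
</property>
<property>
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>2048</value>
</property>

<!-- mapred-site.xml: container size and JVM heap for each mapper (Giraph worker) -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>4096</value>
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx3276m</value>
</property>
```

The heap (`-Xmx`) is kept below the container size so the JVM's overhead does not push the container past its limit.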
Thanks,
Charith
On Tue, Dec 9, 2014 at 10:14 PM, Charith Wickramarachchi
charith.dhanus...@gmail.com wrote:
Hi,
I'm facing an OOM exception when running a Giraph job. The worker logs give
the following exception:
2014-12-09 21:57:43,451 INFO [communication thread]
org.apache.hadoop.mapred.Task: Communication exception:
java.lang.OutOfMemoryError: Java heap space
at
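The usual first response to this error is to give each mapper (Giraph worker) a larger heap, either cluster-wide or per job. A sketch of the per-job form, using standard Hadoop 2 MapReduce properties (the jar name and the ~4 GB figures are arbitrary examples, not recommendations):

```shell
# Illustrative only: raise the container size and JVM heap for this job.
hadoop jar giraph-job.jar org.apache.giraph.GiraphRunner \
  -Dmapreduce.map.memory.mb=4608 \
  -Dmapreduce.map.java.opts=-Xmx4096m \
  ...
```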
Hi,
Is https://issues.apache.org/jira/browse/GIRAPH-52 fixed in this version?
Or is there a workaround?
I think it is a blocker.
Thanks,
Charith
On Thu, Nov 13, 2014 at 1:45 PM, Maja Kabiljo majakabi...@fb.com wrote:
+1, thanks Roman!
From: Claudio Martella claudio.marte...@gmail.com
Hi,
I am running Apache Giraph 1.1.0 on Hadoop 2.2.0 as a
MapReduce application, but I could not find the Giraph logs.
It would be great if someone could tell me how to enable Apache Giraph
logging.
Also, I see that Giraph collects very detailed runtime statistics; how can I
collect those
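Two pieces usually cover this, sketched below. Giraph's log verbosity is controlled by the `giraph.logLevel` option, and since each worker runs inside a mapper, its output lands in the task logs, which YARN can aggregate after the job finishes. Treat the exact invocations as a sketch for your setup:

```shell
# Run with more verbose Giraph logging (illustrative job invocation).
hadoop jar giraph-job.jar org.apache.giraph.GiraphRunner \
  -Dgiraph.logLevel=debug ...

# Worker logs live in the mapper task logs; fetch them afterwards with
# YARN log aggregation (must be enabled in yarn-site.xml):
yarn logs -applicationId <application_id>
```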
Hi,
I am having trouble running Giraph trunk on Hadoop 2.2.0.
The job terminated with the following exception:
ERROR yarn.GiraphYarnClient: Giraph:
org.apache.giraph.examples.SimpleShortestPathsComputation
reports FAILED state, diagnostics show: Application
application_1415138324209_0002 failed 2
)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 13 more
It would be great if someone could assist me with this matter.
Thanks,
Charith
On Tue, Nov 4, 2014 at 2:19 PM, Charith Wickramarachchi
charith.dhanus...@gmail.com wrote:
Hi Folks,
I'm wondering what the resource allocation model for Apache Giraph is.
As I understand it, each worker is mapped one-to-one to a mapper, and a worker
can process multiple partitions with a user-defined number of threads.
Is it possible to make sure that one worker only processes a single
your Hadoop configuration to
control the maximum number of workers assigned to one machine (optimally
one with multiple threads).
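To expand on that: with one worker per mapper, limiting workers per machine means limiting concurrent map containers per node. Under YARN this falls out of the memory arithmetic rather than a direct task cap, and Giraph's compute threads then use the cores inside the single worker. A sketch under those assumptions (illustrative values):

```xml
<!-- One large map container per node: make the map container request
     consume (nearly) the node's whole YARN memory budget. -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>40960</value>
</property>
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>40960</value>
</property>
```

Parallelism inside that single worker then comes from compute threads, e.g. `-Dgiraph.numComputeThreads=8` on the job command line.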
On Thu, Oct 23, 2014 at 5:53 PM, Charith Wickramarachchi
charith.dhanus...@gmail.com wrote:
to get some outside input inside
the WorkerGraphPartitioner? In my case it will be an HDFS file location.
Thanks,
Charith
On Sat, Sep 27, 2014 at 10:13 PM, Charith Wickramarachchi
charith.dhanus...@gmail.com wrote:
Hi Tamer,
The reason you see this behavior is that IntIntNullTextInputFormat sets the
value of the vertex to be the same as the vertex id when creating the vertex.
Since you do not change the value, the vertex id will be written to the output
as the vertex value.
See the class
Hope it helps
Olivier
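In other words, for each input line the format creates a vertex whose initial value equals its id. A plain-Java sketch of that parsing behavior (this only mimics the format's effect for illustration; it is not the Giraph class itself):

```java
import java.util.ArrayList;
import java.util.List;

/** Plain-Java illustration (not the Giraph class) of what
 *  IntIntNullTextInputFormat effectively does per input line:
 *  "id n1 n2 ..." becomes a vertex whose value starts out equal to its id. */
public class VertexLineSketch {
    public final int id;
    public int value;  // stays equal to id unless the computation changes it
    public final List<Integer> neighbors = new ArrayList<>();

    public VertexLineSketch(String line) {
        String[] tokens = line.trim().split("\\s+");
        this.id = Integer.parseInt(tokens[0]);
        this.value = this.id;  // the behavior described above
        for (int i = 1; i < tokens.length; i++) {
            neighbors.add(Integer.parseInt(tokens[i]));
        }
    }
}
```

So a computation that never calls setValue() will write each vertex's id back out as its value.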
On 1 Oct 2014, at 07:57, Charith Wickramarachchi
charith.dhanus...@gmail.com wrote:
Hi,
I am trying to pass some system options into the Giraph job so that I can
access them through the Giraph configuration.
I am using the following command:
$HADOOP_HOME/bin/hadoop jar
giraph-examples-1.1.0-SNAPSHOT-for-hadoop-2.2.0-jar-with-dependencies.jar
org.apache.giraph.GiraphRunner
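For passing such options, GiraphRunner accepts both generic Hadoop `-D` properties and its own `-ca` (custom argument) flag; either way the value lands in the job's configuration and can be read inside the computation via `getConf().get(...)`. A sketch of the `-ca` form (the property name `my.input.path` and the HDFS path are made-up examples):

```shell
# -ca key=value puts the pair into the job's Giraph configuration,
# retrievable in the computation as getConf().get("my.input.path").
$HADOOP_HOME/bin/hadoop jar \
  giraph-examples-1.1.0-SNAPSHOT-for-hadoop-2.2.0-jar-with-dependencies.jar \
  org.apache.giraph.GiraphRunner \
  org.apache.giraph.examples.SimpleShortestPathsComputation \
  -ca my.input.path=hdfs:///tmp/vertex-to-worker.txt \
  ...
```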
this by my own
org.apache.giraph.partition.WorkerGraphPartitioner implementation.
But my question is: is there a way to get some outside input inside
the WorkerGraphPartitioner? In my case it will be an HDFS file location.
Thanks,
Charith
On Sat, Sep 27, 2014 at 10:13 PM, Charith
Hi,
I'm trying to use Giraph with a custom graph partitioner that I have. In my
case I want to assign vertices to workers based on a custom partitioner's
input.
The partitioner will take the number of workers as an input parameter
and give me a file which maps each vertex id to a worker. I'm
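The lookup side of such a mapping file can be sketched in plain Java. The class below only parses "vertexId workerId" lines and answers the assignment query; wiring it into Giraph would happen inside your WorkerGraphPartitioner implementation. The file format and the fallback-to-hash behavior here are assumptions, not Giraph APIs:

```java
import java.util.HashMap;
import java.util.Map;

/** Hypothetical sketch of the lookup a file-driven partitioner needs:
 *  parse "vertexId workerId" lines and answer "which worker owns vertex v?".
 *  In practice the lines would be read from the HDFS file mentioned above. */
public class VertexToWorkerMap {
    private final Map<Long, Integer> assignment = new HashMap<>();

    /** Each line is "<vertexId> <workerId>", e.g. "42 3". */
    public void addLine(String line) {
        String[] parts = line.trim().split("\\s+");
        assignment.put(Long.parseLong(parts[0]), Integer.parseInt(parts[1]));
    }

    /** Worker for a vertex; falls back to hash partitioning for ids
     *  the file does not mention (an assumption of this sketch). */
    public int workerFor(long vertexId, int workerCount) {
        Integer w = assignment.get(vertexId);
        return w != null ? w : (int) (vertexId % workerCount);
    }

    public static void main(String[] args) {
        VertexToWorkerMap m = new VertexToWorkerMap();
        m.addLine("1 0");
        m.addLine("2 1");
        System.out.println(m.workerFor(1, 2)); // mapped by the file: 0
        System.out.println(m.workerFor(7, 2)); // unmapped, 7 % 2 = 1
    }
}
```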