Hi,
I have a custom PageRank computation that reads its input from HBase and
writes the results back to it.
I submit the job on a real distributed Hadoop cluster that can allocate
320 map tasks. I started the job with 100 workers. What I see is that only
one of the workers is actually reading the input and
Unfortunately, there are some restrictions that mean I don't really have
them handy, BUT pointing me towards the local disk helped me partially
resolve it. There are permission issues with that directory, but I was
able to get past them by manually creating a separate giraph directory in
the MapReduce local storage.
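For reference, a minimal sketch of that manual workaround. The path is an assumption (a stand-in for whatever mapred.local.dir points to on your nodes), not the actual value from my cluster:

```shell
# LOCAL_DIR is a hypothetical stand-in for the node's MapReduce local
# storage root; check mapred.local.dir in your cluster config.
LOCAL_DIR=${MAPRED_LOCAL_DIR:-/tmp/mapred/local}

# Pre-create the Giraph spill directory so the task user can write to it,
# and open group write permissions.
mkdir -p "$LOCAL_DIR/giraph"
chmod 775 "$LOCAL_DIR/giraph"
ls -ld "$LOCAL_DIR/giraph"
```

On a real cluster this would need to run as a user with rights on the local storage root (and on every worker node), which is exactly where the permission issues came from.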
Actually, I take that back. It seems it does succeed in creating
partitions - it just struggles with it sometimes. Should I be worried about
these errors if the partition directories seem to be filling up?
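In case it helps anyone hitting the same thing: the partition spill location should be configurable per job, so you can point it explicitly at a directory you know is writable instead of relying on the default under the task-local dirs. The option names below are assumptions from the out-of-core settings as I understand them (as are the jar and class names), so verify them against your Giraph version:

```shell
# Assumed option names (giraph.useOutOfCoreGraph, giraph.partitionsDirectory)
# and hypothetical jar/class names; adapt to your actual job submission.
hadoop jar giraph-with-dependencies.jar org.apache.giraph.GiraphRunner \
  my.custom.PageRankComputation \
  -ca giraph.useOutOfCoreGraph=true \
  -ca giraph.partitionsDirectory=/data/mapred/local/giraph \
  ... # remaining HBase input/output arguments as in the original job
```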
On Sep 11, 2013 6:38 PM, Claudio Martella claudio.marte...@gmail.com
wrote:
Giraph does not
Actually, why is it saying it fails to create a directory in the first place,
when it is trying to write files?
On Sep 12, 2013 3:04 PM, Alexander Asplund alexaspl...@gmail.com wrote:
I can also add that there is no such issue with DiskBackedMessageStore. It
successfully creates a large number of