-- then
crashes. Is there anything I can do when processing large sequence files
other than increasing the available heap?
Arni
On Wed, Sep 2, 2015 at 5:08 PM, Arni Sumarlidason <sumarlida...@gmail.com>
wrote:
> I'm having problems getting my data reduced evenly across nodes.
>
> -> map a 200,000-line single text file and output <0L, line>
> -> custom partitioner returning static member i++ % numPartitions in an
> attempt to distribute each line to as many reducers as possible
> -> reduce; I end up with 13 or 18
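For anyone following along, here is a minimal standalone sketch (not the poster's actual code; class and method names are my own) of the round-robin partitioning scheme described above: a static counter incremented per record, taken modulo the number of partitions. Within a single counter instance the distribution comes out exactly even; note, though, that in a real Hadoop job each map task runs in its own JVM with its own copy of the static counter, so the counter is only "global" within one task.

```java
// Sketch of the round-robin partitioner idea described in the thread.
// RoundRobinSketch and getPartition are illustrative names, not Hadoop API.
public class RoundRobinSketch {
    // Static member counter, as in the described partitioner.
    // In Hadoop this would be per-map-task (per JVM), not truly global.
    private static int i = 0;

    static int getPartition(int numPartitions) {
        return i++ % numPartitions;
    }

    public static void main(String[] args) {
        int numPartitions = 4;
        int[] counts = new int[numPartitions];
        // Simulate partitioning 200,000 records, as in the example job.
        for (int line = 0; line < 200000; line++) {
            counts[getPartition(numPartitions)]++;
        }
        for (int p = 0; p < numPartitions; p++) {
            System.out.println("partition " + p + ": " + counts[p]);
        }
    }
}
```

With one counter the 200,000 records split evenly (50,000 per partition); uneven reduce-side load in the real job would have to come from something outside this logic, e.g. multiple map tasks each starting their own counter at zero.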