chris.maw...@gmail.com
Reply-To: user@hadoop.apache.org
Date: Thursday, 17 July 2014 16:15
To: user@hadoop.apache.org
Subject: Re: Configuration set up questions - Container killed on
request. Exit code is 143
Another thing to try is smaller input splits if your data can be broken up
into smaller files that can be independently processed. That way
you get more but smaller map tasks. You could also use more
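The split-size suggestion above can be sketched as a config change. This is only an illustrative fragment, assuming the standard Hadoop 2.x FileInputFormat property; the 64 MB value is an example, not a recommendation:

```xml
<!-- mapred-site.xml: cap input split size at 64 MB so each map task -->
<!-- reads less data and needs a smaller heap (value is illustrative) -->
<property>
  <name>mapreduce.input.fileinputformat.split.maxsize</name>
  <value>67108864</value>
</property>
```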
Date: Thursday, 17 July 2014 13:36
To: Chris MacKenzie stu...@chrismackenziephotography.co.uk
Cc: user@hadoop.apache.org
Subject: Re: Configuration set up questions - Container killed on
request. Exit code is 143
Hi Chris MacKenzie, I have a feeling (I am not familiar with the kind
of work you are doing) that your application is memory intensive. 8 cores
per node and only 12GB
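Exit code 143 is what YARN reports when the NodeManager sends SIGTERM to a container, typically because it exceeded its memory allocation. A minimal sketch of raising the per-task limits, assuming the standard Hadoop 2.x properties (the 2048 MB container size and -Xmx value are only illustrative; the heap must stay below the container limit):

```xml
<!-- mapred-site.xml: give each map container more memory and keep -->
<!-- the JVM heap safely below the container limit (values illustrative) -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>2048</value>
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx1638m</value>
</property>
```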
Hi,
Thanks, Chris Mawata.
I’m working through this myself, but wondered if anyone could point me in
the right direction.
I have attached my configs.
I’m using Hadoop 2.41
My system is:
32-node cluster
8 processors per machine
12 GB RAM
890 GB available disk space per node
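For a node of this size, a rough container-sizing calculation (the 2 GB reserved for the OS and Hadoop daemons is an assumption, and the one-container-per-core layout is only a sketch):

```python
# Rough YARN container sizing for one node (illustrative assumptions).
node_ram_gb = 12        # total RAM per node, from the specs above
cores = 8               # processors per machine
reserved_gb = 2         # assumed OS + daemon overhead (assumption)

available_gb = node_ram_gb - reserved_gb      # RAM YARN could hand out
per_container_gb = available_gb / cores       # one container per core

print(available_gb)       # 10
print(per_container_gb)   # 1.25
```

With only about 1.25 GB per container when all 8 cores run tasks, a memory-intensive job can easily hit its limit, which is consistent with the container kills seen here.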
This is my current error: