Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread Harsh J
>>>>>> You can set the amount of memory used by the reducer using the
>>>>>> mapreduce.reduce.java.opts property. Set it in mapred-site.xml or
>>>>>> override it in your job. You can set it to something like: -Xmx512M to
>>>>>> increase the amount of memory used
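
For reference, a minimal sketch of the per-job override described above, assuming the new org.apache.hadoop.mapreduce API and that the cluster honors mapreduce.reduce.java.opts (0.20.x-era clusters use mapred.child.java.opts instead); the class and job name are hypothetical:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class ReduceHeapExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Give each reduce task JVM up to 512 MB of heap (per-job override
        // of whatever mapred-site.xml sets cluster-wide).
        conf.set("mapreduce.reduce.java.opts", "-Xmx512M");
        // On 0.20.x-era clusters the equivalent knob covers both map and
        // reduce children:
        // conf.set("mapred.child.java.opts", "-Xmx512M");
        Job job = new Job(conf, "reduce-heap-example");  // hypothetical job name
        // ... setMapperClass / setReducerClass / input and output paths as usual ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}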

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread Kelly Burkhart
Thank you for the hint. I'm fairly new to this so nothing is well known to me at this time ;-) -K

On Wed, Feb 16, 2011 at 1:58 PM, Rahul Jain wrote:
> If you google for such memory failures, you'll find the mapreduce tunable
> that'll help you:
>
> mapred.job.shuffle.input.buffer.percent ; it i
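
A small sketch of turning that shuffle knob down programmatically; the 0.50 value is purely illustrative (0.70 is the usual default on this vintage of Hadoop), not a recommendation:

import org.apache.hadoop.conf.Configuration;

public class ShuffleBufferTuning {
    public static Configuration tunedConf() {
        Configuration conf = new Configuration();
        // Fraction of the reducer's heap used to buffer map outputs during
        // the shuffle; lowering it makes the in-memory merge spill to disk
        // sooner instead of exhausting the heap.
        conf.setFloat("mapred.job.shuffle.input.buffer.percent", 0.50f);
        return conf;
    }
}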

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread Rahul Jain
> >>>>> file?
> >>>>>
> >>>>> -K
> >>>>>
> >>>>> On Wed, Feb 16, 2011 at 9:43 AM, Jim Falgout <jim.falg...@pervasive.com> wrote:
> >>>>>> You can set the amount of memory used by the reducer using the

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread James Seigel
>>>>> e?
>>>>>
>>>>> -K
>>>>>
>>>>> On Wed, Feb 16, 2011 at 9:43 AM, Jim Falgout wrote:
>>>>>> You can set the amount of memory used by the reducer using the
>>>>>> mapreduce.reduce.java.opts pro

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread Kelly Burkhart
>>>>> n your job. You can set it to something like: -Xmx512M to
>>>>> increase the amount of memory used by the JVM spawned for the reducer
>>>>> task.
>>>>>
>>>>> -Original Message-
>>>>> From: Kelly Burkhart [mailt

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread James Seigel
>>>> t the amount of memory used by the reducer using the
>>>> mapreduce.reduce.java.opts property. Set it in mapred-site.xml or override
>>>> it in your job. You can set it to something like: -Xmx512M to increase the
>>>> amount of memory used by the JVM spawned for th

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread Kelly Burkhart
>>> e
>>> amount of memory used by the JVM spawned for the reducer task.
>>>
>>> -Original Message-
>>> From: Kelly Burkhart [mailto:kelly.burkh...@gmail.com]
>>> Sent: Wednesday, February 16, 2011 9:12 AM
>>> To: common-user@hadoop.apache.or

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread James Seigel
>> amount of memory used by the JVM spawned for the reducer task.
>>
>> -Original Message-
>> From: Kelly Burkhart [mailto:kelly.burkh...@gmail.com]
>> Sent: Wednesday, February 16, 2011 9:12 AM
>> To: common-user@hadoop.apache.org
>> Subject: Re: Reduce java.la

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread Kelly Burkhart
> job. You can set it to something like: -Xmx512M to increase the amount
> of memory used by the JVM spawned for the reducer task.
>
> -Original Message-
> From: Kelly Burkhart [mailto:kelly.burkh...@gmail.com]
> Sent: Wednesday, February 16, 2011 9:12 AM
> To: common-user@ha

RE: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread Jim Falgout
Message-
From: Kelly Burkhart [mailto:kelly.burkh...@gmail.com]
Sent: Wednesday, February 16, 2011 9:12 AM
To: common-user@hadoop.apache.org
Subject: Re: Reduce java.lang.OutOfMemoryError

I have had it fail with a single reducer and with 100 reducers. Ultimately it needs to be funneled to a single

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread James Seigel
...oh sorry, I didn't scroll below the exception the first time. Try part 2.

James
Sent from my mobile. Please excuse the typos.

On 2011-02-16, at 8:00 AM, Kelly Burkhart wrote:
> Hello, I'm seeing frequent fails in reduce jobs with errors similar to this:
>
>
> 2011-02-15 15:21:10,163 INFO org.

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread real great..
Another possibility could be increasing the memory allocated to the JVM; not sure how to do it though.

On Wed, Feb 16, 2011 at 8:46 PM, James Seigel wrote:
> Well the first thing I'd ask to see (if we can) is the code or a
> description of what your reducer is doing.
>
> If it is holding on to objec

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread James Seigel
Well, the first thing I'd ask to see (if we can) is the code or a description of what your reducer is doing.

If it is holding on to objects too long or accumulating lists, well then with the right amount of data you will run OOM.

Another thought is that you've just not allocated enough mem for the
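
To make that failure mode concrete, here is a hedged illustration (the key/value types and the sum logic are hypothetical, not the actual job): buffering every value for a key into a list grows without bound on a hot key, while folding values as you iterate keeps reducer memory flat.

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class SumReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
    @Override
    protected void reduce(Text key, Iterable<LongWritable> values, Context context)
            throws IOException, InterruptedException {
        // Risky pattern: copying every value into an in-memory List here would
        // grow with the key's value count and eventually run the task out of heap.
        // Safer pattern: aggregate while streaming over the Iterable.
        long sum = 0;
        for (LongWritable value : values) {
            sum += value.get();
        }
        context.write(key, new LongWritable(sum));
    }
}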

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread Kelly Burkhart
I have had it fail with a single reducer and with 100 reducers. Ultimately it needs to be funneled to a single reducer though. -K

On Wed, Feb 16, 2011 at 9:02 AM, real great.. wrote:
> Hi,
> How many reducers are you using currently?
> Try increasing the number of reducers.
> Let me know if it h

Re: Reduce java.lang.OutOfMemoryError

2011-02-16 Thread real great..
Hi,
How many reducers are you using currently? Try increasing the number of reducers. Let me know if it helps.

On Wed, Feb 16, 2011 at 8:30 PM, Kelly Burkhart wrote:
> Hello, I'm seeing frequent fails in reduce jobs with errors similar to
> this:
>
>
> 2011-02-15 15:21:10,163 INFO org.apache.hado
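
A short sketch of that suggestion, assuming the new org.apache.hadoop.mapreduce API; the job name is hypothetical and 100 is simply the count Kelly mentions having tried:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class ReducerCountExample {
    public static void main(String[] args) throws Exception {
        Job job = new Job(new Configuration(), "reducer-count-example");  // hypothetical name
        // Spread the reduce work over more tasks so each one shuffles and
        // sorts a smaller slice of the map output.
        job.setNumReduceTasks(100);
    }
}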

Reduce java.lang.OutOfMemoryError

2011-02-16 Thread Kelly Burkhart
Hello, I'm seeing frequent fails in reduce jobs with errors similar to this:

2011-02-15 15:21:10,163 INFO org.apache.hadoop.mapred.ReduceTask: header: attempt_201102081823_0175_m_002153_0, compressed len: 172492, decompressed len: 172488
2011-02-15 15:21:10,163 FATAL org.apache.hadoop.mapred.Task