I used the new API. I just tried the job.setNumReduceTasks() method and it seems to be working.
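For anyone hitting the same issue, here is a minimal sketch of a driver using the new (`org.apache.hadoop.mapreduce`) API, following the pattern discussed below. The class name `MyDriver` and the job name are illustrative, not from the original code:

```java
// Sketch of a new-API (org.apache.hadoop.mapreduce) driver.
// Assumes Hadoop 0.20-era classes; MyDriver and the job name are made up.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class MyDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Pass the Configuration to the Job constructor; the Job takes a
        // copy, so changes made to conf afterwards may not be picked up.
        Job job = new Job(conf, "My job with 20 reducers");
        job.setJarByClass(MyDriver.class);

        // With the new API, set the reducer count on the Job itself
        // rather than via conf.setInt("mapred.reduce.tasks", 20):
        job.setNumReduceTasks(20);

        // ... set mapper, reducer, input/output formats and paths here ...

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

The reducer count should then show up as `mapred.reduce.tasks` in the job config XML in the JobTracker UI.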
Thanks Alex and Ed for helping me!

On Thu, Oct 21, 2010 at 2:32 PM, Alex Kozlov <ale...@cloudera.com> wrote:

> It looks like you do not pass the Configuration object correctly. Do you
> use the old or the new (mapreduce) API? Do you have something like
>
>     Job job = new Job(conf, "My job with " + conf.get("mapred.reduce.tasks") + " reducers");
>
> to create the job? Is it OK to share your job creation code?
>
> Alex K
>
> On Thu, Oct 21, 2010 at 2:25 PM, Matt Tanquary <matt.tanqu...@gmail.com> wrote:
>
> > Hi Alex,
> >
> > Yes, I confirmed from those locations that the job is setting the
> > reducers to 1.
> >
> > Thanks
> >
> > On Thu, Oct 21, 2010 at 1:45 PM, Alex Kozlov <ale...@cloudera.com> wrote:
> >
> > > Hi Matt, it might be that the parameter does not end up in the final
> > > configuration for a number of reasons. Can you check the job config xml
> > > in jt:/var/log/hadoop/history or in the JT UI and see what the
> > > mapred.reduce.tasks setting is? -- Alex K
> > >
> > > On Thu, Oct 21, 2010 at 1:39 PM, Matt Tanquary <matt.tanqu...@gmail.com> wrote:
> > >
> > > > I am using the following to set my number of reduce tasks; however,
> > > > when I run my job it always uses just 1 reducer.
> > > >
> > > >     conf.setInt("mapred.reduce.tasks", 20);
> > > >
> > > > 1 reducer will never finish this job. Please help me understand why
> > > > the setting I choose is not used.
> > > >
> > > > Thanks,
> > > > -M@

--
Have you thanked a teacher today? ---> http://www.liftateacher.org