You can control that with this property:

<property>
  <name>mapred.reduce.slowstart.completed.maps</name>
  <value>0.05</value>
  <description>Fraction of the number of maps in the job which should be
  complete before reduces are scheduled for the job.
  </description>
</property>
Set the value of this property to 1, and the reducers will start only after
all the maps have finished.
This works with Hadoop version 0.20.0.
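To illustrate the effect of that fraction, here is a minimal sketch (not the actual Hadoop scheduler code; the class and method names are hypothetical) of how a slowstart threshold gates reduce scheduling:

```java
// Hypothetical sketch of the slowstart check -- not Hadoop source.
// The scheduler compares the fraction of completed maps against
// mapred.reduce.slowstart.completed.maps before launching reduces.
public class SlowstartDemo {

    // Returns true once completedMaps / totalMaps reaches the threshold.
    static boolean canScheduleReduces(int completedMaps, int totalMaps,
                                      double slowstart) {
        return totalMaps > 0
            && (double) completedMaps / totalMaps >= slowstart;
    }

    public static void main(String[] args) {
        // Default 0.05: reduces may start once 5% of maps are done.
        System.out.println(canScheduleReduces(5, 100, 0.05));   // true
        // slowstart = 1.0: reduces wait until every map has finished.
        System.out.println(canScheduleReduces(99, 100, 1.0));   // false
        System.out.println(canScheduleReduces(100, 100, 1.0));  // true
    }
}
```

With the value set to 1, even 99 of 100 completed maps is not enough; the last map must finish before any reducer is scheduled.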

On Fri, Sep 25, 2009 at 8:39 PM, Steve Loughran <ste...@apache.org> wrote:

> Oliver Senn wrote:
>
>> Hi,
>>
>> Thanks for your answer.
>>
>> I used these parameters, but they seem to limit only the number of
>> parallel maps and parallel reduces separately. They do not prevent the
>> scheduler from scheduling one map and one reduce on the same task tracker
>> in parallel.
>>
>> But that's the problem I'm trying to solve: having at most one task
>> running on a task tracker at any time (never a map and a reduce together
>> on one task tracker).
>>
>>
> You could write your own scheduler, if you really can't handle the tasks
> in parallel.
>



-- 
Thanks & Regards,
Chandra Prakash Bhagtani,
