We did a successful setup of hadoop-0.20.203.0 and hive-0.7.1 -- very old
versions.
You may want to move to hadoop-1.2.0 and hive-0.11 to get a few more features.

If you want to limit how many map slots a particular job or a particular user
can occupy, take a look at Hadoop's fair scheduler and capacity scheduler.
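For example, assuming you have already defined a capped queue or pool in the
scheduler configuration (the name "hive_limited" below is only a placeholder),
you can point a Hive session at it before running queries:

set mapred.job.queue.name=hive_limited;      -- capacity scheduler queue (placeholder name)
set mapred.fairscheduler.pool=hive_limited;  -- fair scheduler pool (placeholder name)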
If you want to change the number of mappers and reducers for a Hive query, you
can change the input min and max split sizes, as Rajesh said.

Here are a few of the relevant settings:
set mapred.min.split.size=1024000;   -- lower bound on the input split size, in bytes
set mapred.max.split.size=4096000;   -- upper bound on the input split size, in bytes; smaller splits mean more mappers
set hive.exec.reducers.max=2048;     -- cap on the number of reducers a query can use
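
As a rough sketch of how the split sizes drive the mapper count (exact behavior
depends on the input format in use, e.g. CombineHiveInputFormat; the table name
below is only an example): with roughly one mapper per max split, ~1 GB of input
and a 256 MB max split size gives about 4 mappers.

set mapred.min.split.size=268435456;   -- 256 MB
set mapred.max.split.size=268435456;   -- ~1 GB input / 256 MB per split => ~4 mappers
select count(1) from example_table;    -- example table; the launched mapper count shows in the job output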

Maybe once you tell us clearly what you want to achieve, we will be able to
help better.



On Wed, Oct 9, 2013 at 6:09 PM, Rajesh Balamohan <rajesh.balamo...@gmail.com
> wrote:

> Did you try adjusting the FileInputFormat min and max split size parameters?
> On Oct 9, 2013 5:51 PM, "Garg, Rinku" <rinku.g...@fisglobal.com> wrote:
>
>> Hi All,
>>
>> We did a successful setup of hadoop-0.20.203.0 and hive-0.7.1. We have
>> the following query:
>>
>> Is there any option in Hive where mappers can be reduced/fixed?
>>
>> For example, if there are 8 map tasks defined in the Hadoop configuration,
>> then is it possible to use 4 Hive setups that use 2 mappers each?
>>
>> Thanks & Regards,
>>
>> Rinku Garg
>


-- 
Nitin Pawar
