[ https://issues.apache.org/jira/browse/HADOOP-960?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12468726 ]

Doug Cutting commented on HADOOP-960:
-------------------------------------

> It's "usually" better to take the extra maps. Unfortunately, I've got one of 
> those cases where it isn't better. :) 

Then subclass your InputFormat and override the getSplits() method.

> This is now a feature request instead of a bug.

The feature's there: you can precisely control splitting if you like.  E.g., 
Nutch does this when crawling.  Is that not sufficient for you?
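For completeness, here is a minimal sketch of that subclassing approach, assuming the classic org.apache.hadoop.mapred API (the exact getSplits() signature differs slightly in 0.10.x, and the class name below is purely illustrative): an InputFormat that ignores the numSplits hint and returns exactly one split per input file.

  import java.io.IOException;
  import java.util.ArrayList;
  import java.util.List;

  import org.apache.hadoop.fs.FileStatus;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;
  import org.apache.hadoop.mapred.FileInputFormat;
  import org.apache.hadoop.mapred.FileSplit;
  import org.apache.hadoop.mapred.InputSplit;
  import org.apache.hadoop.mapred.JobConf;
  import org.apache.hadoop.mapred.TextInputFormat;

  // Illustrative subclass: ignore the numSplits hint and emit one split per
  // input file, so the job runs one map task per file regardless of
  // mapred.map.tasks.
  public class OneSplitPerFileInputFormat extends TextInputFormat {

    public InputSplit[] getSplits(JobConf job, int numSplits) throws IOException {
      Path[] inputs = FileInputFormat.getInputPaths(job);
      List<InputSplit> splits = new ArrayList<InputSplit>(inputs.length);
      for (int i = 0; i < inputs.length; i++) {
        Path file = inputs[i];
        FileSystem fs = file.getFileSystem(job);
        FileStatus status = fs.getFileStatus(file);
        // One unsplit FileSplit covering the whole file; no locality hints.
        splits.add(new FileSplit(file, 0, status.getLen(), (String[]) null));
      }
      return splits.toArray(new InputSplit[splits.size()]);
    }
  }

A streaming job could then select this class with the -inputformat option (assuming the class is on the job's classpath).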

> Incorrect number of map tasks when there are multiple input files
> -----------------------------------------------------------------
>
>                 Key: HADOOP-960
>                 URL: https://issues.apache.org/jira/browse/HADOOP-960
>             Project: Hadoop
>          Issue Type: Wish
>    Affects Versions: 0.10.1
>            Reporter: Andrew McNabb
>
> This problem happens with hadoop-streaming and possibly elsewhere.  If there 
> are 5 input files, it will create 130 map tasks, even if 
> mapred.map.tasks=128.  The number of map tasks is incorrectly set to a 
> multiple of the number of files.  (I wrote a much more complete bug report, 
> but JIRA lost it when it had an error, so I'm not in the mood to write it all 
> again.)

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.