There must be only 2 input splits being produced for your job.
Either you have 2 unsplittable files, or the input file(s) you have are not
large enough relative to the block size to be split any further.
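For what it's worth, here is a rough sketch of the job-side knobs that influence how many splits FileInputFormat produces, using the 0.19-era JobConf API. The class name and paths are made up for illustration; the number-of-maps value is only a hint, and the resulting split size is roughly clamped between mapred.min.split.size and the DFS block size, which is why small inputs still end up with few splits.

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class SplitTuningExample {
    public static void main(String[] args) throws Exception {
        // Old (0.19-era) "mapred" API job configuration.
        JobConf conf = new JobConf(SplitTuningExample.class);
        conf.setJobName("split-tuning-example");

        // Hint for the desired number of map tasks. FileInputFormat uses it
        // to compute a goal split size (total input bytes / this hint), then
        // clamps the actual split size between mapred.min.split.size and the
        // block size, so the real map count still depends on the input data.
        conf.setNumMapTasks(10);

        // Lower bound on split size, in bytes (default is 1).
        conf.setLong("mapred.min.split.size", 1L);

        // Hypothetical input/output paths, for illustration only.
        FileInputFormat.setInputPaths(conf, new Path("/user/example/input"));
        FileOutputFormat.setOutputPath(conf, new Path("/user/example/output"));

        JobClient.runJob(conf);
    }
}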

Table 6-1 in chapter 6 gives a breakdown of all of the configuration
parameters that affect split size in Hadoop 0.19. Alpha chapters are available :)

This is detailed in chapter 6 of my book.

On Tue, Apr 21, 2009 at 5:07 PM, javateck javateck <javat...@gmail.com> wrote:

> Does anyone know why setting *mapred.tasktracker.map.tasks.maximum* is not
> working?
> I set it to 10, but I still see only 2 map tasks running when running one job.
>



-- 
Alpha Chapters of my book on Hadoop are available
http://www.apress.com/book/view/9781430219422
