Try using setNumMapTasks() in JobConf.

Though it is only a hint to the framework and doesn't guarantee the number of
map tasks you actually get.
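
Something along these lines with the old mapred API (a rough sketch; the class
name, paths, and the value 10 are just placeholders, and the mapper/reducer
setup is omitted):

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class MyJob {
  public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf(MyJob.class);
    conf.setJobName("my-job");

    // Only a hint: the actual number of map tasks is still driven
    // by the splits produced by the InputFormat.
    conf.setNumMapTasks(10);
    // The reduce count, by contrast, is used as given.
    conf.setNumReduceTasks(10);

    // conf.setMapperClass(...) / conf.setReducerClass(...) as usual.
    FileInputFormat.setInputPaths(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));

    JobClient.runJob(conf);
  }
}

Also keep in mind that how many of those tasks run at the same time on each
node is governed by the tasktracker maximum settings, and those are only read
when the TaskTrackers start, so they need a restart to take effect.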

 Morpheus: Do you believe in fate, Neo?
Neo: No.
Morpheus: Why not?
Neo: Because I don't like the idea that I'm not in control of my life.



----- Original Message ----
From: Starry SHI <starr...@gmail.com>
To: core-u...@hadoop.apache.org; core-...@hadoop.apache.org; 
common-user@hadoop.apache.org; common-...@hadoop.apache.org
Sent: Sun, December 20, 2009 11:09:40 PM
Subject: Why I can only run 2 map/reduce task at a time?

Hi,

I am currently using Hadoop 0.19.2 for large data processing. But I
noticed that when a job is launched, only two map/reduce tasks are
running at the very beginning; after one heartbeat (5 sec), another two
map/reduce tasks are started. I want to ask how I can increase the
number of map/reduce slots?

In the configuration file, I have already set
"mapred.tasktracker.map(reduce).tasks.maximum" to 10 and
"mapred.map(reduce).tasks" to 10, but still only 2 tasks are launched.

Eager to hear your solutions!

Best regards,
Starry

/* Tomorrow is another day. So is today. */



      
