Hi,

I am currently using Hadoop 0.19.2 for a large data-processing job. I noticed that when the job is launched, only two map/reduce tasks are running at the very beginning; after one heartbeat (5 sec), another two map/reduce tasks are started. How can I increase the number of map/reduce slots?

In the configuration file, I have already set "mapred.tasktracker.map(reduce).tasks.maximum" to 10 and "mapred.map(reduce).tasks" to 10, but still only 2 tasks are launched at a time.

Eager to hear your suggestions!

Best regards,
Starry

/* Tomorrow is another day. So is today. */
